# [Official] NVIDIA RTX 4090 Owner's Club



## Zurv

Let's do it!
I'm looking forward to power mods


----------






## rauf0

Where 900W bios please?


----------



## zzztopzzz

Please loan me 3 large so I can join this club.


----------



## Christopher2178

Looking at my 3090 like it’s trash now.. Let’s go!!!


----------



## Nizzen

gg


----------



## J7SC

...4090 Strix looks good, but a 4090 Ti Strix (w/ 2x power connectors?) would look better. Either way, too bad the nice cooler has to go once the water-blocks arrive...


----------



## dr/owned

Gonna be good for watercoolers, because waterblockers don't care about the dick-extension 6" of heatsink hanging off the end of the PCB. So a relatively short PCB.

Strix I'm betting on like 550W or something. They can't go over 600W on the single connector, and we don't have EVGA anymore to put out a 1000W BIOS.


----------



## yzonker

dr/owned said:


> Gonna be good for watercoolers because waterblockers don't care about the dick-extension 6" of heatsink hanging off the end of the PCB. So relatively short PCB.
> 
> Strix I'm betting on like 550W or something. They can't go over 600? on the single connector and we don't have EVGA anymore to put out 1000W bios.


Galax might help us out with a 1000W BIOS, but it might be a 2-connector card. Asus usually has one as well, I think.


----------



## Christopher2178

Where's the best place to buy the Asus ROG card as soon as it drops? Do they have any preferred retailers or a best place to buy from in the USA? Thanks!


----------



## tps3443

It’s kinda funny: I’ve owned my 3090 KP HC for a long while now, and it still feels super freaking powerful to this day. It’s OCed to the absolute max.

The 4090 had better be a worthwhile upgrade. Just saying.


----------



## dr/owned

tps3443 said:


> It’s kinda funny, because while I have owned my 3090KP HC for a long while now. It still feels super freaking powerful to this day. It is OCed to the absolute max.


The 3000 series generally sucked for overclocking, with the KPE getting the same overclocks as anything else. Core clock -> 2200 or 2250 MHz tops. I guess memory overclocks were decent, but FPS didn't really care so much.


----------



## StAndrew

dr/owned said:


> The 3000 series generally sucked for overclocking with KPE being the same overclocks as anything else. Core clock -> 2200 or 2250 tops. I guess memory overclocks were decent but FPS didn't really care so much.


I'd be happy to get my power draw from a screaming, overclocked, 600+ watt 3090 to a "meager" 450W.


----------



## PLATOON TEKK

Hope you all been blessed!

Wildness for now. Since there’s no more KP to speak of, I’m guessing the HOF will be the only GPU with software voltage control out of the box.

Let’s hope the Strix (or any other high-end 4090) still has an I2C header this time round for some EVC.

Looking forward to PCB shots.


----------



## dr/owned

PLATOON TEKK said:


> Lets hope that Strix (or any other high end 4090) still has an I2C header this time round for some EVC.


Step 1 though will be voltage actually mattering. The 3090 didn't benefit at all from more voltage (except maybe under LN2). I had the I2C header soldered on my TUF.


----------



## BigMack70

Here's hoping I can get a founders edition card... been really happy with my 3090 FE. Ampere was absolutely neutered for any real overclocking and with a launch TDP of 450W I expect that to be the case here also.


----------



## PLATOON TEKK

dr/owned said:


> Step 1 though will be voltage actually mattering. The 3090 didn't benefit at all from more voltage (except maybe under LN2). I had the I2C header soldered on my TUF.



I think we took different steps lol. I broke the highest non-LN2 Port Royal score (at the time of the bench) and am still 11th globally.

Voltage absolutely matters; maybe on air or normal water it does little, but in my experience it did.

The TUF wasn’t a great card; that I2C header was basically useless unless on a good-bin Strix.

But I absolutely agree that adding voltage has been getting less effective with each series.





----------



## J7SC

PLATOON TEKK said:


> I think we took different steps lol. I broke highest non ln2 Port Royal (at time of bench) and am still 11th globally.
> 
> Voltage absolutely matters, maybe on air or normal water it does little, but from my experience it did.
> 
> TUF wasn’t a great card, that i2c header was basically useless unless on a good bin Strix.
> 
> But I absolutely agree that adding voltage is getting less effective with each series.
> 
> 
> 


...are you sure you're not running low on Primoflex tubing?

Also, I would love to see Asus (re)introduce the Matrix series, so the Galax HOF doesn't feel so lonely with the KPE in detention


----------



## tps3443

dr/owned said:


> The 3000 series generally sucked for overclocking with KPE being the same overclocks as anything else. Core clock -> 2200 or 2250 tops. I guess memory overclocks were decent but FPS didn't really care so much.


The way I have my 3090 KP set up, it’s 18% faster than a 3090 FE. That’s great in my book. While you could get other 3090s to do this, you’d spend the same as or even more than a 3090 KP cost.

The 3090 KP was a great card because right out of the box you had 520 watts available, a 1995 MHz bin written on the die, and good memory. Even at default speeds, it puts out more TFLOPS than a 3090 Ti FE.

But enough is enough! I’ve enjoyed it so much. I’m looking forward to the RTX 4000 series.


----------



## PLATOON TEKK

J7SC said:


> ...are you sure you're not running low on Primoflex tubing ?
> 
> Also, I would love to see Asus (re)introduce the Matrix series, so Galax HoF doesn't feel so lonely with KPE in detention


haha, the ridiculous amount of tubing was to run chilled water through the floors to all 5 PCs, across 2 floors. 11 chillers later and it looks like I copy-pasted the box 50 times.

Damn, I still remember my 780 Ti Matrix Platinum! I totally forgot they were up the range with specs.

Will be interesting to see where Lucido himself goes; he and ASUS would be a great fit, judging by how well the Strix held up to the KPs and HOFs last series.




tps3443 said:


> How I have my 3090KP set up it’s 18% faster than a 3090FE. That’s a great in my book. While you could get other 3090’s to do this you’ll spend the same or even more than a 3090KP cost.
> 
> The 3090KP was a great card because right out of the box you had your 520 watts available. 1995Mhz Bin written on the die, good memory. And at just default speeds, it puts out more Tflops than a 3090Ti FE.
> 
> But, enough is enough! I’ve enjoyed it so much. I’m looking forward to the RTX4000 series.


I absolutely agree with you on that. Even compared to the HOFs, the KPs fared far better on average last series (unlike the 20 series). I dealt with 3 KPs (one RMA) and they all performed pretty nice out of the box. Even the odd air/radiator setup could be pushed pretty far with ice or direct AC intake. They also had Optimus blocks available.

I’m def keeping my eye on the Strix for now, judging from last gen. Can’t even imagine how the HOF PCB will turn out; let’s hope it actually performs properly this time round tho.

edit: looks like EK is already on the waterblocks, but only Founders for now (I don’t recommend EK at all, however)


----------



## Glerox

I'm ready boyz. LFGO 💪


----------



## yzonker

PLATOON TEKK said:


> haha the ridiculous amount of tubing was to run chilled water through the floors to all 5 pcs, including 2 floors. 11 chillers later and it looks like I copy and pasted the box 50 times.
> 
> Damn I still remember my 780ti platinum matrix! I totally forgot they were up the range with specs.
> 
> Will be interesting to see where Lucido himself goes, him and ASUS would be a great fit, judging by how well the Strix held up to the KPs and HOFs last series.
> 
> 
> 
> 
> I absolutely agree with you on that. Even compared to the HOFs the KPs fared far better on average last series (unlike 20 series). I dealt with 3 KPs (one rma) and they all performed pretty nice out the box. Even the odd air/radiator setup could be pushed pretty far with ice or direct into AC. Also had Optimus blocks available.
> 
> Im def keeping my eye on the Strix for now, judging from last gen. Can’t even imagine how the HOF pcb will turn out, let’s hope it actually performs properly this time round tho.
> 
> edit:looks like EK already on the waterblocks, but only founders for now (I don’t recommend EK at all however)
> 


I wonder if they are using the same (or nearly the same) board as the 3090 ti then. Kinda feels like it given they already have a block.


----------



## mirkendargen

I'm also hedging towards a Strix purely based on 3090's, and I'm sure Bykski will have a block for them (and every other card most likely).


----------



## PLATOON TEKK

yzonker said:


> I wonder if they are using the same (or nearly the same) board as the 3090 ti then. Kinda feels like it given they already have a block.


Was thinking the same, or maybe they sent out samples earlier this time.

Also, I just realized it’s official: RIP NVLink/SLI ⚰ (unless the Ti re-adds the hardware, which I doubt). At least we don’t have to get two, ha.


----------



## J7SC

PLATOON TEKK said:


> Was thinking the same, or maybe they sent out samples earlier this time.
> 
> Also I just realized, it’s official; RIP Nvlink/ SLI ⚰ (unless Ti re-adds hardware, which I doubt). At least we don’t have to get two ha.


...I was always very fond of NVL/SLI's symmetry, especially in an HEDT system with RAM on either side of the CPU.


...but as you said, at least we don't have to buy two - welcome to the age of 'asymmetric warfare'☺


----------



## dr/owned

mirkendargen said:


> I'm also hedging towards a Strix purely based on 3090's, and I'm sure Bykski will have a block for them (and every other card most likely).


I'm skipping Bykski and going Barrow this time around. I've had 2 nickel plating failures now with the 3090 on Bykski blocks. Whatever their plating is nowadays, it's really weak. 

And this is with Mayhem X1 Clear premix so it's not like it's sketchy fluid.


----------



## z390e

PLATOON TEKK said:


> Will be interesting to see where Lucido himself goes, him and ASUS would be a great fit, judging by how well the Strix held up to the KPs and HOFs last series.


Praying he joins Asus now that EVGA is about to be RIP.


----------



## ZealotKi11er

dr/owned said:


> I'm skipping Bykski and going Barrow this time around. I've had 2 nickel plating failures now with the 3090 on Bykski blocks. Whatever their plating is nowadays, it's really weak.
> 
> And this is with Mayhem X1 Clear premix so it's not like it's sketchy fluid.


Their blocks are garbage. No instructions, no proper mounting screws, and same as you, the nickel plating failed. Worst of all, the performance was garbage on my 6900XT vs. a Heatkiller.


----------



## isamu

I've got my 4090 in the mail today. It's... OK. Nothing mind-blowing.


----------



## bastian

4090 will be mine. Most likely MSI or Founders.


----------



## dr/owned

ZealotKi11er said:


> Their blocks are garbage. No instructions, no proper mounting screws, and same as you, the nickel plating failed. Worst of all, the performance was garbage on my 6900XT vs. a Heatkiller.


I don't mind the lack of instructions or thermal pads 'cause I can sort that out and save money on the block. Just the nickel plating is not acceptable. And it wasn't a problem on my 1080 Ti at all, so they must have switched suppliers or processes for their plating.


----------



## J7SC

My fav water blocks these days are from Phanteks (using one for the 3090 Strix, and 2x X570s). I do have a Bykski block on a 6900XT as well, and the nickel coating is still pristine. That said, the manual had outright errors (confusing thermal pads with thermal paste), and the GPU block they sent was actually the wrong one for my custom card, but I just modded it with a Dremel.

I hope Phanteks makes a nice block for the 4090 / Ti Strix, by early next year at least...


----------



## Glerox

Any rumors on 7900XT performance vs 4090? I'm no fan boy, just want to get the fastest GPU.


----------



## Blameless

Glerox said:


> Any rumors on 7900XT performance vs 4090? I'm no fan boy, just want to get the fastest GPU.


Only extrapolations from supposed specs. Going to need to wait for more information.

I'm betting on the RTX 4090 being meaningfully faster than the top RX 7000 series part, but the top RX 7000 being faster than the RTX 4080 16G.


----------



## J7SC

Glerox said:


> Any rumors on 7900XT performance vs 4090? I'm no fan boy, just want to get the fastest GPU.


...speaking of rumours, this has been floating around. The 7900 XTX could be an enterprise model, though. I am also hoping to learn a little bit more about the upcoming 4090 Ti, i.e. core count, release date, and pricing


----------



## Glerox

7900 XTX!?

I've ordered Nvidia's top card on day 1 ever since Pascal, but this time I will wait for comparisons with AMD's top card.

With all the shady business going on at Nvidia recently (loss of EVGA, an RTX 4070 disguised as the 4080 12GB), I would be happy to give AMD a try if their card is competitive. My 2 cents.


----------



## Krzych04650

Do we know the DisplayPort version yet? I've seen claims it is still 1.4.

EDIT: Yeah, it is 1.4a. Need to get a model that has more than one HDMI then. This complicates things quite a lot. Guess it also means DP 2.0 monitors aren't going to be a thing anytime soon.


----------



## Spiriva

Listed at a Swedish store, no price tho.









ASUS GeForce RTX 4090 24GB ROG Strix Gaming OC
www.inet.se

ASUS GeForce RTX 4090 24GB TUF Gaming OC
www.inet.se

----------



## Benni231990

Now the €1 million question is:

If I buy the "cheap" TUF, can I simply flash it to the Strix BIOS? Both cards have only 1 power connector, so this generation we don't have any problems like the 3090's 2x vs. 3x 8-pin?


----------



## PLATOON TEKK

J7SC said:


> ...I was always very fond of NVL/SLI's symmetry, especially in an HEDT system with RAM on either side of the CPU.
> 
> 
> 
> 
> 
> ...but as you said, at least we don't have to buy two - welcome to the age of 'asymmetric warfare'☺


I like the sound of that, asymmetric warfare! SLI had a charm for sure; quad started looking like servers though, ha.

Has anyone spotted a 4090 with more than one power connector? As Benni mentioned above, the Strix only sports one of them. If not, that will probably be their move for the Ti.


----------



## Benni231990

Cards with 2 power connectors will probably only be the HOF (and the Kingpin xD, but now we know we'll never see that card) and the Titan, or we must wait for the Ti


----------



## Krzych04650

A single connector can theoretically go to 675W, I think (600W connector + 75W slot), and there were rumors that the maximum allowed power limit is going to be 660W. Maybe it is.
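The budget arithmetic is easy to sanity-check; a quick sketch (the 660W figure is just the rumor, not a confirmed spec):

```python
# Power budget for a single 12VHPWR connector plus the PCIe slot.
# Figures from the post above: 600W connector rating, 75W slot,
# and a *rumored* (unconfirmed) 660W maximum power limit.
CONNECTOR_W = 600
SLOT_W = 75
RUMORED_LIMIT_W = 660

theoretical_max = CONNECTOR_W + SLOT_W
headroom = theoretical_max - RUMORED_LIMIT_W

print(theoretical_max)  # 675
print(headroom)         # 15
```

So a 660W cap would still leave 15W of the theoretical board budget on the table.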


----------



## Nizzen

gg


Krzych04650 said:


> Single connector can theoretically go to 675W I think (600W connector, 75W slot) and there were rumors that maximum allowed power limit is going to be 660W, maybe it is.


Theory never stopped us overclockers.

780 Ti Classified with 2x 8-pin running 650W with an EVBot.
"375W max" 🤣


----------



## Krzych04650

Nizzen said:


> gg
> 
> Theory never stopped us overclockers
> 
> 780ti classified with 2x 8pin running 650w with evbot
> "375w max" 🤣


Yea, of course not, but if this limit for AIB models is in place then it could explain the lack of dual-connector models.

Also, having to run a 3x 8-pin to 16-pin octopus adapter for each connector may have something to do with that as well


----------



## BigMack70

So what website do we need to F5-spam on the 12th for a Founders Edition card? Is Nvidia selling those direct again from their website, or is it just going to be Best Buy?


----------



## Tyl3n0L

Any official words on Canadian MSRP yet? Couldn't find anything online.


----------



## jtclfo

PLATOON TEKK said:


> edit:looks like EK already on the waterblocks, but only founders for now (I don’t recommend EK at all however)


I’m being sent an EK block and haven’t had any issues with them in the past, but I’m curious: what should I be aware of, from your experience?

Also, I could only wish he would go to Galax, but I don’t think it’ll happen. Asus will most likely be the new sponsor.


----------



## ttnuagmada

Preordered an EK block for the FE yesterday. Hopefully I can actually get one when they launch. This will replace a 3080 FE in my HTPC. I won't do it right away, but I'll probably get a Strix for my main rig to replace my 3090 Strix.

I see people complain about EK, but I've gone through probably 6-7 CPU and 5 GPU blocks in the last 5-6 years and haven't encountered a single issue.


----------



## mirkendargen

dr/owned said:


> I'm skipping Bykski and going Barrow this time around. I've had 2 nickel plating failures now with the 3090 on Bykski blocks. Whatever their plating is nowadays, it's really weak.
> 
> And this is with Mayhem X1 Clear premix so it's not like it's sketchy fluid.


I had one plating failure after about 6 months on one I got right at release, and they sent me a new one free that's been going strong since. This was with Mayhem's Pastel v2. I'm hoping they've resolved whatever their issues were.


----------



## TMavica

Why is there no AIO version of the ROG 4090?
Is there any news about it?


----------



## Glerox

TMavica said:


> Why there is no AIO of ROG 4090....
> is there any news about it?


No, just the MSI Suprim Liquid


----------



## Templar2k

Put in an order for a Palit 4090 GameRock OC, should get within a few days when they go public.


----------



## Nizzen

Templar2k said:


> Put in an order for a Palit 4090 GameRock OC, should get within a few days when they go public.


 "Order"


----------



## z390e

gonna hold out for TI at the very minimum this round

wishing all buyers a great chip


----------



## Templar2k

Nizzen said:


> "Order"


They have a list and it's first come, first served. I'm actually no. 1, and the shop told me they will be getting an allocation from the first batch. I got my 3090 on the day they were released from the same shop, so I'm hopeful it'll be fast again.


----------



## RetroWave78

ttnuagmada said:


> Preordered an EK block for the FE yesterday. Hopefully I can actually get one when they launch. This will replace a 3080 FE in my HTPC. I won't do it right away, but i'll probably get a Strix for my main rig to replace my 3090 strix.
> 
> I see people complain about EK, but I've gone through probably 6-7 CPU and 5 GPU blocks in the last 5-6 years and haven't encountered a single issue.


Thanks for relaying EKWB pre-order availability; I pre-ordered today. That said, my EK 3090 FE block did exhibit nickel plating erosion. In regards to quality, Bykski is on par with EKWB at half the price. I have a mix of Bykski and EKWB fittings in my loop.


----------



## RetroWave78

BigMack70 said:


> Here's hoping I can get a founders edition card... been really happy with my 3090 FE. Ampere was absolutely neutered for any real overclocking and with a launch TDP of 450W I expect that to be the case here also.


It's not that GA102 sucked at overclocking per se, it's that Ampere was clocked so aggressively from the factory, right at the edge of the power/efficiency curve (quite a bit over, actually: performance per watt really plateaus around 1850 MHz @ 0.908v @ 250W and tapers off from there, yielding marginally more performance, i.e. a 5-10% gain, for the jump to 400W+ at 2 GHz).

This is why I'm not even interested in the Strix or other offerings that attempt to push the envelope. If you look, the 3090 Strix, KPE etc. were capable of another 10% frequency over the FE (i.e. 2250 MHz), but this required 600W to hold. Another reason I'm going FE again this time around is water-block availability. Also, a 4090 FE + $250 EKWB block will be around the Strix asking price.

The Strix looks amazing and I applaud the engineering ASUS is packing into it; if I didn't already have a serious loop plumbed, I would definitely consider it if I intended to stay on air.
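The plateau argument can be made concrete with the rough numbers in that post (all anecdotal; taking 7.5% as the midpoint of the claimed 5-10% gain is my assumption):

```python
# Perf-per-watt comparison of the two anecdotal 3090 operating points:
# ~250W near the efficiency plateau vs. ~400W at 2 GHz for a ~5-10% gain.
plateau_watts = 250
oc_watts = 400
oc_perf_gain = 1.075  # midpoint of the claimed 5-10% gain (assumption)

ppw_plateau = 1.0 / plateau_watts
ppw_oc = oc_perf_gain / oc_watts

print(round(ppw_oc / ppw_plateau, 2))  # 0.67: ~33% worse perf per watt
```

In other words, the factory-pushed operating point already trades efficiency hard for the last few percent.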


----------



## yzonker

RetroWave78 said:


> It's not that GA-102 sucked at overclocking per se, it's that Ampere was clocked so aggressively from the factory, right at the edge of the power / efficiency curve (quite a bit over actually, performance per watt really plateaus around 1850 MHz @ .908v @ 250w and tapers off there, yielding marginally more performance, i.e. 5-10% gain for 25% more power at 2GHz @ 400w+).
> 
> This is why I'm not even interested in Strix or other offerings that attempt to push the envelope. If you look, 3090 Strix, KPE etc., were capable of another 10% freq over FE (i.e. 2250 MHz) but this required 600w to hold. Another reason I'm going FE again this time around is water-block availability. Also, 4090 FE + $250 EKWB will be around Strix asking price.
> 
> Strix looks amazing and I applaud the engineering ASUS is packing into it, if I didn't already have a serious loop plumbed I would definitely consider that if I intended to stay on air.


You're going to have 600w anyway. Lol. 









NVIDIA Details GeForce RTX 4090 Founders Edition Cooler & PCB Design, new Power Spike Management
www.techpowerup.com


----------



## RetroWave78

yzonker said:


> You're going to have 600w anyway. Lol.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA Details GeForce RTX 4090 Founders Edition Cooler & PCB Design, new Power Spike Management
> www.techpowerup.com


I just read this in the other forum, I'm actually stoked about this. 400W meant holding 2GHz was only possibly in some scenarios by an aggressive undervolt (i.e. 2000 MHz @ .950v) which was only really stable if the core was kept under 47-48C which meant liquid metal and 1200w of radiator surface area quickly became a necessity. It still dips to 1950 MHz in Timespy even @ .950v @ 400w.


----------



## dr/owned

yzonker said:


> You're going to have 600w anyway. Lol.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA Details GeForce RTX 4090 Founders Edition Cooler & PCB Design, new Power Spike Management
> www.techpowerup.com


Thanks for posting this up. Yeah, I doubt NV is going to let anyone have a BIOS above 600W anyway unless they have 2 connectors, and the Strix don't got dat. So FE it is....

Although I wonder if there's going to be some nonsense about needing a 4x 8-pin adapter to enable 600W... 'cause in theory the 3x is only "rated" for 450W, even though that's complete BS and you can easily slam 400W through a good 8-pin connector.


----------



## yzonker

dr/owned said:


> Thank for posting this up. Yeah I doubt NV is going to let anyone have a BIOS above 600W anyways unless they have 2 connectors, and Strix don't got dat. So FE it is....
> 
> Although I wonder if there's going to be some nonsense about needing a 4x8 pin adapter to enable 600W...cause in theory the 3x is only "rated" for 450...even though that's complete BS and you can easily slam 400W through a good 8 pin connector.


Dunno. Sounds like most companies are including an adapter, but I haven't seen one to know if it's 3 or 4 8-pins. The AIBs might get to the rumored 660W limit by utilizing the PCIe slot power, but I doubt that will make much of a difference (if any).


----------



## BigMack70

I don't understand the enthusiasm about taking these cards to 600W. You wind up generating 30% more power and heat for 10% more clocks for 5% more performance.

It's cool for benchmarking and 3DMark runs, but meaningless for gaming.

IMO the age of overclocking for real-world performance is dead. Both CPU and GPU companies have figured out how to get their products to operate within 5-10% of their limits out of the box. It used to be that 20-35% overclocks were normal and you could easily bump your card up at least one entire performance class by overclocking. Those days are done, and probably never coming back.
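The trade-off in that post works out like this as ratios (illustrative numbers from the post, not measurements):

```python
# The 600W trade-off as ratios: +30% power, +10% clocks, +5% performance.
power_gain = 1.30
clock_gain = 1.10
perf_gain = 1.05

# Efficiency: how much perf per watt you keep after the overclock.
print(round(perf_gain / power_gain, 3))  # 0.808 -> perf/W drops ~19%

# Scaling: what fraction of the extra clocks shows up as performance.
print(round((perf_gain - 1.0) / (clock_gain - 1.0), 2))  # 0.5
```

Only about half the clock bump survives as frames, and perf per watt falls roughly a fifth.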


----------



## dr/owned

BigMack70 said:


> I don't understand enthusiasm about taking these cards to 600W. You wind up generating 30% more power and heat for 10% more clocks for 5% more performance.
> 
> It's cool for benchmarking and 3Dmark runs, but meaningless for gaming.
> 
> IMO the age of overclocking for real-world performance is dead. Both CPU and GPU companies have figured out how to get their products to all operate within 5-10% of their limits out of the box. It used to be that 20-35% overclocks were normal and you could easily bump your card up at least one entire performance class by overclocking. Those days are done, and probably never coming back.


I'd be willing to consume 100% more power for 10% more clocks...

Allegedly they're talking a 3 GHz overclock vs. a 2.5 GHz boost. That would be huge.

EDIT: Also, I think you get better stability if your overclock isn't bouncing off a power limiter every millisecond.


----------



## BigMack70

dr/owned said:


> I'd be willing to consume 100% more power for 10% more clocks...
> 
> Allegedly they're talking 3Ghz overclock vs. 2.5Ghz boost. That would be huge.


There's no way that 3 GHz is a 24/7 clock on air or water cooling. That's a sub-ambient number that means nothing.


----------



## Benni231990

3 GHz will be 100% sub-zero only; you'll never see a 4090 on water at 3 GHz.

Maybe 2750-2800 MHz on water.


----------



## dr/owned

Benni231990 said:


> 100% sub-zero only 3ghz never you will see a 4090 on water with 3ghz
> 
> maybe 2750-2800 on water


Unless this is real: Alleged GeForce RTX 4090 GPU runs at 3.0 GHz, scores 20K points in TimeSpy Extreme - VideoCardz.com

There's no way a 65C peak is LN2.

(If it's real, it's probably some AIB engineer in their lab with cherry-picked stuff though)


----------



## Blameless

No idea how these will actually clock. Samsung 8N to TSMC 4N is a hugely different process. I'm not betting on 3GHz on noisy air in favorable ambients, or even top water, but I'm not going to rule out the possibility without seeing some samples in action.


----------



## Glerox

Most rumors/opinions I've read say 7900XT won't be faster than 4090 so... I've pre-ordered EK's block. Let's go!


----------



## Nizzen

BigMack70 said:


> I don't understand enthusiasm about taking these cards to 600W. You wind up generating 30% more power and heat for 10% more clocks for 5% more performance.
> 
> It's cool for benchmarking and 3Dmark runs, but meaningless for gaming.
> 
> IMO the age of overclocking for real-world performance is dead. Both CPU and GPU companies have figured out how to get their products to all operate within 5-10% of their limits out of the box. It used to be that 20-35% overclocks were normal and you could easily bump your card up at least one entire performance class by overclocking. Those days are done, and probably never coming back.


Overclocking dead?
For gaming, memory OC is the meta. Example: 4800 XMP green Dell DDR5 vs. tweaked to 7000C30.
Got 35% higher minimum fps in Battlefield. This was just overclocking memory, without touching the CPU.

OC is dead?

Looks like this forum is more active than ever, so OC is far from dead


----------



## 7empe

Nizzen said:


> Overclocking dead?
> For gaming, memory oc is the meta. Example: 4800xmp green dell ddr5 vs tweaked to 7000c30.
> Got 35% higher minimumfps in battlefield. This was just overclocking memory, without touching cpu.
> 
> OC is dead?
> 
> Looks like this forum is more active than ever, so OC is far from dead


You're playing at 1080p, right?


----------



## ArcticZero

Looks like the performance difference is enough to justify upgrading my shunt-modded 3090. As much as I want to preorder EK's block, theirs is literally the only component in my loop that looks corroded, thanks to the nickel wearing off.

I know they've since disappeared from public communication, but I'm hoping Optimus comes up with a 4090 block. Otherwise I'm probably going for a Heatkiller or Alphacool.


----------



## Nizzen

7empe said:


> You're playing at 1080p, right?


Some games at 1080p 360Hz and some at 3440x1440 160Hz.

The 4K monitor is just for Discord, HWiNFO, etc.


----------



## 7empe

Nizzen said:


> Some games on 1080p 360hz and some on 3440x1440 160hz
> 
> 4k monitor is just for discord, hwinfo etc


Imagine hwinfo at 8k! So much data to look at


----------



## yzonker

Here's the answer, I suspect, in regards to the adapters for older PSUs.

"4x 8-pin-to-12VHPWR cable (Limited service life with up to 30 connect / disconnects)"



https://www.zotac.com/us/product/graphics_card/zotac-gaming-geforce-rtx-4090-amp-extreme-airo#spec


----------



## 7empe

yzonker said:


> Here's this answer I suspect in regards to the adapters for older PSUs.
> 
> "4x 8-pin-to-12VHPWR cable (Limited service life with up to 30 connect / disconnects)"
> 
> 
> 
> https://www.zotac.com/us/product/graphics_card/zotac-gaming-geforce-rtx-4090-amp-extreme-airo#spec


Is it made of paper or what?


----------



## Glottis

yzonker said:


> "4x 8-pin-to-12VHPWR cable (Limited service life with up to 30 connect / disconnects)"


LMAO, what? Does this mean the 12+4-pin connector is very flimsy and fragile? It has to be the weak link here, as I don't see why the 8-pin side of the cable would have this limitation.


----------



## yzonker

Glottis said:


> LMAO what? Does this mean 12+4 pin connector is very flimsy and fragile? It has to be the weak link here, as I don't see why 8pin connector side of the cable would have this limitation.


None of those connectors are really meant to be plugged/unplugged a large number of times. The connector is always the weak link; it will potentially gradually increase in resistance. If power draw is low, that's not an issue, but that's not what we're doing here.


----------



## Benni231990

Who connects and disconnects the adapter 30 times? xD In a normal life you maybe connect and disconnect the adapter 2-3 times


----------



## BigMack70

Nizzen said:


> Overclocking dead?
> For gaming, memory oc is the meta. Example: 4800xmp green dell ddr5 vs tweaked to 7000c30.
> Got 35% higher minimumfps in battlefield. This was just overclocking memory, without touching cpu.
> 
> OC is dead?
> 
> Looks like this forum is more active than ever, so OC is far from dead


I stand corrected. One niche scenario where you *can* achieve meaningful performance gains, by overclocking memory for low-resolution, super-high-refresh-rate, CPU-bottlenecked games, *completely* negates my point about CPU and GPU overclocking being dead for meaningful performance gains. Completely. I can't believe how wrong I am.


----------



## sew333

Will there be stock problems in shops again after the 4090 launch?


----------



## Hulk1988

Benni231990 said:


> Who connects and disconnects the adapter 30 times? xD In normal use you maybe connect and disconnect the adapter 2-3 times.


I can already see some testers on YouTube will be testing it.


----------



## Benni231990

der8auer will test it for sure.

But I have another question: in some pictures we see a 3x8pin adapter to the new 1x16pin, while in other pictures you see a 4x8pin adapter for 600 watts.

If I buy a "cheap" TUF or Zotac or something else and flash the card with the Strix BIOS (according to rumours it will be 600 watts), and that card only comes with a 3x8pin adapter, can I still give the card 600 watts, or do I need a 4x8pin adapter?


----------



## PLATOON TEKK

“As soon as DLSS 3 was enabled, the GPU saw the wattage drop to 348W or a 25% reduction. This also increased the perf per watt to 0.513, an increase of 3.8x.”

Interesting to see the wattage fall so drastically because of switching to tensor cores. Also apparently with “overclocking” the gpu does hit 3k clock.

Wccftech article



jtclfo said:


> I’m being sent an EK block and haven’t in general had any issues in the past but just curious what to be aware of about them from your experience?
> 
> Also I could only wish he would go to Galax but I don’t think it’ll happen. Asus will most likely be the new sponsor.


At first, literally all of my WC components were from EK (back when you could only order from them), and over time the failures became so frequent and the machining so bad that I decided to switch. Every time I've tested 1:1 comparisons, even with "lower level" Bykski or Barrow blocks, they always performed better and rarely failed. I've had probably 6-7 EK blocks and components fail before I phased it all out.

Having seen a few EK blocks for the 3090s and the machining getting even worse, I don't know if I'll buy from them again.


----------



## bmagnien

I highly doubt NV will allow 600W over their included 3:1 adapter, as it's outside of spec. Going beyond 450W might be limited to cases where the card detects a direct connection over the 12+4-pin PCIe Gen 5 ATX 12VHPWR connector, using those additional 4 sense pins, meaning you'd need a new ATX 3.0 PSU with a native Gen 5 connector. Or it might require a third-party 4:1 adapter.


----------



## BigMack70

PLATOON TEKK said:


> Interesting to see the wattage fall so drastically because of switching to tensor cores. Also apparently with “overclocking” the gpu does hit 3k clock.
> 
> Wccftech article


That article suggests the "stock" clock as 2800-2850 MHz, so 3 GHz would be a 5% overclock - about what I'd expect.


----------



## Blameless

bmagnien said:


> I highly doubt NV will allow 600W over their included 3:1 adapter, as it's outside of spec. Going beyond 450W might be limited to cases where the card detects a direct connection over the 12+4-pin PCIe Gen 5 ATX 12VHPWR connector, using those additional 4 sense pins, meaning you'd need a new ATX 3.0 PSU with a native Gen 5 connector. Or it might require a third-party 4:1 adapter.


Short all four sense pins to ground and it's a 600w connector as far as the card can tell.
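For reference, this is the sense-pin power encoding as I understand the ATX 3.0 / PCIe CEM 5.0 spec. Only two of the four sideband pins (SENSE0 and SENSE1) set the power limit; verify against the actual spec documents before modding anything:

```python
# 12VHPWR sideband encoding sketch: the card samples SENSE1/SENSE0 and
# maps grounded/open combinations to a maximum sustained cable power.
POWER_LIMIT_W = {
    ("gnd",  "gnd"):  600,
    ("gnd",  "open"): 450,
    ("open", "gnd"):  300,
    ("open", "open"): 150,  # what the card must assume with floating sense pins
}

def advertised_limit_w(sense1: str, sense0: str) -> int:
    return POWER_LIMIT_W[(sense1, sense0)]

print(advertised_limit_w("gnd", "gnd"))  # both sense pins grounded -> 600
```

This is why grounding the sense pins makes the cable look like a full 600W connector to the card, and also why an adapter with no sense pins at all should be treated as 150W-class.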



PLATOON TEKK said:


> “As soon as DLSS 3 was enabled, the GPU saw the wattage drop to 348W or a 25% reduction. This also increased the perf per watt to 0.513, an increase of 3.8x.”
> 
> Interesting to see the wattage fall so drastically because of switching to tensor cores.


It probably has less to do with any load on the tensor cores themselves, and more to do with a bottleneck elsewhere keeping frame rates from being higher.


----------



## yzonker

bmagnien said:


> I highly doubt NV will allow 600W over their included 3:1 adapter, as it's outside of spec. Going beyond 450W might be limited to cases where the card detects a direct connection over the 12+4-pin PCIe Gen 5 ATX 12VHPWR connector, using those additional 4 sense pins, meaning you'd need a new ATX 3.0 PSU with a native Gen 5 connector. Or it might require a third-party 4:1 adapter.


"If you are using a previous-gen power supply, an adapter cable is included in the box, allowing you to plug in three 8-pin power connectors, with an optional fourth connector for more overclocking headroom."

NVIDIA GeForce News: Introducing GeForce RTX 40 Series Graphics Cards (www.nvidia.com)


----------



## bmagnien

yzonker said:


> "If you are using a previous-gen power supply, an adapter cable is included in the box, allowing you to plug in three 8-pin power connectors, with an optional fourth connector for more overclocking headroom."
> 
> NVIDIA GeForce News: Introducing GeForce RTX 40 Series Graphics Cards (www.nvidia.com)


so....it's a 4:1 adapter in the FE box? What a convoluted way to say that lol ***


----------



## yzonker

But to support my question of the number of 12v pins, 2x8pin!! What a circus... 



https://www.corsair.com/us/en/Categories/Products/Accessories-%7C-Parts/PC-Components/Power-Supplies/600W-PCIe-5-0-Gen-5-12VHPWR-Type-4-PSU-Power-Cable/p/CP-8920284


----------



## bmagnien

yzonker said:


> But to support my question of the number of 12v pins, 2x8pin!! What a circus...
> 
> 
> 
> https://www.corsair.com/us/en/Categories/Products/Accessories-%7C-Parts/PC-Components/Power-Supplies/600W-PCIe-5-0-Gen-5-12VHPWR-Type-4-PSU-Power-Cable/p/CP-8920284


Lol that's incredible. Plug that bad boy into a 650w PSU with a mobile CPU.


----------



## savormix

Let's see what the hybrid-cooled models go for here in the EU, probably no less than 3k.

I still don't understand the AIB love for air cooling on a chip that starts with a 450W "throttled" (a.k.a. compatibility) VBIOS. There's no way to efficiently air-cool this thing even on an open bench (short of some industrial blower or hooking up AC directly), even with the 450W VBIOS.


----------



## Hawk777th

Glad to see the forum more active! I am planning to get the 4090 and build a new rig when it comes out. I am still on a 5960X @ 5Ghz and Titan Xp so should be a huge upgrade for me. I am excited to see the reviews.


----------



## Krzych04650

On the topic of power connectors, Corsair just announced a 12VHPWR cable for their existing PSUs: 600W PCIe 5.0 / Gen 5 12VHPWR Type-4 PSU Power Cable (corsair.com)

Clock behavior is going to be the same as previous generations: that 2500 MHz boost will really be ~2800 MHz actual game clock, which was already confirmed: NVIDIA GeForce RTX 4090 Runs At Up To 2850 MHz at Stock Within 50C Temps In Cyberpunk 2077, DLSS 3 Cuts GPU Wattage By 25% (wccftech.com)

Based on that, 3 GHz should be easy and possible even on air. Hard to say what the max will be; it depends on how choked the stock settings are, but if they were considering coming out with a 600W model, there must be scaling. Rumors were you can go from 20K TSE to 24K TSE with OC, which would be in line with gains on previous generations like the 2080 Ti or 3090; those had around 18% OC potential vs the choked FE.

The FE is already confirmed to have a 600W max power limit from what I understand, which is quite generous for an FE: a 133% power limit. Turing and Ampere needed 150% for almost their full potential on water, so it may still be a bit low, but maybe Lovelace is pushed harder out of the box already. I don't know if AIBs are allowed to go higher; I think rumors say 660W max. Which makes sense, as 675W is the max within spec with one connector plus the PCIe slot.
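The spec math above, spelled out (numbers taken from the discussion, not from any datasheet):

```python
# Max in-spec board power with a single 12VHPWR connector, plus the
# FE power-limit slider implied by a 450 W TDP and a 600 W cap.
SLOT_W = 75        # PCIe x16 slot allowance within spec
CONNECTOR_W = 600  # one 12VHPWR at its full sense-pin rating
STOCK_TDP_W = 450

max_in_spec_w = SLOT_W + CONNECTOR_W              # 675 W total
pl_slider_pct = 100 * CONNECTOR_W // STOCK_TDP_W  # "133%" power limit

print(max_in_spec_w, pl_slider_pct)
```

So a rumored 660W AIB limit sits just under the 675W single-connector ceiling, leaving a little margin for the slot contribution.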


----------



## ttnuagmada

Benni231990 said:


> 100% sub-zero only. You will never see a 4090 on water at 3GHz.
> 
> Maybe 2750-2800 on water.


I wouldn't be so sure, don't forget that the stock 3090 boost clock was 1695mhz. Nvidia always clocks the big chips super-conservative.


----------



## mirkendargen

Looks like Cablemod makes 3x8pin at the PSU to 12VHPWR for the common enthusiast modular PSUs. It's not completely clear from their wording that these plug straight into the PSU like the Corsair one rather than being an adapter, but the fact that they sell them for different PSU models tells me they must be.

I have a Corsair PSU but looking at the picture of their cable I'm worried it might not be long enough, so it's nice that I can get something longer from Cablemod if needed. I know every card is gonna come with an adapter, but less connections is a good thing here if it's possible.


----------



## ttnuagmada

yzonker said:


> But to support my question of the number of 12v pins, 2x8pin!! What a circus...
> 
> 
> 
> https://www.corsair.com/us/en/Categories/Products/Accessories-%7C-Parts/PC-Components/Power-Supplies/600W-PCIe-5-0-Gen-5-12VHPWR-Type-4-PSU-Power-Cable/p/CP-8920284


Well that's scary.


----------



## dr/owned

ttnuagmada said:


> Well that's scary.


Not really. I ran 800W through 2x8 pin on my shunted 3090 TUF and it was fine. If Corsair is selling a product to do 600W with 2 connectors then it's WELL within spec. The "150W" thing was a joke assuming ****ty connectors, high ambient temperatures, 50 insertion cycles, 20awg cable, etc.


----------



## dr/owned

mirkendargen said:


> Looks like Cablemod makes 3x8pin at the PSU to 12VHPWR for the common enthusiast modular PSUs. It's not completely clear from their wording that these plug straight into the PSU like the Corsair one rather than being an adapter, but the fact that they sell them for different PSU models tells me they must be.
> 
> I have a Corsair PSU but looking at the picture of their cable I'm worried it might not be long enough, so it's nice that I can get something longer from Cablemod if needed. I know every card is gonna come with an adapter, but less connections is a good thing here if it's possible.


Link for people who don't want to google:

CableMod (3x8 pin, corsair, 60cm): 
CableMod C-Series Pro ModMesh Sleeved 12VHPWR PCI-e Cable for Corsair – CableMod Global Store 
CableMod C-Series Pro ModFlex Sleeved 12VHPWR PCI-e Cable for Corsair – CableMod Global Store

Corsair (2x8pin, unknown length) : https://www.corsair.com/us/en/Categories/Products/Accessories-|-Parts/PC-Components/Power-Supplies/600W-PCIe-5-0-12VHPWR-Type-4-PSU-Power-Cable/p/CP-8920284


----------



## J7SC

I think we might be in for a pleasant surprise re. clocks...apart from the 3 GHz speculation for the 4090, there's also this 4 GHz, ahem, rumour (bring a salt shaker) for the next AMD GPU... In any event, my 7nm 6900XT already does 2800+ MHz effective w/o much trouble (just really good w-cooling), and the 3090 Strix is regularly around 2200+ MHz (again, extensive w-cooling). On the 3090, stock vbios is 450W, with 520W on secondary and 1 kW as an option (= realistically kept < 600W).

What I do wonder about though are the transient spikes with the 4090 (and subsequent 4090 Ti) and some makes of PSUs, in addition to cabling.


----------



## Glottis

yzonker said:


> But to support my question of the number of 12v pins, 2x8pin!! What a circus...
> 
> 
> 
> https://www.corsair.com/us/en/Categories/Products/Accessories-%7C-Parts/PC-Components/Power-Supplies/600W-PCIe-5-0-Gen-5-12VHPWR-Type-4-PSU-Power-Cable/p/CP-8920284





bmagnien said:


> Lol that's incredible. Plug that bad boy into a 650w PSU with a mobile CPU.





ttnuagmada said:


> Well that's scary.


You all do realize that these 2x8pin are on the PSU side? It's not the same as 2x8pin standard on the GPU side. PSU side anything goes, it's up to PSU maker. This cable is fine.


----------



## dr/owned

ttnuagmada said:


> Well that's scary.





Glottis said:


> You all do realize that these 2x8pin are on the PSU side? It's not the same as 2x8pin standard on the GPU side. PSU side anything goes, it's up to PSU maker. This cable is fine.


This.

12VHPWR is 6 +12V pins.

Each PSU-side connector is 4 +12V pins, so 2 of them is sufficient to cover the 12-pin with 2 pins to spare.

8-pin PCIe connectors only have 3 +12V pins.

I assume they were doing 3x 8-pin adapters because they're figuring some losses in the adapter connectors.

4x8 pin adapters are stupid and probably only being done assuming the stupid 150W 8pin limit that isn't relevant to real PSUs.
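Putting the pin counts above into per-pin current (idealized: nominal 12.0 V and perfectly even current sharing):

```python
# Compare per-pin current for the connectors discussed in this post.
def amps_per_pin(watts: float, twelve_volt_pins: int) -> float:
    return watts / 12.0 / twelve_volt_pins

print(round(amps_per_pin(600, 6), 2))  # 12VHPWR: 6 x +12V pins
print(round(amps_per_pin(150, 3), 2))  # 8-pin PCIe at its nominal 150 W rating
print(round(amps_per_pin(300, 4), 2))  # PSU-side 8-pin carrying half of 600 W
```

On these assumptions the 12VHPWR pins run at roughly double the current of an 8-pin PCIe pin at its nominal rating, which is why connector quality matters more here than it used to.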


----------



## Benni231990

So I don't need a 4x8pin adapter for 600 watts? And the 3x8pin adapter will also work for the 600-watt Strix BIOS?


----------



## ttnuagmada

https://www.corsair.com/us/en/Categories/Products/Accessories-%7C-Parts/PC-Components/Power-Supplies/12-pin-GPU-Power-Cable/p/CP-8920274



Is this the same thing?


----------



## yzonker

ttnuagmada said:


> https://www.corsair.com/us/en/Categories/Products/Accessories-%7C-Parts/PC-Components/Power-Supplies/12-pin-GPU-Power-Cable/p/CP-8920274
> 
> 
> 
> Is this the same thing?


No, that's the old one, probably for the 3090 Ti. It doesn't have the sense pins, for one thing.


----------



## bmagnien

J7SC said:


> What I do wonder about though are the transient spikes with the 4090 (and subsequent 4090 Ti) and some makes of PSUs, in addition to cabling.


"The card features a 23-phase VRM (20-phase GPU + 3-phase memory). NVIDIA claims that it has re-architected its VRM to offer superior transient voltage management. This is specifically to minimize the kind of spikes or excursions we've seen with previous-gen high-end graphics cards such as the RTX 3090 Ti and Radeon RX 6900 Series. These spikes often causes spontaneous shutdowns on some power supplies, even if they had a much higher wattage rating than required. "

This is exactly what would happen to me. The 3090 520W BIOS would shut off my PC in Witcher 3 of all games, with a 1000W PSU. The graph below shows the spikes being kept in line, and the article states that the same might not hold true for AIB cards if their VRM solution is subpar.
















NVIDIA Details GeForce RTX 4090 Founders Edition Cooler & PCB Design, new Power Spike Management (www.techpowerup.com)

NVIDIA detailed the design of its GeForce RTX 4090 Founders Edition graphics card in a briefing with us today. While the new Founders Edition looks very similar to the RTX 3090 Ti Founders Edition, NVIDIA says it has made several changes to its design. The metal outer frame now comes with a...


----------



## GraphicsWhore

Aiming for 4090 Strix + Optimus. Let's go.


----------



## J7SC

...it's interesting how small the actual PCB of the 4090 seems to be (time-stamped YT below) on full water-block models compared to the air-cooled 'variety'...


----------



## mirkendargen

J7SC said:


> ...it's interesting how small the actual PCB of the 4090 seems to be (time-stamped YT below) on full water-block models compared to the air-cooled 'variety'...


Everyone is just doing what Nvidia did with the FE's on 3090's and having a short card with a blow through cooler on the end.


----------



## Glerox

mirkendargen said:


> Everyone is just doing what Nvidia did with the FE's on 3090's and having a short card with a blow through cooler on the end.


I wonder if the reference design of the 4090 for AIBs is now the same as the FE design (for the 3090, the reference design was longer than the FE)


----------



## RetroWave78

I just ordered this from CableMod, this is for Asus ROG Thor 1200w but my understanding is that PSU side, 8 pin PCIE cables are all the same so this should work for anyone who is interested (white, 5x white alum combs, 700mm, black alum cable shroud on 16 pin side, 4x8 PCIE PSU side):

https://custom.cablemod.com/de2d69ee7cce


----------



## J7SC

mirkendargen said:


> Everyone is just doing what Nvidia did with the FE's on 3090's and having a short card with a blow through cooler on the end.


...yeah, Asus did that with the 3090 Strix as well, which came in at 315 mm with the stock air cooler but ~275 mm 'just PCB'-wise (spoiler)...still a bit shorter than the custom 6900XT PCB though. The 4090s' PCB length could vary a bit as there are different phase counts and such depending on the specific model, but the inno3D shown above is exactly *200 mm*. With something like an Asus Gene, that (or the 4090 Ti version) could be a killer small-form-factor build project next year.


Spoiler


----------



## dr/owned

RetroWave78 said:


> I just ordered this from CableMod, this is for Asus ROG Thor 1200w but my understanding is that PSU side, 8 pin PCIE cables are all the same so this should work for anyone who is interested (white, 5x white alum combs, 700mm, black alum cable shroud on 16 pin side, 3x8 PCIE PSU side):
> 
> 
> 
> 
> 
> 
> Configurator – CableMod Global Store
> 
> 
> 
> 
> 
> 
> 
> custom.cablemod.com


No, you can't rely on PSU-side cables to ever be the same pinout between brands...even if the plug fits. That's a real quick way to blow up something.


----------



## ttnuagmada

So do we know how Nvidia is going to sell the FE's? I know the 3xxx's could be bought from Best Buy, but I think Best Buy is doing them in-store only now. Is that the only way we'll be able to get them?


----------



## GRABibus

tps3443 said:


> The way I have my 3090KP set up, it's 18% faster than a 3090FE. That's great in my book. While you could get other 3090's to do this, you'll spend the same or even more than a 3090KP cost.
> 
> The 3090KP was a great card because right out of the box you had your 520 watts available. 1995Mhz Bin written on the die, good memory. And at just default speeds, it puts out more Tflops than a 3090Ti FE.
> 
> But, enough is enough! I’ve enjoyed it so much. I’m looking forward to the RTX4000 series.


Same for me.
I've enjoyed my 3090 KP Hybrid so much...
BUT, let's go for the 4090!


----------



## GRABibus

MSI moves the RTX 4090 Suprim to a Liquid version!

MSI has just unveiled an RTX 4090 Suprim, but in a Liquid version that uses a 240 mm radiator to keep the GPU cool. (overclocking.com)

MSI GeForce RTX™ 4090 SUPRIM LIQUID 24G (www.msi.com)

GeForce RTX™ 4090 SUPRIM LIQUID 24G features liquid cooling for the GPU and air cooling for VRMs, and a sturdy brushed-metal backplate providing passive cooling. The MSI SUPRIM LIQUID is easy to install and requires no additional maintenance.


----------



## Knight091

z390e said:


> Praying he joins Asus now that EVGA is about to be RIP.


Ya, I've never seen a company kill themselves like EVGA has. I read the news and thought it was a joke....


----------



## GRABibus

Knight091 said:


> Ya I never seen a company kill themselves like EVGA has done. I read the news and thought it was a joke....


They didn't kill themselves.
They will still earn money without Nvidia GPUs; otherwise they wouldn't have made this decision.

They just don't want to be « Nvidia slaves » anymore.


----------



## z390e

GRABibus said:


> They don’t kill themselves.
> They will still earn money without nvidia GPU’s, otherwise they wouldn’t have taken this decision.
> 
> They just don’t want to be « Nvidia slaves » anymore.


Their front page splash ad is still an EVGA RTX 30 series with Spider Man.

Their company will be bankrupt shortly. I am happy to bet any person on here they will not be a real company in 5 years.


----------



## RetroWave78

dr/owned said:


> No, you can't rely on PSU-side cables to ever be the same pinout between brands...even if the plug fits. That's a real quick way to blow up something.


This is where I'm confused, if the PSU side pinout for 8 pin PCIE is all the same and they are all going to the same GPU (in this case 12+4 micro), how can there possibly be a scenario of mismatched pins? I understand CPU, 24 pin, molex and SATA pinouts can and do differ PSU side, but 8 pin PCIE they are all the same.


----------



## Knight091

GRABibus said:


> They don’t kill themselves.
> They will still earn money without nvidia GPU’s, otherwise they wouldn’t have taken this decision.
> 
> They just don’t want to be « Nvidia slaves » anymore.


Nvidia card sales make up like 80% of their sales. I think that is killing themselves.


----------



## mirkendargen

Knight091 said:


> Nvidia card sales make up like 80% of their sales. I think that is killing themselves.


Apparently they were taking a loss on everything 3080 and above, so...GPUs being 80% of their revenue isn't a good indication of how solvent they still are as a company. If anything, it's possible they're more profitable without GPUs, although I suspect even the unprofitable GPUs drove traffic and sales of their more profitable stuff.

I also think there's room for them to expand in the motherboard market. They were great in the nForce/X58/X79 days but only dip their toes in a bit now, and I think there's room in that space for another more serious player.


----------



## J7SC

Going from not making any profit to an actual loss per card is a big step since profit = revenue minus cost (with cost including R+D, manufacturing, cost of capital and labour etc). What I find a bit sad is that 'very senior NVidia managers' (?) knew in April, according to Gamers Nexus and a few others per their first-hand interview with the bosses of EVGA. That sounds like hard bargaining and entrenched positions that did not resolve - but what do EVGA's employees think about that lead time as they seemed to be genuinely surprised by the EVGA NVidia announcement ?

But back to the business at hand...if many of the 4090s come with 4x8 pin cable dongles, will we actually see up to 675W (incl. PCI slot) with full PL slider on some 4090s (before any potential XOC vbios) ? I have 1300W platinum-rated PSUs which have been juicing 2x 2080 Ti with a combined max of 760W over 4x 8 pin PCIe w/o any problems since late '18.


----------



## GRABibus

J7SC said:


> Going from not making any profit to an actual loss per card is a big step since profit = revenue minus cost (with cost including R+D, manufacturing, cost of capital and labour etc). What I find a bit sad is that 'very senior NVidia managers' (?) knew in April, according to Gamers Nexus and a few others per their first-hand interview with the bosses of EVGA. That sounds like hard bargaining and entrenched positions that did not resolve - but what do EVGA's employees think about that lead time as they seemed to be genuinely surprised by the EVGA NVidia announcement ?
> 
> But back to the business at hand...if many of the 4090s come with 4x8 pin cable dongles, will we actually see up to 675W (incl. PCI slot) with full PL slider on some 4090s (before any potential XOC vbios) ? I have 1300W platinum-rated PSUs which have been juicing 2x 2080 Ti with a combined max of 760W over 4x 8 pin PCIe w/o any problems since late '18.


My Corsair HX1200 should be ok also...I hope.


----------



## ArcticZero

Also hoping my Seasonic Prime PX-1300 will be fine since I just got it a few weeks ago. Corsair has already released a statement about compatibility on existing SKUs though so it should be good to go.


----------



## Netarangi

Since we're talking PSUs here... I currently have an HX750 and I assume this isn't enough for a 4080 16GB.

What's the best course of action here? I've just seen the new PCIe 5 / ATX 3.0 units from Seasonic are due for release in December. Are my options to either wait for the new PSUs in December or get a 1200W unit and use adapter cables?


----------



## GRABibus

https://www.corsair.com/fr/fr/Cat%C3%A9gories/Produits/Accessoires-%7C-Pi%C3%A8ces/Composants-de-PC/Alimentations/600W-PCIe-5-0-12VHPWR-Type-4-PSU-Power-Cable/p/CP-8920284


----------



## Glottis

Netarangi said:


> Since we're talking PSU's here.. I currently have a hx750 and I assume this isn't enough for a 4080 16gb.
> 
> What's the best course of action here? I've just seen the new pci5 atx 3.0 from Sea Sonic are due for release in December. Are my options to either wait for new PSU's in December or get a 1200w and use adapter cables?


Pretty sure your Corsair PSU is Type 4, so all you need is this cable https://www.corsair.com/12vhpwr
HX750 is enough for a 4080 16GB (as per Nvidia's spec sheet).


----------



## stahlhart

Uh oh.


----------



## Netarangi

Glottis said:


> Pretty sure your Corsair PSU is Type 4, so all you need is this cable https://www.corsair.com/12vhpwr
> HX750 is enough for a 4080 16GB (as per Nvidia's spec sheet).


Awesome I don't need a new PSU then!

Bad news, now I can justify buying one.. My morals are saying no but my body is saying yes.

So I'd just need one of those cables?


----------



## Knight091

People need to watch this...


----------



## Netarangi

Knight091 said:


> People need to watch this...


Fine I'll get a new PSU


----------



## mirkendargen

Knight091 said:


> People need to watch this...


That video makes no sense and is complete fear mongering click bait.

1. If the problem is the number of insertions the 16pin connector can survive, how the hell does a new PSU help? It's the same connector.

2. He goes on and on about the 16 pin connector insertions, then is like "oh Nvidia said the problem was actually on the PSU side of the adapter" .....k?

3. He talks about the sense pins missing. They're missing cause the adapter in his hand isn't actually a 12VHPWR adapter, the actual ones have the sense pins.

4. He talks like the sense pins are some sort of constant 2 way negotiation with the GPU requesting power, the PSU saying what is available, etc. Then shows *on the screen* what the spec of the sense pins actually is. It's just a constant ground/neutral combo on the pins to let the GPU know what the PSU supports. No reason at all an adapter can't pass through the ground to the appropriate sense pins based on the number of 8pins connected to it.

5. He says your GPU might pull <ridiculous large current> at boot up and fry your PSU. Huh? Then gives a hypothetical that your PSU has 2+ rails and you plug all the plugs to the adapter into the same one and bad things happen. What if you plug all the 8pins to your current GPU into that same rail....? What is suddenly different now?

At the end of the day, you either have a PSU that can handle a potentially 600w GPU or you don't, that's what actually matters. What plugs come out of it being some giant concern is just FUD.


----------



## yzonker

He's been wrong so many times lately it isn't even funny. This is the guy that told everyone to run out and buy a 30 series card right before the giant price drop (because there would not be a better time!). He also stated in one video that Nvidia wasn't releasing anything this year and didn't care what AMD did. And there was one more, but I've forgotten what it was now. Nevertheless, he has proven himself clueless lately. 

And I just watched the video above and everything @mirkendargen said is correct.

The plugging in multiple times issue always exists, the difference here is that we are putting a lot of current through 6 pins in that plug (8 amps or more depending on the load distribution between them). That could be an issue certainly for a worn plug, but has nothing to do with the PSU.


----------



## mirkendargen

I mean don't get me wrong, I plan to get some form of direct PSU cable rather than use an adapter, I talked about it earlier in this thread....but not because I think an adapter is going to burn my house down. It's because I have enough of a cabling rat nest as is, and I'm willing to pay $20-$30 to not add to it, lol.


----------



## RetroWave78


mirkendargen said:


> That video makes no sense and is complete fear mongering click bait.
> 
> 1. If the problem is the number of insertions the 16pin connector can survive, how the hell does a new PSU help? It's the same connector.
> 
> 2. He goes on and on about the 16 pin connector insertions, then is like "oh Nvidia said the problem was actually on the PSU side of the adapter" .....k?
> 
> 3. He talks about the sense pins missing. They're missing cause the adapter in his hand isn't actually a 12VHPWR adapter, the actual ones have the sense pins.
> 
> 4. He talks like the sense pins are some sort of constant 2 way negotiation with the GPU requesting power, the PSU saying what is available, etc. Then shows *on the screen* what the spec of the sense pins actually is. It's just a constant ground/neutral combo on the pins to let the GPU know what the PSU supports. No reason at all an adapter can't pass through the ground to the appropriate sense pins based on the number of 8pins connected to it.
> 
> 5. He says your GPU might pull <ridiculous large current> at boot up and fry your PSU. Huh? Then gives a hypothetical that your PSU has 2+ rails and you plug all the plugs to the adapter into the same one and bad things happen. What if you plug all the 8pins to your current GPU into that same rail....? What is suddenly different now?
> 
> At the end of the day, you either have a PSU that can handle a potentially 600w GPU or you don't, that's what actually matters. What plugs come out of it being some giant concern is just FUD.


Going by one of the comments under the video, if the sense pins are absent the card is limited to 150W from that cable. Jay has it backwards here, implying that if the sense pins are absent there is no limit to the power pulled.

Also, this should be pointed out: Jay basically worked PR for EVGA, if you haven't noticed. Now that EVGA is no longer partnered with Nvidia, it's open season on big N (this is refreshing, actually). EVGA sells power supplies; in fact their PSU sales rivaled their GPU sales. The marketing pitch here is that you need a new PSU. All that is missing is for Jay to recommend an EVGA PSU at the end of the video.

While we are on the subject of how various tech-tubers are covering the 40 launch debacle (yes, listing a $900 70 card is 100% debacle territory) the following videos are recommended viewing to temper our enthusiasm for the launch. 

All of this said, I intend to pick up a 4090, but if I didn't have a Pimax 8KX or Samsung Odyssey G9 Neo to push, I absolutely would wait for RDNA3.

To me the 4090 is oddly the only card that presents a value proposition, and this is only because I'm sitting here with a $1500 3090 that I actually paid $2k for in a local Craigslist sale way back in early 2021. At the time I was making $10 a day off my 2080 Ti mining Ethereum, and rough math at a $10-20 daily return meant the card would pay for itself in roughly 6-12 months, accounting for energy use.

That's another thing I don't see mentioned much when discussing how out of touch 40 series pricing is: many if not most of us who paid scalper prices could expect a return on investment. Now that Ethereum has gone from Proof of Work to Proof of Stake and no money is to be had mining with a GPU, this is no longer a value proposition but an extreme luxury item.

I understand why Nvidia is pricing the 40 series this high: TSMC told Nvidia to pound sand when they tried to back out of their wafer allocation after realizing that 3rd-party vendors, INCLUDING EVGA, were unable to sell their 30 series cards in anticipation of Ethereum going to Proof of Stake.

At the heart of the matter here is shortsightedness. Nvidia threw its most diehard, loyal consumer segment, those who return every 3 years to purchase another gaming GPU, under the bus when faced with the prospect of selling all of their inventory to crypto miners hand over fist (remember "Bounce Alert"?), with no care whatsoever as to what would happen should BTC experience another sharp decline like it did in the fall of 2017.

And so here we are, with a $1600 80-class card (yes, the "3090" and "4090" are 100% 80-class cards simply rebadged as 90-class; see the 12GB 4080) and $900 70-class cards, priced this way to make the warehouses upon warehouses of surplus 30 series cards seem like a bargain.

And the only reason Nvidia is even releasing the 40 series now is because TSMC refused to allow Nvidia to back out of their 5nm wafer allotment. 

Thankfully, those of us who can afford to do so, stand a much better chance of actually buying a 4090 at launch this time around as the demand from miners and gaming enthusiasts, if current sentiment is any indication, should be non-existent. 

Everyone is waiting for RDNA3. Part of me is scratching my head as to why I'm getting a 4090 at all after having said all of this. Nvidia trying to shove Zuckerberg's Great Reset Metaverse down our throats during the reveal was nearly too much to stomach, in fact, that's precisely when I stopped watching. 

Nvidia justifies their ridiculous prices with walled-garden features (Optical Flow DLSS 3.0, "only the 40 series can do it"), whereas AMD offers all of their technologies for free, irrespective of card or manufacturer (FSR, FreeSync).

Also, regarding their claimed 2x rasterization uplift over Ampere: remember, people, we've seen this language before. They said the 3080 was 2x faster than the 2080 when in actuality it was maybe 40% faster.

Same again here: the 4090 will be roughly 60-70% faster than the 3090 in rasterization. An improvement over the gain we saw between Turing and Ampere, to be sure, but again, get your wallet out, you're going to pay for it.

I'm hearing the top-tier RDNA3 card will be within 15% of the 4090 in rasterization, will still undoubtedly get owned in RT, may hit 4GHz, may be more efficient, and will certainly be cheaper. If AMD releases a card within 15% of the 4090 for the price of a 16GB 4080, it's absolutely game over for Nvidia.

Nvidia made their bed with the crypto fiasco. We warned them this was a potential outcome; we asked them to curb the purchase quantity to miners, to make some effort to get the majority of the cards into the hands of gamers. All they saw was dollar signs. Here we are. It's humanity's main weakness: we always choose short term gain / benefit over long term stability.


----------



## dr/owned

mirkendargen said:


> At the end of the day, you either have a PSU that can handle a potentially 600w GPU or you don't, that's what actually matters. What plugs come out of it being some giant concern is just FUD.


EE checking in: Yup. Your PSU either works fine...or it clicks off because some protection feature was triggered. Or you bought a really cheap and crappy one that makes no sense to pair with a >$1000 GPU...(you can get a great PSU that'll last forever for a couple hundred bucks)

There is no point in pre-emptively upgrading your PSU.

Also, for every time I heard "but but but tra-transients" like they're the boogeyman....Steve @GN didn't even FIND any power supplies except a single NZXT SFF one that was incorrectly tripping OCP. He'd be blasting out a 3 hour video jerking himself off if he found any real evidence of the problem he wants to find. But no one seems to have watched the video and actually understood it. "oh look we used an oscilloscope zoomed down to nanosecond resolution and found power spikes!". Ok, cool story bro. Let me know when that's a problem for the ton of capacitors on the PCB and in the power supply...


----------



## mirkendargen

RetroWave78 said:


> Going by one of the comments under the video, if the sense pins are absent the PSU is limited to 150w per cable. Jay has it backwards here, implying that if the sense pins are absent there is no limit to power pulled.
> 
> Also, this should be pointed out, Jay basically worked PR for EVGA if you haven't noticed, now that EVGA is no longer partnered with Nvidia it's open season on big N (this is refreshing actually). EVGA sells power supplies, in fact their PSU sales rivaled their GPU sales. The marketing pitch here is that you need a new PSU. All that is missing is for Jay to recommend an EVGA PSU at the end of the video.
> 
> While we are on the subject of how various tech-tubers are covering the 40 launch debacle (yes, listing a $900 70 card is 100% debacle territory) the following videos are recommended viewing to temper our enthusiasm for the launch.
> 
> All of this said, I intend to pick up a 4090, but if I didn't have a Pimax 8KX or Samsung Odyssey G9 Neo to push, I absolutely would wait for RDNA3.
> 
> To me the 4090 is oddly the only card that presents a value proposition, and this is only because I'm sitting here with a $1500 3090 that I actually paid $2k for local sale off of craigslist way back in early 2021 because at the time I was making $10 a day off of my 2080 Ti mining Ethereum and rough math at $10-20 return on investment would mean the card would pay for itself in roughly 6-12 months time, accounting for energy use.
> 
> That's another thing that I don't see mentioned much when discussing how out of touch 40 series pricing is, that many if not most of us who paid scalpers prices could expect a return on investment. Now that Ethereum has gone from Proof of Work to Proof of Stake and no money is to be had by mining with a GPU, this is no longer a value proposition, but an extreme luxury item.
> 
> I understand why Nvidia is pricing the 40 series this high, that TSMC told Nvidia to pound sand when they tried to back out of the wafer allocation after realizing that 3rd party vendors, INCLUDING EVGA, were unable to sell their 30 series cards in anticipation of Ethereum going to Proof of Stake.
> 
> At the heart of the matter here is shortsightedness. Nvidia threw its most diehard, loyal consumer segment, those who return every 3 years to purchase another gaming GPU, under the bus when faced with the prospect of selling all of their inventory to crypto miners hand over fist (remember "Bounce Alert"?) with no care whatsoever as to what would happen should BTC experience another sharp decline like it did in the fall of 2017.
> 
> And so here we are, with $1600 80 class (yes, the "3090" and "4090" are 100% 80-class cards simply rebadged as 90-class, see: the 12GB 4080) and $900 70 class cards, priced this way to make the warehouses upon warehouses of surplus 30 series cards seem like a bargain.
> 
> And the only reason Nvidia is even releasing the 40 series now is because TSMC refused to allow Nvidia to back out of their 5nm wafer allotment.
> 
> Thankfully, those of us who can afford to do so, stand a much better chance of actually buying a 4090 at launch this time around as the demand from miners and gaming enthusiasts, if current sentiment is any indication, should be non-existent.
> 
> Everyone is waiting for RDNA3. Part of me is scratching my head as to why I'm getting a 4090 at all after having said all of this. Nvidia trying to shove Zuckerberg's Great Reset Metaverse down our throats during the reveal was nearly too much to stomach, in fact, that's precisely when I stopped watching.
> 
> Nvidia justifies their ridiculous prices with walled-garden features (Optical Flow DLSS 3.0, "only the 40 series can do it") whereas AMD offers all of their technologies for free, irrespective of the card or manufacturer (FSR, FreeSync).
> 
> Also, their 2x rasterization uplift over Ampere, remember people, we've seen this language before. In fact, they said the 3080 was 2x faster than the 2080 when in actuality it was maybe 40% faster.
> 
> Same again here, the 4090 will be roughly 60-70% faster than the 3090 in rasterization. An improvement over the gain we saw between Turing and Ampere to be sure but again, get your wallet out, you're going to pay for it.
> 
> I'm hearing top-tier RDNA3 card will be within 15% of the 4090 in rasterization, will still undoubtedly get owned in RT, may hit 4GHz, may be more efficient and will certainly be cheaper. If AMD releases a card within 15% of the 4090 for the price of a 16GB 4080 it's absolutely game over for Nvidia.
> 
> Nvidia made their bed with the crypto fiasco, we warned them this was a potential outcome, we asked them to curb the purchase quantity to miners, to make some effort to get the majority of the cards into the hands of gamers. All they saw was dollar signs. Here we are. It's humanity's main weakness, we always choose short term gain / benefit over long term stability.


I don't disagree with a lot of this, but I would push back on the "3090/4090's should be 80-series". They are both *way* closer to a full 102 chip than 980's or 1080's were (I don't remember the 2080/2080ti gap off the top of my head). In fact, weren't 980's and 1080's the 104 chip, and 980ti/1080ti's the almost-full 102 chip, and Titans the full 102 chip? You could argue the 90's should be 80ti's and I'd be on board with that.

And on RDNA3, I feel like if AMD felt they had a winner on their hands they'd be smelling the blood in the water and we'd be seeing some "leaks" about it. The fact that we aren't seems to me like it's either not competitive at the top end, or is competitive but with similar (in)efficiency. Now it is possible they'll undercut everything but 4090's in pricing with competitive (in those tiers) cards and drag Nvidia down to match them, but Nvidia would probably still hold strong on 4090 pricing in that situation. 3090's and 3080's were closer in performance than 4090's and 4080 16gb's, yet the price difference is half as wide. I think Nvidia is ready for AMD to drop a 7900XT for $1000 or less that trades punches with a 4080 16gb and drop its price accordingly, but keep the 4090 price where it is.


----------



## dr/owned

RetroWave78 said:


> This is where I'm confused, if the PSU side pinout for 8 pin PCIE is all the same and they are all going to the same GPU (in this case 12+4 micro), how can there possibly be a scenario of mismatched pins? I understand CPU, 24 pin, molex and SATA pinouts can and do differ PSU side, but 8 pin PCIE they are all the same.


Just to be clear, PSU side means the sockets on the PSU itself. GPU side is the component end that is standard ATX for everybody.

Different manufacturers can put the 12V / GND pins in any layout they want. They can make the 12V pins be top, bottom, sideways, alternating...whatever. There's no standardization on it or on the shapes of the connectors that plug into the PSU. I think EVGA is reversed from Corsair; I tested it a couple weeks ago. If you plugged the Corsair cable into the EVGA PSU you'd apply -12V (and yes, the cable fit both PSUs).


----------



## mirkendargen

dr/owned said:


> EE checking in: Yup. Your PSU either works fine...or it clicks off for whatever protection feature being triggered. Or you bought a really cheap and crappy one that makes no sense to be pairing with >$1000 GPU...(you can get a great PSU that'll last forever for a couple hundred bucks)
> 
> There is no point in pre-emptively upgrading your PSU.
> 
> Also for every time I heard "but but but tra-transients" like they're the boogieman....Steve @GN didn't even FIND any power supplies except a single NZXT SFF one that was incorrectly tripping OCP. He'd be blasting a 3 hour video out jerking himself off if he found any real evidence of the problem he wants to find. But no one seems to have watched the video and actually understood it. "oh look we used an oscilloscope zoomed down to nanosecond resolution and found power spikes!". Ok cool story bro. Let me know when that's a problem for the ton of capacitors on the PCB and in the power supply...


I think the sense pins are the biggest FUD. People talk about them like it's some high-speed communication channel where the GPU is like "Hey PSU! I got a fat load coming in, brace yourself!" and the PSU responds "Aye aye captain, I'm ready in 1ms!" and the GPU pulls a transient spike of 2x its TBP or something and it's ok because it could "warn the PSU".

...when reality is just that it's a safety mechanism so noobs can't plug a future 500W PSU straight into a 600W GPU and have the GPU trip the OCP; the PSU identifies itself as only providing 150W or 300W, whichever.


----------



## J7SC

As the Eagles put it, _'We haven't had that spirit here since 1969_'  

On the PSU front, I'm not too worried as I've run single cards w/ ~ 600W, SLI w/ ~ 800W and multi 780 Ti Classies w/ EVBot all the way to tripping the master circuit breaker. The key is obviously to run quality, high-capacity PSUs and cables.

On the AMD RDNA3 front, things could get interesting, as there are persistent rumours of multi-core versions coming out 'later', well after the launch of the 7900XT, and AMD filed some interesting patents a few years back that help make multi-core GPU chiplets appear as a single GPU to Windows and drivers, so no Crossfire/SLI issues (if it indeed works seamlessly). Whatever else, the still-monolithic-die approach by Nvidia and TSMC for the 4090 does deserve kudos - we're talking something like 76.3 billion transistors now with the 4090 compared to 28.3 billion for the 3090. But according to a recent interview by Gordon Mah Ung (PC World) with Nvidia CEO Jensen Huang, this also makes the new chip much more expensive to produce; big monolithic dies inherently have a lower yield per wafer than the smaller chiplets used in multi-chip configurations.
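The yield point can be illustrated with the classic Poisson yield model; the die areas and defect density below are illustrative assumptions for the sketch, not actual TSMC figures:

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_cm2: float) -> float:
    """Poisson yield model: fraction of defect-free dies = exp(-D0 * A)."""
    area_cm2 = die_area_mm2 / 100.0
    return math.exp(-defects_per_cm2 * area_cm2)

# Illustrative numbers only (assumed): one big monolithic die vs. a small chiplet
# at the same hypothetical defect density.
big_die, chiplet = 600.0, 150.0   # mm^2
d0 = 0.1                          # defects per cm^2

print(f"monolithic {big_die:.0f} mm^2 yield: {poisson_yield(big_die, d0):.0%}")
print(f"chiplet    {chiplet:.0f} mm^2 yield: {poisson_yield(chiplet, d0):.0%}")
```

With these made-up inputs the big die yields roughly half as many good parts per wafer as the small one, which is the economic argument for chiplets in a nutshell.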

IMO, another angle will be the increased role of special driver software and custom patches, not just raw hw horsepower. Whatever else, the next 6 - 8 months should prove 'very interesting' - and probably also very expensive - for the OC enthusiast.


----------



## Knight091

I post a vid and the topic blows up lol 😆


----------



## Krzych04650

yzonker said:


> But to support my question of the number of 12v pins, 2x8pin!! What a circus...
> 
> 
> 
> https://www.corsair.com/us/en/Categories/Products/Accessories-%7C-Parts/PC-Components/Power-Supplies/600W-PCIe-5-0-Gen-5-12VHPWR-Type-4-PSU-Power-Cable/p/CP-8920284


This is the PSU-side 8-pin socket, not PCIe 8-pin. It is the same socket that you connect your 2x8-pin cables into. Each is rated 348W @ 11.6V according to Corsair.

All this cable does is go straight from two PSU 8-pin sockets into 12VHPWR, instead of what the 4x8-pin adapters are doing, which is go from the same two PSU 8-pin sockets into two 2x8-pin PCIe cables and only then into 12VHPWR.
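A quick arithmetic check on those figures; the 30 A per-socket rating here is backed out of the numbers quoted in the post, not taken from a Corsair datasheet:

```python
# Sanity-check Corsair's quoted figure: 348 W per PSU-side 8-pin socket
# at 11.6 V (the minimum allowed 12 V rail voltage) implies a 30 A rating,
# and two such sockets comfortably cover a 600 W 12VHPWR load.
POWER_W = 348.0
MIN_RAIL_V = 11.6

per_socket_amps = POWER_W / MIN_RAIL_V   # 30 A per socket
two_sockets_w = 2 * POWER_W              # 696 W from two sockets

print(f"{per_socket_amps:.1f} A per socket, {two_sockets_w:.0f} W from two sockets")
```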


----------



## sblantipodi

full stop, problem solved.


----------



## ArcticZero

sblantipodi said:


> View attachment 2573238
> 
> 
> full stop, problem solved.


Really wish Seasonic would release an official standalone cable as well for existing SKUs.


----------



## RetroWave78

mirkendargen said:


> I think the sense pins are the biggest FUD. People talk about them like it's some high speed communication channel where the GPU is like "Hey PSU! I got a fat load coming in, brace yourself!" and the PSU responds "Aye aye captain, I'm ready in 1ms!" and the GPU pulls a transient spike of 2x its TBP or something and it's ok because it could "warn the PSU".
> 
> ...when reality is just that it's a safety mechanism so noobs can't plug a future 500W PSU straight into a 600W GPU and have the GPU trip the OCP; the PSU identifies itself as only providing 150w or 300w, whichever.


Is it not correct that in the absence of the sense pins an 8 pin cable will be limited, either by the GPU or PSU, to 150w?


----------



## RetroWave78

mirkendargen said:


> I don't disagree with a lot of this, but I would push back on the "3090/4090's should be 80-series". They are both *way* closer to a full 102 chip than 980's or 1080's were (I don't remember the 2080/2080ti gap off the top of my head). In fact, weren't 980's and 1080's the 104 chip, and 980ti/1080ti's the almost-full 102 chip, and Titans the full 102 chip? You could argue the 90's should be 80ti's and I'd be on board with that.
> 
> And on RDNA3, I feel like if AMD felt like they had a winner on their hands they'd be smelling the blood in the water and we'd be seeing some "leaks" about it. The fact that we aren't seems to me like it's either not competitive at the top end, or is competitive but with similar (in)efficiency. Now it is possible they'll undercut everything but 4090's in pricing with competitive (in those tiers) cards and drag Nvidia down to match them, but Nvidia would probably still hold strong in 4090 pricing in that situation. 3090's and 3080's were closer in performance than 4090's and 4080 16gb's, yet the price difference is half as wide. I think Nvidia is ready for AMD to drop a 7900XT for $1000 or less that trades punches with a 4080 16gb and drop it's price accordingly, but keep the 4090 price where it is.


Great points, yes, but how do we know that Nvidia isn't also engaging in shenanigans with their GPU naming scheme as well? Now suddenly the 102 is the 90-class card and not the 80 Ti? And then a "90 Ti" comes along later and offers the full-fat die? It could be that the 90 has replaced the Titan? (But not really, as Not An Apple Fan points out.) Not sure.

Great observation on AMD as well, there is definitely blood in the water, this is their signal to come in for the kill or... is this more tactical prudence on their part? Notice the gaping hole in between the 16GB 80 card and the 90? If you don't think NV already has the card that fills that hole built and ready to go, I have some beachfront property in Arizona for sale.

If AMD offers a card 15% slower than the 4090 for $999 it's absolutely Nvidia's Intel moment. This would explain the performance gap between the 16GB 80 and the 90. Then NV's only selling point will be RT, which isn't widely adopted.

Also DLSS 3.0 40 series exclusivity means that developers won't be implementing it, unlike FSR, which every card supports.

Nvidia's 4080 12GB naming shenanigan casts doubt on the integrity of the entire reveal. Every, and I do mean every, tech-tuber and commentator not paid off by NV (Digital Foundry, Tom's "just buy it" Hardware) has mentioned it.

This is a harmful consumer practice. Imagine how many unwitting kids building a PC for the 1st time will get duped by this. Like this is totally not cool.

And they had to do this because a $900 70-class card looks like it sounds: nearly double the 3070's $499 launch price.

And the prices are this high because NV wants to clear 30 series inventory. They want you and me to pay for their misguided crypto greed-lust-bonanza shortsightedness. They CAN and ought to take the hit for their blunder to restore goodwill among the PC gaming community. But instead they've elected to give us the middle finger.

Also, I'm not very excited for DLSS 3.0. It looks like Smooth Video Project frame interpolation, which I'm a fan of BUT is plagued with artifacts.

It's like NV was out of ideas and needed to justify the out of touch with economic reality prices.

Did I mention we are entering the Greatest Depression and potentially WW3?

Basically no one in Europe can afford these, nor wants a 400W GPU to begin with. So this is a product primarily for domestic consumption in the US.


----------



## mirkendargen

RetroWave78 said:


> Is it not correct that in the absence of the sense pins an 8 pin cable will be limited, either by the GPU or PSU, to 150w?


That is correct, but the adapters do have the sense pins grounded appropriately. I'm not even sure one of the old 12-pin plugs would fit/is keyed properly, since that wasn't actually any kind of standard. Also, looking at the spec for the other 2 pins of the 4 sideband ones, one of them is "cable presence", so a 12-pin cable, even if it fit, wouldn't work properly because the card would think no cable is plugged in. Picture of the sense spec; both pins open = max 150W:



> Great points, yes but how do we know that Nvidia isn't also engaging in shenanigans with their GPU naming scheme as well? Now suddenly 102 is the 90 class card and not the 80 Ti? And then "90 Ti" comes along later and offers the full fat die? It could be that 90 has replaced Titan? Not sure.
> 
> Great observation on AMD as well, there is definitely blood in the water, this is their signal to come in for the kill or... is this more tactical prudence on their part? Notice the gaping hole in between the 16GB 80 card and the 90? If you don't think NV already has the card that fills that hole built and ready to go, I have some beachfront property in Arizona for sale.


Yeah, there could totally be a slightly more cut-down 102 card destined to be a 4080 Ti at $1000-$1200, with the 4080s bumped down in price accordingly if needed to counter AMD. I don't see a situation where 4090s drop in price, though, unless it's literally just a matter of overstocking and not enough demand.


----------



## Knight091

RetroWave78 said:


> Great points, yes but how do we know that Nvidia isn't also engaging in shenanigans with their GPU naming scheme as well? Now suddenly 102 is the 90 class card and not the 80 Ti? And then "90 Ti" comes along later and offers the full fat die? It could be that 90 has replaced Titan? (But not really as Not An Apple Fan points out) Not sure.
> 
> Great observation on AMD as well, there is definitely blood in the water, this is their signal to come in for the kill or.....is this more tactical prudence on their part? Notice the gaping hole in between the 16GB 80 card and the 90? If you don't think NV already have the card that fills that hole already built and ready to go I have some beachfront property in Arizona for sale.
> 
> If AMD offers a card 15% slower than the 4090 for $999 it's absolutely Nvidia's Intel moment. This would explain the performance gap between the 16GB 80 and the 90. Then NV's only selling point will be RT, which isn't widely adopted.
> 
> Also DLSS 3.0 40 series exclusivity means that developers won't be implementing it, unlike FSR, which every card supports.
> 
> Nvidia's 4080 12GB naming shenanigan casts doubt on the integrity of the entire reveal. Every, and I do mean every, tech-tuber and commentator not paid off by NV (Digital Foundry, Tom's "just buy it" Hardware) has mentioned it.
> 
> This is a harmful consumer practice. Imagine how many unwitting kids building a PC for the 1st time will get duped by this. Like this is totally not cool.
> 
> And they had to do this because a $900 70 card looks like it sounds, a 200% price increase.
> 
> And the prices are this high because NV wants to clear 30 series inventory. They want you and I to pay for their misguided crypto greed lust bonanza shortsightedness. They CAN and ought to take the hit for their blunder to restore goodwill among the PC gaming community. But instead they've elected to give us the middle finger.
> 
> Also, I'm not very excited for DLSS 3.0. It looks like Smooth Video Project frame interpolation, which I'm a fan of BUT is plagued with artifacts.
> 
> It's like NV was out of ideas and needed to justify the out of touch with economic reality prices.
> 
> Did I mention we are entering the Greatest Depression and potentially WW3?
> 
> Basically no one in Europe can afford these, nor wants a 400W GPU to begin with. So this is a product primarily for domestic consumption in the US.


I have a feeling AMD is coming with lower-priced cards as good as or better than what Nvidia has. Nvidia got wind of it and rushed the 4090 out. You have two 4080 cards when the lower 4080 should be a 4070.


----------



## Glottis

mirkendargen said:


> That video makes no sense and is complete fear mongering click bait.
> 
> 1. If the problem is the number of insertions the 16pin connector can survive, how the hell does a new PSU help? It's the same connector.
> 
> 2. He goes on and on about the 16 pin connector insertions, then is like "oh Nvidia said the problem was actually on the PSU side of the adapter" .....k?
> 
> 3. He talks about the sense pins missing. They're missing cause the adapter in his hand isn't actually a 12VHPWR adapter, the actual ones have the sense pins.
> 
> 4. He talks like the sense pins are some sort of constant 2 way negotiation with the GPU requesting power, the PSU saying what is available, etc. Then shows *on the screen* what the spec of the sense pins actually is. It's just a constant ground/neutral combo on the pins to let the GPU know what the PSU supports. No reason at all an adapter can't pass through the ground to the appropriate sense pins based on the number of 8pins connected to it.
> 
> 5. He says your GPU might pull <ridiculous large current> at boot up and fry your PSU. Huh? Then gives a hypothetical that your PSU has 2+ rails and you plug all the plugs to the adapter into the same one and bad things happen. What if you plug all the 8pins to your current GPU into that same rail....? What is suddenly different now?
> 
> At the end of the day, you either have a PSU that can handle a potentially 600w GPU or you don't, that's what actually matters. What plugs come out of it being some giant concern is just FUD.


My thoughts exactly when I saw this vid. He even admits he doesn't know much about PSUs and then continues to ramble for 16 minutes spreading misinformation and FUD. He didn't even see that Corsair had already released a statement almost 2 days before his video, confirming that their existing higher-wattage PSUs are fully compatible with the 4000 series, and had released a 12VHPWR cable with the sense wires configured for 600W. What a dummy...


----------



## Benni231990

Holy **** is this complicated with the new 16pin connectors 🙈

We need simple answers: can we use a 3x8-pin adapter from the cheaper cards (maybe the Trinity or TUF), flash the 600W Strix BIOS, and then draw 600W? Or only 450W because of the 3x8-pin adapter? Can we only get 600W with a 4x8-pin adapter? Or are we limited to 450W no matter which adapter we use, because we don't have the sense lines on an "old" PSU?


----------



## mirkendargen

Knight091 said:


> I have a feeling AMD is coming with lower priced cards as good or better than what Nvidia has. Nvidia got wind and ran and rushed this with the 4090. You have two 4080 cards when the lower 4080 should be a 4070.


There would have been some "leaks" by now to rain on Nvidia's party if AMD were confident they had something definitively better than a 4090. I highly doubt that's gonna happen.


----------



## LunaP

sblantipodi said:


> View attachment 2573238
> 
> 
> full stop, problem solved.


Yeah, buying this as soon as I see one go up somewhere; so far no links, just references to retailers. On a 1600i atm.

Looking to go w/ the Founders edition + Aqua blocks for now, unless more copper/chromium blocks get announced.


----------



## bmagnien

LunaP said:


> Yeah buying this soon as I see one go up somewhere, so far no links just references to retailers. On a 1600i atm.
> 
> Looking to go w/ the Founders edition + Aqua blocks for now, unless more copper/chromium blocks get announced.


Aqua blocks are currently only 'Reference' design, and I'm having trouble confirming with them whether the FE is reference (I have emails in and posts on their forum). Some are saying it isn't. EKWB is the only block available for order that's confirmed working for the FE.


----------



## Netarangi

sblantipodi said:


> View attachment 2573238
> 
> 
> full stop, problem solved.


Would you just need to buy one?


----------



## mirkendargen

Netarangi said:


> Would you just need to buy one?


All the cards announced so far have a single 12VHPWR connector, so yes, just one. There might be cards down the road (HOF, etc.) that have 2.


----------



## menko2

Quick question.

Will the RTX 4090 work in a Z590 with a 11900k which is PciE 4.0?


----------



## dr/owned

Benni231990 said:


> Holy **** is this complicated with the new 16pin connectors 🙈
> 
> We need simple answers can we use a 3x8pin adapter from the cheap cards (maybe trinity oder Tuf) and flash the 600watt Strix bios then we can use 600 watt or only 450watt because of the 3x8pin adapter? or we can only afterwards using a 4x8pin adapter for 600 watt? or we only can use 450watt what ever adapter we use because we dont have the data lines on the "old" Psu?


Corsair is saying that 2x8 pins will support 600W, meaning their adapter is going to ground the 2 sense pins. 2x8 pin has no reason it shouldn't be capable of supporting 600W, because it has the same number of 12V pins as a 12VHPWR connector. CableMod is making a 3x8 pin adapter and will also support 600W. I suspect the adapters that everyone includes in the box of their GPU are going to be 600W and the BIOS is going to limit to 450 or 475 or whatever they want.
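The "same number of 12V pins" point reduces to simple arithmetic. A sketch assuming the commonly cited ~9 A rating for this style of crimp terminal; actual ratings depend on the specific terminal series and wire gauge:

```python
# Current per 12 V conductor for a 600 W draw: both a 2x8-pin PCIe harness
# and a 12VHPWR connector carry six 12 V pins.
POWER_W = 600
RAIL_V = 12
N_12V_PINS = 6

total_amps = POWER_W / RAIL_V          # 50 A total
amps_per_pin = total_amps / N_12V_PINS # ~8.3 A per pin, under a ~9 A rating

print(f"{total_amps:.0f} A total, {amps_per_pin:.2f} A per 12 V pin")
```

Which is why the pin count, not the number of 8-pin shells on the PSU end, is what actually bounds the power.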



bmagnien said:


> Aqua blocks are currently only’Reference’ design and I’m having trouble confirming with them if FE is reference (have emails in and posts on their forum). Some saying they aren’t. EKWB only block available for order that’s confirmed working for FE.


I emailed them a few days ago: reference is not FE. Like the 3000 series there was a "reference" PCB that was used in like the base model cards of various brands (Zotac Trinity for example). AFAIK we won't know which cards are reference PCB until reviews come out. Some brands like EVGA didn't use the reference board in any of their cards...although some like the XC3 were very close to reference and the FTW3 was reference with a couple extra power stages added on the top of the PCB.


----------



## Benni231990

dr/owned said:


> Corsair is saying that 2x8 pins will support 600W, meaning their adapter is going to ground the 2 sense pins. 2x8 pin has no reason it shouldn't be capable of supporting 600W, because it has the same number of 12V pins as a 12VHPWR connector. CableMod is making a 3x8 pin adapter and will also support 600W. I suspect the adapters that everyone includes in the box of their GPU are going to be 600W and the BIOS is going to limit to 450 or 475 or whatever they want.


This sounds very good thanks a lot


----------



## jtclfo

dr/owned said:


> I emailed them a few days ago: reference is not FE. Like the 3000 series there was a "reference" PCB that was used in like the base model cards of various brands (Zotac Trinity for example). AFAIK we won't know which cards are reference PCB until reviews come out. Some brands like EVGA didn't use the reference board in any of their cards...although some like the XC3 were very close to reference and the FTW3 was reference with a couple extra power stages added on the top of the PCB.


The way all the cards look, I wouldn't be surprised if they are all essentially reference. Custom seems to be completely gone. Anecdotally, the board layouts based on pictures and videos appear to be the same for every released "custom" card so far. But yes, reviews and teardowns will definitely explain things better, and I'm dying to see what's going on, because right now this is an extremely bland launch, in my opinion.


----------



## yzonker

dr/owned said:


> Corsair is saying that 2x8 pins will support 600W, meaning their adapter is going to ground the 2 sense pins. 2x8 pin has no reason it shouldn't be capable of supporting 600W, because it has the same number of 12V pins as a 12VHPWR connector. CableMod is making a 3x8 pin adapter and will also support 600W. I suspect the adapters that everyone includes in the box of their GPU are going to be 600W and the BIOS is going to limit to 450 or 475 or whatever they want.
> 
> 
> 
> I emailed them a few days ago: reference is not FE. Like the 3000 series there was a "reference" PCB that was used in like the base model cards of various brands (Zotac Trinity for example). AFAIK we won't know which cards are reference PCB until reviews come out. Some brands like EVGA didn't use the reference board in any of their cards...although some like the XC3 were very close to reference and the FTW3 was reference with a couple extra power stages added on the top of the PCB.


FTW3 was nothing resembling the reference PCB. I have both a Zotac 3090 Trinity (reference) and a 3080ti FTW3. Both in water blocks. Completely different.


----------



## Blameless

dr/owned said:


> 2x8 pin has no reason it shouldn't be capable of supporting 600W, because it has the same number of 12V pins as a 12VHPWR connector.


All other things being equal, six +12v wires plus eight or ten ground wires is better than six of each.



yzonker said:


> FTW3 was nothing resembling the reference PCB.


This is part of the reason why EVGA wasn't making any money.

Aside from a handful of limited-release enthusiast products in halo price tiers, I expect every non-reference board of the 4000 series to be _worse_ than reference.


----------



## dr/owned

yzonker said:


> FTW3 was nothing resembling the reference PCB. I have both a Zotac 3090 Trinity (reference) and a 3080ti FTW3. Both in water blocks. Completely different.


I mean electrically, not physically. The FTW3 was worse than the TUF and the Strix was about comparable to the KPE. The FE was a better design but didn't have the BIOS support that EVGA put out to enable 1000W. And the FE was stupid if you wanted to shunt mod it because it had 12 shunts or something.


----------



## mirkendargen

Yeah, one thing I'm wary about with FEs is the BIOS. No other BIOS worked on them last time around, though that may be different now that they're using the same power connector as everyone else. It's also possible that the FE really can go to 600w as reported, and if no cards go above 660w, that extra 10% is most likely going to be a wash anyway.


----------



## Dogzilla07

There's too much partial information taken out of context about the cable/adapter issues:









No, 12VHPWR Doesn't Have A Low Rating - Cultists Network
Based off reporting from Videocardz, many have gotten concerned over the lifespan of included 12VHPWR adapters of RTX 4000 series.
(cultists.network)

Melting 12VHPWR Connectors - The Real Story - Cultists Network
WCCFTECH stated that PCI-SIG sent an email warning against using non-ATX 3.0 PSUs and 8-pin to 12VHPWR adapters. Except, this was made up...
(cultists.network)


----------



## J7SC

mirkendargen said:


> Yeah one thing I'm wary about with FE's is the BIOS. No other BIOS worked on them the last time around, although that may be different now that they're using the same power connection as everyone else. It's also possible that FE's really can go to 600w as reported and if no cards go above 660w that extra 10% is gonna be a wash anyway most likely.


...in the 3090 thread, there must be 100+ posts of folks with the FE looking for alternate vbios, to no avail. Meanwhile, my 3090 Strix (3x 8 pin) as well as many other custom PCB cards have a ton of vbios options in various flavours...I have 4x 1kw XOC downloaded and to choose from (but only really 'needed' one 😋...). I will probably go for Asus Strix or something similar again w/ the RTX 4K.

...then there's Galax HoF - who knows what they come up with, though the dual 12/16-pin 3090 Ti PCB below apparently was a testbed for RTX 4K HoF on the VRM...time will tell.


----------



## Hulk1988

Hey Threadstarter, 

any chance of updating the first page? There's already a lot of information out on the internet. If you don't have time, can someone take it over? Thanks


----------



## Glottis

dr/owned said:


> Corsair is saying that 2x8 pins will support 600W, meaning their adapter is going to ground the 2 sense pins. 2x8 pin has no reason it shouldn't be capable of supporting 600W, because it has the same number of 12V pins as a 12VHPWR connector. CableMod is making a 3x8 pin adapter and will also support 600W. I suspect the adapters that everyone includes in the box of their GPU are going to be 600W and the BIOS is going to limit to 450 or 475 or whatever they want.


We should keep terminology consistent to avoid even more confusion.

This is a PSU power cable:








And this is an adapter:


----------



## yzonker

J7SC said:


> ...in the 3090 thread, there must be 100+ posts of folks with the FE looking for alternate vbios, to no avail. Meanwhile, my 3090 Strix (3x 8 pin) as well as many other custom PCB cards have a ton of vbios options in various flavours...I have 4x 1kw XOC downloaded and to choose from (but only really 'needed' one 😋...). I will probably go for Asus Strix or something similar again w/ the RTX 4K.
> 
> ...then there's Galax HoF - who knows what they come up with, though the dual 12/16-pin 3090 Ti PCB below apparently was a testbed for RTX 4K HoF on the VRM...time will tell.
> View attachment 2573301


Four 1kw bios? Galax, KP, Asus, ?? I guess I've forgotten the 4th.

I've been going through a similar thought process though and it takes me back to the Strix each time. We don't even know for certain that cross flashing will be possible this time around. I guess I'd say _probably_, but no way to know until people have had cards in hand for a little while. 

I keep fighting the urge to buy at launch. The sensible thing is to wait for AMD and wait to see which 4090's look like the best deal. The market may help me out by selling out the Strix instantly anyway.


----------



## mirkendargen

Not to beat a hopefully-dead-horse, but I finally found the specs on PCIE 8pin connectors.

Guess how many insertions they're rated for?

Yup, 30.



https://www.molex.com/molex/products/part-detail/pcb_headers/0455860005


----------



## bmagnien

What's the process for determining whether a given board is 'reference'? Alphacool staff on their forums claim the 4090 block they're selling is reference, that no cards have currently been deemed compatible, and that there is no guarantee any future cards will be compatible either. I find it hard to believe that a company would produce a block without knowing whether it would be usable, but that's their current stance.


----------



## dr/owned

Dogzilla07 said:


> There's too much partial information taken out of context about the cable/adapter issues:
> 
> No, 12VHPWR Doesn't Have A Low Rating - Cultists Network
> 
> Melting 12VHPWR Connectors - The Real Story - Cultists Network


----------



## ALSTER868

menko2 said:


> Quick question.
> 
> Will the RTX 4090 work in a Z590 with a 11900k which is PciE 4.0?


You're safe on that









NVIDIA's RTX 4080/4090 to Lack PCIe Gen 5 Support, PCIe 4.0 Instead | Hardware Times
NVIDIA's next-gen GeForce RTX 40 series graphics cards won't support next-gen PCIe Gen 5. The Ada Lovelace GPUs will retain the existing Gen 4 interface used in the contemporary 30 series parts. This is unexpected as the RTX 4080/4090 will adopt the 12VHPWR 16-pin power connector...
(www.hardwaretimes.com)


----------



## Blameless

I might have to go with the FE. I really don't like FE cards, both due to the cooler and firmware, but it's the only announced version that is short enough to fit (and there is no convenient or cost effective way to squeeze an acceptable water setup in this SFF case). If I do this I'll probably run a duct from the front intake fan directly to the fan on my CPU heatsink, as it's nearly a straight shot with no obstruction. This should keep the CPU from ingesting heated air from the GPU.


----------



## ArcticZero

I actually wish FE cards were more readily available in other countries. The only way to get one here is to import it from the US and pay an exorbitant amount of taxes.

Granted in my case, I am holding off until I see the bare PCBs themselves before picking an SKU since it's going in a loop. Agreed on the firmware issue as well.


----------



## dr/owned

NowInStock has a 4090 page now: NVIDIA RTX 4090 Series In Stock Tracker - NowInStock.net

Allegedly there was a preorder for an hour but who knows if it was real.

Sep 26 - 4:33 PM EST | Newegg: MSI RTX 4090 GAMING TRIO 24G | Out of Stock
Sep 26 - 3:16 PM EST | Newegg: MSI RTX 4090 GAMING TRIO 24G | Preorder for $1,599.00

I need either Best Buy or Amazon....


----------



## x7007

The only thing pushing me to move on from my 6900 XT at the moment is the stupid VRR situation, which doesn't work with my setup. I have two of the most expensive HDMI splitters, an HDFury VRROOM and another brand's 4K/8K matrix; they support everything except FreeSync, because it's proprietary and can't be passed through, and that still hasn't been fixed. I need the splitter because it's annoying as hell to switch cables between my TV and projector. I have a switch now, but I still need to swap cables; I had a bi-directional switch, but it died and couldn't do 4K120 anymore. So I'm stuck. I also need more performance than most people, because I play in 3D using ReShade and SuperDepth3D, which takes more performance than typical use.


----------



## ttnuagmada

So, I'm aiming to get an FE, but I'm still uncertain of how to even get one once they launch. Is it going to be BB only? If so, is BB doing in-store only?


----------



## Falkentyne

mirkendargen said:


> Yeah one thing I'm wary about with FE's is the BIOS. No other BIOS worked on them the last time around, although that may be different now that they're using the same power connection as everyone else. It's also possible that FE's really can go to 600w as reported and if no cards go above 660w that extra 10% is gonna be a wash anyway most likely.


The FE still doesn't have a standard BIOS chip. Elmor looked at the part number, and the "chip" is more like some sort of Chinese EEPROM (at least we found the factory that makes them) rather than a standard flash chip in the normal SOIC8 package. So there's no point buying an FE instead of a reference card except price.


----------



## J7SC

Falkentyne said:


> The FE still doesn't have a standard BIOS chip. Elmor looked at the part number and the "chip" is more like some sort of chinese eeprom chip (at least we found the factory that makes them) rather than a standard flash chip in the normal SOIC8 package. So no point buying a FE instead of a reference card except price.


...I'll probably go for an Asus Strix again (though I've also had good experiences w/ top-end Gigabyte). Anyhow, I tend to choose GPUs based on the PCB and the 'typical availability' of water blocks in previous gens. I really wish more were known about the release date of the 4090 Ti, and for that matter AMD's top offerings...


Spoiler


----------



## bmagnien

The only water blocks available at this point are the reference Alphacool (no confirmed compatible cards yet) and the EKWB (FE only). Optimus is producing a Strix block, but it's still in early development (they're taking suggestions for port placement on Twitter), and given their history we might see the 5090 first. Optimus also confirmed the Strix is not reference.

While we know that the 2, 3, or 4 PCIe 8-pins feeding a 12VHPWR 16-pin connector via an adapter can carry more than 600w, do we know whether the 16-pin connector and the accompanying card interface can actually receive and transmit more than 600w to the card? If they can't, and all the cards announced so far have only one 16-pin, the upper bound may conceivably be 675w regardless of BIOS. The FE is already confirmed to go up to 600w, so I'm wondering if the ability to flash a BIOS is a non-issue this time around.
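The ceiling being reasoned about here is simple arithmetic. A sketch, assuming the 600w connector limit holds and the full 75w PCIe slot budget is usable:

```python
# Hypothetical board-power ceiling for a single-12VHPWR card, per the
# reasoning above: the connector's 600W limit plus up to 75W from the slot.
# These are assumed limits, not measured card behavior.

CONNECTOR_LIMIT_W = 600   # max a 16-pin can advertise via its sense pins
SLOT_LIMIT_W = 75         # PCIe CEM slot power budget

def max_board_power(n_16pin=1):
    """Upper-bound board power for a card with n_16pin 16-pin connectors."""
    return n_16pin * CONNECTOR_LIMIT_W + SLOT_LIMIT_W

print(max_board_power())    # 675  -> single-connector ceiling
print(max_board_power(2))   # 1275 -> a dual-connector (HOF/Ti-style) card
```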


----------



## savormix

J7SC said:


> ...I probably go for an Asus Strix again (though also have good experiences w/ top-end Gigabyte). Anyhow, I tend to choose GPUs based on PCB and 'typical availability' of water blocks per previous gens. I really wish though more would be known about the release date of the 4090 Ti, and for that matter AMD's top offerings...


Based on the current trend (since 20 series), it would seem that NVIDIA's perceived best interest is to stall the 4090 Ti release for as long as possible (so Q4 or Q3 2023), unless AMD actually shows off a card that can directly match or top the 4090, of course.


----------



## J7SC

bmagnien said:


> Only water blocks available at this point are Reference Alphacool (no confirmed cards compatible yet) and EKWB (FE only). Optimus is producing a Strix block but still in early development (taking suggestions for port placement on Twitter) and with their history might see the 5090 first. Optimus also confirmed the Strix is not Reference.
> 
> While we know that the 2, 3, or 4 pcie8pins feeding a 12VHPWR 16pin connector via an adapter can carry more than 600w to the card, do we know if the 16pin connector and accompanying card interface can actually receive and transmit more than 600w to the card? If it can’t, and all the cards thus far announced only have one 16pin, the upper bounds may conceivably be 675w regardless of bios. FE is already confirmed to go up to 600w, so wondering if the ability to flash bios is a nonissue this time around.


On water-blocks, I bought a Phanteks block for my 3090 Strix and love its performance and quality (after dumping the EK block for the Strix due to ...reasons); hopefully, Phanteks also offers one for the RTX4K Strix family. On VRMs handling > 600W, during the 'infomercial' release stream by Asus for the 4090s, they strongly hinted that the dual vbios Strix is set up for additional OC headroom - time will tell.

On PSUs, I already posted here that just one (of two) Antec 1300W Platinums I kept from my HWBot days has been handling 760W max (2x w-cooled 2080 Ti) w/o issues or OCP since late 2018 via 4x 8 pin PCIe. Just in case, the second one (which is slumbering now on pasture) could be added via Antec's proprietary OC Link (spoiler) for a cool 2600W / 217 A and 20x PCIe.


Spoiler

















savormix said:


> Based on the current trend (since 20 series), it would seem that NVIDIA's perceived best interest is to stall the 4090 Ti release for as long as possible (so Q4 or Q3 2023), unless AMD actually shows off a card that can directly match or top the 4090, of course.


...fortunately, I don't have to jump into the 4090 / AMD 7900XT/X (X = dual core mGPU?) fray for a while yet, but point well taken...I just wish Intel would be much further ahead with its GPUs and able to move upmarket so that there would be a real fight between three chip families for the top spots and thus quicker release dates and (relatively) lower prices.


----------



## Falkentyne

savormix said:


> Based on the current trend (since 20 series), it would seem that NVIDIA's perceived best interest is to stall the 4090 Ti release for as long as possible (so Q4 or Q3 2023), unless AMD actually shows off a card that can directly match or top the 4090, of course.


if I were Jensen, I would just release the 4090 Ti on day one for $2500, say "let them eat cake", then sit back and watch the carnage, because you know it would sell out instantly.


----------



## J7SC

If you were Jensen, you would have a really nice kitchen, and we would have...


Spoiler


----------



## mirkendargen

Falkentyne said:


> if I were Jensen, I would just release the 4090 Ti on day one for $2500 and say "let them eat cake, then sit back and watch the carnage because you know it will sell out instantly.


But then you don't get to sell people a 4090, and then a 4090ti a year later!


----------



## RetroWave78

Falkentyne said:


> The FE still doesn't have a standard BIOS chip. Elmor looked at the part number and the "chip" is more like some sort of chinese eeprom chip (at least we found the factory that makes them) rather than a standard flash chip in the normal SOIC8 package. So no point buying a FE instead of a reference card except price.


Yes, but the FE will have 600w on tap (450w via 3x8-pin PCIe and "a fourth optional 8 pin for overclocking", per Nvidia's own release information). The 3090 FE's only real shortcoming was being limited to 400w with no way around it.


----------



## bmagnien

RetroWave78 said:


> Yes but FE will have 600w on tap (450w via 3x8 pin PCIE and "a fourth optional 8 pin for overclocking", per Nvidia's own release information), 3090 FE's only real shortcoming was it being limited to 400w and no way around it.


This was my rationale with my earlier post. Only way I see of getting burned with FE is if other boards allow more than 600w through their single 600w port via a bios upgrade that FE won’t have access to. I’m not technical enough to know if that’s even possible though.


----------



## RetroWave78

J7SC said:


> On water-blocks, I bought a Phanteks block for my 3090 Strix and love its performance and quality (after dumping the EK block for the Strix due to ...reasons); hopefully, Phanteks also offers one for the RTX4K Strix family. On VRMs handling > 600W, during the 'infomercial' release stream by Asus for the 4090s, they strongly hinted that the dual vbios Strix is set up for additional OC headroom - time will tell.
> 
> On PSUs, I already posted here that just one (of two) Antec 1300W Platinums I kept from my HWBot days has been handling 760W max (2x w-cooled 2080 Ti) w/o issues or OCP since late 2018 via 4x 8 pin PCIe. Just in case, the second one (which is slumbering now on pasture) could be added via Antec's proprietary OC Link (spoiler) for a cool 2600W / 217 A and 20x PCIe.
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2573718
> 
> 
> 
> 
> 
> 
> ...fortunately, I don't have to jump into the 4090 / AMD 7900XT/X (X = dual core mGPU?) fray for a while yet, but point well taken...I just wish Intel would be much further ahead with its GPUs and able to move upmarket so that there would be a real fight between three chip families for the top spots and thus quicker release dates and (relatively) lower prices.


I have a Phanteks block for my 2080 Ti. The nickel quality is better than EK's, but the fin array isn't as densely packed, so there is less surface area and higher temps than I get from EK on both the card that preceded it (1080 Ti: EK) and the 3090 FE. I see around 43-45C load with both the 1080 Ti and 3090, and 46-49C with the 2080 Ti, even though TU102 has more contact area with the block owing to its larger die.

That said, the nickel plating on my $400 FE block from EK started coming off around the 1-year mark.


----------



## RetroWave78

bmagnien said:


> This was my rationale with my earlier post. Only way I see of getting burned with FE is if other boards allow more than 600w through their single 600w port via a bios upgrade that FE won’t have access to. I’m not technical enough to know if that’s even possible though.


You're not going to get burned; 600W is beyond ludicrous. If Ampere is any indication (AD seems like 4nm Ampere, architecturally speaking), the cards were clocked well beyond the power/efficiency curve right from the factory (the GA102 sweet spot is 1850-1900 MHz @ 0.875v @ 250w), with diminishing returns when overclocked, i.e. another 5% to be had over the aforementioned clocks and power draw but requiring 25% more power, i.e. Kingpin, Strix OC, 2200 MHz requiring 400-500w to hold.

I haven't been posting much about this, because I want others to go after the AIB cards so I have a better chance of getting an FE, but the FE will be the card to get this time around for many reasons:

1. Air cooler design and quality, while you wait for your water block or if you intend to stay on air.
2. Nvidia will likely use higher-binned dies for the FE and give AIBs the leftovers. This was the case with Ampere.
3. Water block availability.
4. Nvidia has demonstrated that it can mitigate power delivery spikes; AIBs will not have this ability unless they develop something of their own.
5. Cost. AIBs will command $100-200 over the FE and arrive with an inferior cooler.

If AD102 were limited to 400w I would say go AIB, but this is no longer the case.

600w is beyond ridiculous; I can't see anyone using it day to day outside of extreme niche situations (VR, a Pimax 12KX) under massive amounts of water.

Lovelace will likely be clocked right at or a bit past the power/efficiency curve, just like Ampere was.

It will probably look like this:

AD102 will do 2700 MHz @ 450w (already demonstrated; 2850 MHz, actually). 3 GHz will be possible but will require all 600w available, and it will still dip down to 2900 MHz due to power starvation.

300 MHz = 7.5% performance gain.
150w = 33% additional power.

Beyond this it gets worse, i.e. AIBs able to do 3.1-3.2 GHz but requiring 675w+.

Just as with Ampere, Lovelace will most likely see the biggest gains via undervolting, i.e. 2500 MHz solid everywhere @ 350w.
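The diminishing-returns arithmetic above can be sketched in a few lines (the 7.5% perf figure is the poster's estimate, not a measurement):

```python
# Sketch of the 2700 MHz @ 450W -> 3000 MHz @ 600W trade-off described above.

def pct_gain(new, old):
    """Percentage increase going from old to new."""
    return (new - old) / old * 100

clock_gain = pct_gain(3000, 2700)   # ~11.1% more clock
perf_gain = 7.5                     # claimed perf gain (scales below clocks)
power_cost = pct_gain(600, 450)     # ~33.3% more power

print(f"clock +{clock_gain:.1f}%, perf +{perf_gain}%, power +{power_cost:.1f}%")

# Relative perf-per-watt of the overclocked point vs. stock:
ratio = (1 + perf_gain / 100) / (1 + power_cost / 100)
print(f"perf/W ratio vs stock: {ratio:.2f}")  # well below 1.0 = less efficient
```

In other words, the overclocked operating point delivers roughly a fifth less performance per watt than stock, which is the whole argument for undervolting instead.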

If you want to decrease your chances of getting an FE, start spreading this all over the forums.

Let people go for the Strix.

If I can't get FE I may even wait for stock. I already have EKWB FE block on pre-order. 

RemindMe Bot 30 days.


----------



## ArcticZero

Learned my lesson with EKWB on the 3090. Definitely doing my best to get another brand this time, since I actually want my nickel plating to stay intact. This will also factor into my choice of AIB, I suppose.


----------



## RetroWave78

ArcticZero said:


> Learned my lesson with EKWB on the 3090. Definitely doing my best to get another brand this time since I actually want my nickel plating to stay intact this time. This will also factor into my choice of AIB, I suppose.


InnoVISION Multimedia Limited 

There are two 4090's with water blocks, this is one of them, both from InnoChill. 

I don't know of anyone else offering a block right out of the gate other than FE. 

Nickel deterioration isn't the end of the world, to be honest. What sucks is the block costing $400 at a price tag justified by vaunted quality control and CNC-machining marketing buzzwords from EKWB.

I'm absolutely not an EKWB fanboy, to be sure. They've been slipping BIG TIME. Their fans straight up suck: if I buy three 120mm Vardar fans for a radiator, even their latest ones, one of the three will be dead within 6 months, 100% guaranteed. I don't have this problem with any other fan manufacturer. Don't even get me started on how their Mystic Fog coolant fell out of suspension within 72 hours and accumulated in the FE block, which then exhibited the nickel plating erosion in the exact same spot.

I used only distilled water up until trying Mystic Fog, and then I went right back to it. It's a widespread problem that EK denies. I submitted multiple tickets, and EKWB sent me a cheap "EK" polyester beanie cap in the mail as some form of attempted recompense, which just added insult to injury.

They aren't getting my money because of quality; it's literally that no one else is offering a block for the FE as quickly (early November), and that's it.

What's insane is that unlike the massively thick 3090 "Special Edition" block, their 4090 block is half the thickness with traditional port placement, basically a smaller version of their 1080 Ti block from 6 years ago that went for $130, but now they feel it's appropriate to price it at $280.


----------



## mirkendargen

Bykski N-ST4090TQ-X-V2 GPU BLOCK ZOTAC GAMING RTX4090
(Bykski custom water-cooling official site, www.bykski.com)





Bykski has a Zotac block of all things, I'm sure an FE block is coming. They had the first 3080/90 FE block if I recall, and make blocks for literally everything, even A100 compute cards.

And the Strix is up on Newegg for $1999, oof.


----------



## yzonker

Strix looks to be the most expensive so far. Doh.


----------



## yzonker

bmagnien said:


> This was my rationale with my earlier post. Only way I see of getting burned with FE is if other boards allow more than 600w through their single 600w port via a bios upgrade that FE won’t have access to. I’m not technical enough to know if that’s even possible though.


That's the possible issue. Unless the new connector actually limits to 600w separate from the bios, there may be one or more XOC bios that allow more power and just ignore the 600w limit. The Galax and KP 1kw bios for the 3080ti/3090 (only 3090 for KP obviously) allow the 8pins to go waaay beyond 150w. 
I suspect there will be XOC bios in some form. It's possible only 2 connector cards would get one I suppose (Galax HOF and 4090 Ti come to mind).


----------



## J7SC

RetroWave78 said:


> I have a Phanteks block for my 2080 Ti, the nickel quality is better than EK but the fin array isn't as densely packed so there is less surface area and higher temps than what I get from EK on both the card that preceeded it (1080 Ti: EK) and the 3090 FE. I see around 43-45C load with both the 1080 Ti and 3090 and 46-49 with the 2080 Ti and TU-102 has more contact area with block owing to it's larger die.
> 
> That said the nickel plating of my $400 FE block from EK started coming off around 1 year mark.





ArcticZero said:


> Learned my lesson with EKWB on the 3090. Definitely doing my best to get another brand this time since I actually want my nickel plating to stay intact this time. This will also factor into my choice of AIB, I suppose.


...don't know about the Phanteks blocks for the 2080 Ti, but for the 3090 Strix, the Phanteks block has a much bigger GPU die coverage area (blocks sized ~ the same, red rectangle is identical in pic below). Also, the EK had nickel issues - with the camera flash 'on' (and a Phanteks X570 CPU block in the same frame), you see how thin the EK nickel coating is...there were other issues with the EK 3090 Strix block and backplate.

In any case, I've had a lot of other EK products that were fine but the RTX 3K Strix wasn't one of them. The Phanteks also dropped GPU temps by 3 - 4 C over the EK in the same loop / ambient temp.


----------



## bmagnien

mirkendargen said:


> Bykski N-ST4090TQ-X-V2 GPU BLOCK ZOTAC GAMING RTX4090
> 
> 
> Bykski 分体式水冷官方网站
> 
> 
> 
> 
> www.bykski.com
> 
> 
> 
> 
> 
> Bykski has a Zotac block of all things, I'm sure an FE block is coming. They had the first 3080/90 FE block if I recall, and make blocks for literally everything, even A100 compute cards.
> 
> And the Strix is up on Newegg for $1999, oof.



US $110.97 15％ Off | Bykski 4090 Series GPU Water Cooling Block , For ZOTAC GAMING RTX4090 Apocalypse OC , Liquid Cooler System, N-ST4090TQ-X


https://a.aliexpress.com/_mOhLjRa


----------



## mirkendargen

bmagnien said:


> US $110.97 15％ Off | Bykski 4090 Series GPU Water Cooling Block , For ZOTAC GAMING RTX4090 Apocalypse OC , Liquid Cooler System, N-ST4090TQ-X
> 
> 
> https://a.aliexpress.com/_mOhLjRa


Yeah it usually requires some shopping around to find the best Aliexpress price+shipping to the US, but it's definitely always cheap.


----------



## yzonker

Nevermind. Figured it out.


----------



## Arizor

Good to see some of the usual suspects in this thread from the 3090 owners!

Digital Foundry have uploaded a nice look at DLSS3; the decoupling of framerate and latency is going to be the big conundrum for most gamers I think: 




Also to those pondering Strix vs FE and watercooling, it should be noted Optimus is making WBs for both, just in case that's a factor.


----------



## PLATOON TEKK

Good points made on the FE vs Strix. Might just wait it out this time and see what the "true" power limits are for each card. If an XOC BIOS drops that "overrides" said power limits, chances are it won't run on the FE, though.

I wonder if Galax is even permitted to add a further power connector on the HOF 4090. I'm confident that's what the 4090 Tis will be.


----------



## Arizor

Yep @PLATOON TEKK . I really wish we knew the timeline for the inevitable 4090ti; if it was say, January-Feb, I'd probably be tempted.


----------



## yzonker

Arizor said:


> Yep @PLATOON TEKK . I really wish we knew the timeline for the inevitable 4090ti; if it was say, January-Feb, I'd probably be tempted.


Where AMD ends up relative to the 4090 may give a pretty good indication. If they're slower then probably not before next summer is my guess.


----------



## J7SC

I watched the Digital Foundry vid about DLSS3 from beginning to end. As mentioned there, DLSS3 can be thought of (sort of) as SLI-AFR given the alternate frame injection etc. eSport gamers might still be better off w/o it for reasons stated in the vid, but everyone else might really want it (myself included). As observed above, what AMD will bring out in terms of next-gen hardware - and also software (re. NV's DLSS3) should be very interesting and telling. Also, if AMD releases a true mGPU with two cores as the rumoured 500W 7900XT*X*, 4090 Ti might be released more quickly.

Apparently, finished 4090s were squirreled away in warehouses for months (sidebar: EVGA?), just waiting for the RTX 3K backlog to clear... so is there a secret stash of ready-to-go 4090 Tis somewhere?


----------



## TMavica

I got this and ready for 4090, model is FSP Hydro PTM Pro ATX3.0 PCIe5.0 1200W Platinum


----------



## Benni231990

Does anybody know of a water block manufacturer for the Strix yet?

EK has only the FE, and Alphacool only has the reference design


----------



## mirkendargen

Benni231990 said:


> Does anybody know a Waterblock Manufacturer for the Strix now?
> 
> EK hast only FE and Alphacool only has the Reference Design


I'm sure EK, Alphacool, and Byksi at a minimum will have a Strix block because they always make blocks for everything.


----------



## dr/owned

Drive by update: CableMod shipped my Corsair 3x connector-> 16 pin cable thingie from China. Looks like the AX1500i is going to keep riding.


----------



## Sir Beregond

mirkendargen said:


> I'm sure EK, Alphacool, and Byksi at a minimum will have a Strix block because they always make blocks for everything.


Also, blocks are not always ready day 1 of a new card release. Just depends. I am sure we'll see more by the end of the year.


----------



## GraphicsWhore

Benni231990 said:


> Does anybody know a Waterblock Manufacturer for the Strix now?
> 
> EK hast only FE and Alphacool only has the Reference Design


Optimus said they'll be releasing for the Strix.


----------



## dr/owned

Sir Beregond said:


> Also, blocks are not always ready day 1 of a new card release. Just depends. I am sure we'll see more by the end of the year.


I'm going to be happy just running on air until the block situation gets figured out. My desktop is remote-located now so I don't have to care about the noise. Bykski is out so I can wait for Barrow or maybe do Alphacool this go around (never had their GPU blocks but XPX CPU is incredible).

Side muttering: except whoever did the XPX mounting mechanism was lazy AF. The block design guy was a hydraulic genius though.


----------



## Sir Beregond

dr/owned said:


> I'm going to be happy just running on air until the block situation gets figured out. My desktop is remote-located now so I don't have to care about the noise. Bykski is out so I can wait for Barrow or maybe do Alphacool this go around (never had their GPU blocks but XPX CPU is incredible).
> 
> Side muttering: except whoever did the XPX mounting mechanism was lazy AF. The block design guy was a hydraulic genius though.


Pretty happy with my 3080 FE block from Alphacool. I did find the install a bit tedious to put on all the thermal pads, I will admit, but otherwise seems to be working pretty well.


----------



## RetroWave78

BigMack70 said:


> Here's hoping I can get a founders edition card... been really happy with my 3090 FE. Ampere was absolutely neutered for any real overclocking and with a launch TDP of 450W I expect that to be the case here also.


Base TDP is 450w, i.e. 100% PT is 450w; the FE can pull 600w, i.e. ~133% PT.



mirkendargen said:


> That video makes no sense and is complete fear mongering click bait.
> 
> 1. If the problem is the number of insertions the 16pin connector can survive, how the hell does a new PSU help? It's the same connector.
> 
> 2. He goes on and on about the 16 pin connector insertions, then is like "oh Nvidia said the problem was actually on the PSU side of the adapter" .....k?
> 
> 3. He talks about the sense pins missing. They're missing cause the adapter in his hand isn't actually a 12VHPWR adapter, the actual ones have the sense pins.
> 
> 4. He talks like the sense pins are some sort of constant 2 way negotiation with the GPU requesting power, the PSU saying what is available, etc. Then shows *on the screen* what the spec of the sense pins actually is. It's just a constant ground/neutral combo on the pins to let the GPU know what the PSU supports. No reason at all an adapter can't pass through the ground to the appropriate sense pins based on the number of 8pins connected to it.
> 
> 5. He says your GPU might pull <ridiculous large current> at boot up and fry your PSU. Huh? Then gives a hypothetical that your PSU has 2+ rails and you plug all the plugs to the adapter into the same one and bad things happen. What if you plug all the 8pins to your current GPU into that same rail....? What is suddenly different now?
> 
> At the end of the day, you either have a PSU that can handle a potentially 600w GPU or you don't, that's what actually matters. What plugs come out of it being some giant concern is just FUD.


Yes, this is Jay being Jay; remember, EVGA sells power supplies. 



bmagnien said:


> This was my rationale with my earlier post. Only way I see of getting burned with FE is if other boards allow more than 600w through their single 600w port via a bios upgrade that FE won’t have access to. I’m not technical enough to know if that’s even possible though.


If Ampere is any indication (and I believe it is), Lovelace is clocked very aggressively from the factory, probably right at the power/efficiency wall where more performance requires an exponential increase in power, i.e. 5% more clocks for 25% more current. Without a chiller or LN2 to bring temps down, 600W is about as hot as you'd want to go. I don't think people were dailying their 1kW VBIOS for gaming, to be honest.

600W is ridiculous. Look at the size of the 90 FE cooler: it's basically a 62mm-thick 240 radiator. As a rule of thumb, 250W of heat needs about one 120mm x 25mm-thick rad section to keep temps under 85C. The best air-cooled cards will trade blows with the FE in terms of temps, and 4 slots are required. Think about how much another 300W would require. 
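
For anyone wanting to sanity-check that rule of thumb, here's a quick sketch. The ~250W per 120mm x 25mm rad section figure is the rough heuristic from above, not a measured spec, so treat the output accordingly:

```python
import math

# Assumed rule of thumb from the post above: ~250 W of heat per
# 120 mm x 25 mm radiator section to keep core temps under ~85C.
WATTS_PER_120MM_SECTION = 250

def rad_sections_needed(heat_watts: float) -> int:
    """Rough number of 120 mm x 25 mm rad sections for a given heat load."""
    return math.ceil(heat_watts / WATTS_PER_120MM_SECTION)

print(rad_sections_needed(450))  # stock 450 W card -> 2 sections (a 240 rad)
print(rad_sections_needed(600))  # 600 W -> 3 sections (a 360 rad)
```

Which is roughly why the FE cooler ends up the size of a thick 240 rad at stock power, and why another 300W on top gets silly fast.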



J7SC said:


> ...don't know about the Phanteks blocks for the 2080 Ti, but for the 3090 Strix, the Phanteks block has a much bigger GPU die coverage area (blocks sized ~ the same, red rectangle is identical in pic below). Also, the EK had nickel issues - with the camera flash 'on' (and a Phanteks X570 CPU block in the same frame), you see how thin the EK nickel coating is...there were other issues with the EK 3090 Strix block and backplate.
> 
> In any case, I've had a lot of other EK products that were fine but the RTX 3K Strix wasn't one of them. The Phanteks also dropped GPU temps by 3 - 4 C over the EK in the same loop / ambient temp.
> View attachment 2573747


Fantastic information; the Phanteks block has more surface area, to be sure. I may have omitted that I'm using Conductonaut on the 3090 FE (EKWB) and Gelid GC Extreme on the 2080 Ti (Phanteks), but the Phanteks block is still outclassed here: the EKWB special edition block has a lot of thermal mass, good channeling, and a dense fin array, not to mention that the backplate is active (a section of it is thermally connected to the block) and the heat is quickly dissipated by the rear 140mm fan turned around as intake. The Phanteks block is pretty good, honestly, but I don't know that it's better than the EKWB Special Edition FE block. 



bmagnien said:


> US $110.97 15％ Off | Bykski 4090 Series GPU Water Cooling Block , For ZOTAC GAMING RTX4090 Apocalypse OC , Liquid Cooler System, N-ST4090TQ-X
> 
> 
> https://a.aliexpress.com/_mOhLjRa


Amazing information, sir, thank you. Zotac is definitely a card to consider, though I think FE availability will be better than Zotac, unfortunately. I could be mistaken; if I can't get an FE at launch I will probably try to get the Zotac and cancel my EKWB 4090 block pre-order. Which Zotac is this for, exactly?



Arizor said:


> Good to see some of the usual suspects in this thread from the 3090 owners!
> 
> Digital Foundry have uploaded a nice look at DLSS3; the decoupling of framerate and latency is going to be the big conundrum for most gamers I think:
> 
> 
> 
> 
> Also to those pondering Strix vs FE and watercooling, it should be noted Optimus is making WBs for both, just in case that's a factor.


DLSS 3.0 is going to be a pretty big deal because it overcomes CPU-limited scenarios. In the Kingpin Construction Site "benchmark" with a 12900K (presumably at stock clocks; Digital Foundry is Nvidia's PR liaison, after all), DLSS 2.0 only improved performance by 14% over native because performance wasn't GPU-limited, while DLSS 3.0 hit 180 FPS, roughly 200% faster. And the added latency was only about 13ms if I remember correctly (38ms vs 25ms). Third-person action-adventure games with a lot going on CPU-wise (e.g. Marvel's Spider-Man Remastered) will benefit heavily from DLSS 3.0 support. You're not going to feel a 10-20 or even 30ms latency difference; there are 1000ms in a second, so we're talking about a nearly imperceptible difference. Latency will only be a problem in esports, where every millisecond absolutely counts. 

DLSS 3.0 is absolutely the real deal. 

They've even managed to nearly eliminate the artifacting inherent in current frame-interpolation technology. This is technological magic. 

The problem is, devs will only implement the technology in games with heavy funding/reward from Nvidia, at least until the industry gets spoiled and demands widespread adoption. That may not happen, because very few can afford an Nvidia GPU this generation; many have already done the math and realized they can get a faster card for less than the lowest-tier Ada Lovelace card. 

DLSS 3.0 is the real deal, but it will have limited adoption early on. This is another technology AMD has no counter to at the moment: RDNA3 has no optical flow processors, and Ada Lovelace does frame interpolation much faster than is currently possible thanks to Optical Flow. We are talking about real-time frame interpolation. When I press the forward-seek button in Smooth Video Project there is a considerable delay (feels like 500ms) between my input and the frame seek finishing.
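
To put those latency numbers in perspective, here's some quick napkin math. The 25ms / 38ms / 180 FPS figures are from my memory of the DF video, so treat them as approximate:

```python
def frame_time_ms(fps: float) -> float:
    """Frame time in milliseconds for a given frame rate."""
    return 1000.0 / fps

# Approximate numbers recalled from the Digital Foundry video:
native_latency_ms = 25.0   # without frame generation
dlss3_latency_ms = 38.0    # with DLSS 3 frame generation
added_latency_ms = dlss3_latency_ms - native_latency_ms  # 13 ms extra

print(frame_time_ms(180))   # at 180 FPS each frame is only ~5.6 ms
print(added_latency_ms)     # the penalty is ~2-3 frames' worth of time
```

13ms out of the 1000ms in a second is why most people won't feel it, and why esports players, who count single milliseconds, will.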



PLATOON TEKK said:


> Good points made on the FE vs Strix. Might just wait it out this time and see what the “true” power limits are for each card. if an xoc bios drops and “overrides” said power limits; chances are that won’t run on FE though.
> 
> I wonder if Galax are even permitted to add a further power connector on the HOF 4090. Am confident that’s what the 4090tis will be.


More power will mainly benefit extreme overclocking; people aren't dailying a 1kW VBIOS on Ampere. Over 600W is chiller and LN2 territory.


Arizor said:


> Yep @PLATOON TEKK . I really wish we knew the timeline for the inevitable 4090ti; if it was say, January-Feb, I'd probably be tempted.


NV may respond with both a 4080 Ti and a 4090 Ti after RDNA3 launches to the public, if AMD's mid-tier card is as fast as the 4080 for $900 and their top tier is 20% faster than the 4090 at the same price. 

AMD went all-in on rasterization this time around and may offer a faster card for the same price or less. DLSS 3.0 will show Nvidia massively faster, but dev integration will depend on how much of the consumer base can even use / appreciate Optical Flow technology. 

It's going to be interesting. AMD stands a good chance of getting on top via an angry consumer-base that is tired of NV shenanigans. 

No one will buy the "12GB 4080" over a $900 card as fast as the 16GB 4080. Those who do buy the 90-class card find it a better value proposition vs. the outgoing 90-class card, are intrigued by the technology, want a better product, and can afford it. 

DLSS is better than FSR, and AMD has no Optical Flow. This is why I'm going NV again, even though I should buy AMD on principle. The masses will buy AMD, who will win this battle and take discrete GPU market dominance. NV will probably bow out; they have cosy contracts with the DoD and the automotive industry (e.g. Nancy Pelosi's PR-stunt visit to a Taiwanese semiconductor factory during heightened tension with China. Guess which manufacturer she visited? She has stock in Tesla, and Tesla's GPUs are NV.) 

Nvidia has lost the goodwill of the consumer base. They don't care anymore. They can't compete with AMD-based APU systems, where adoption of cheaper, inferior technologies like FSR is commonplace. 





J7SC said:


> I watched the Digital Foundry vid about DLSS3 from beginning to end. As mentioned there, DLSS3 can be thought of (sort of) as SLI-AFR given the alternate frame injection etc. eSport gamers might still be better off w/o it for reasons stated in the vid, but everyone else might really want it (myself included). As observed above, what AMD will bring out in terms of next-gen hardware - and also software (re. NV's DLSS3) should be very interesting and telling. Also, if AMD releases a true mGPU with two cores as the rumoured 500W 7900XT*X*, 4090 Ti might be released more quickly.
> 
> Apparently, finished-product 4090s were scurried away for months in the warehouses (sidebar: EVGA?), just waiting for the RTX 3K backlog to clear...so is there a secret stash of ready-to-go 4090 Tis somewhere ?


They already have it made, that and the 4080 Ti, 100% guaranteed. They only need to match whatever AMD has at the very upper tier, and NV is beating them to the punch. 




dr/owned said:


> Drive by update: CableMod shipped my Corsair 3x connector-> 16 pin cable thingie from China. Looks like the AX1500i is going to keep riding.


I would get the 4x PCIe version to be safe. I too made the mistake of ordering the 3x8-pin to 16-pin at first because I hastily wanted my order in ahead of the crowd, and I didn't catch the bit Nvidia put out about the adapter ("and a 4th 8 pin cable adapter is included for overclocking") until after I placed my order (I went back and changed my post to 4x8-pin after realizing this). 3x8-pin is 150W per cable; 4x8-pin is needed for access to more than 100% PT / 450W with AD102. 



dr/owned said:


> I'm going to be happy just running on air until the block situation gets figured out. My desktop is remote-located now so I don't have to care about the noise. Bykski is out so I can wait for Barrow or maybe do Alphacool this go around (never had their GPU blocks but XPX CPU is incredible).
> 
> Side muttering: except whoever did the XPX mounting mechanism was lazy AF. The block design guy was a hydraulic genius though.


The FE air cooler looks amazing. I'm curious what kind of temps I'll see at 100% PT / 450W. They said the core was running at 55C with DLSS 3.0 on in Cyberpunk 2077, and that it was only pulling 350W at stock clocks (presumably 2500 MHz core). 

I may keep mine under the air cooler until the 13900KS is out and I have a real excuse to drain my loop an additional time between now and Q1 2023. 

The FE air cooler, dude it's rad, like seriously


----------



## Arizor

I really wish the FE was even a possibility here in Australia, seems like only USA and Europe ever get any.

If I could grab one I'd probably keep it aircooled and see how things played out over the next 4 months, could probably sell it almost at cost and grab the 4090ti before putting it under water.


----------



## bmagnien

RetroWave78 said:


> What Zotac is this for exactly?


ZOTAC GAMING RTX4090 Apocalypse OC

Alphacools Reference block now available on AliExpress in addition to their website…might be cheaper here:
US $170.65 | Alphacool Eisblock Aurora Acrylic Water Block Compatible Nvidia Geforce RTX 4090 Reference Design Card Cooler With Backplate


https://a.aliexpress.com/_mNJnOek



I have one on preorder from their website but might cancel cuz I’m sketched that there’s no boards announced to be reference compatible yet.


----------



## yzonker

RetroWave78 said:


> Base TDP is 450w, i.e. 100 PT is 450w, FE can pull 600w, i.e. 120% PT.
> 
> 
> 
> Yes this is Jay being Jay, remember EVGA sells power supplies.
> 
> 
> 
> If Ampere is any indication, which I believe it is, Lovelace is similarly clocked very aggressively from the factory, probably right at the power / efficiency wall where more performance requires an exponential increase in power, i.e. 5% more clocks for 25% more current. Outside of LN2 where liquid nitrogen can be brought to bear on temps 600w is about as hot as you'd like to go, without needing a chiller or LN2. I don't think people were dailying their 1kw VBIOS for gaming to be honest.
> 
> 600w is ridiculous. Look at the size of the 90 FE cooler. It's basically a 62mm thick 240 radiator. 250w of power requires a 120x25mm thick rad to keep temps under 85C. The best air cooled cards will trade blows with the FE in terms of temps. 4x slots are required. Think about how much another 300w would require.
> 
> 
> 
> Fantastic information, the Phanteks block has more surface area to be sure. It's possible that I ommitted the fact that I am using Conductonaut on the 3090 FE (EKWB) and Gelid GC Extreme on the 2080 Ti (Phanteks) but also, Phanteks block is outclassed here, the EKWB special edition block has so much thermal mass, good channeling, and dense fin array not to mention that the backplate is active (a section of the backplate is thermally connected to the block) and the heat is quickly dissipated with the rear 140mm fan turned around as intake. Phanteks block is pretty good honestly, but I don't know that it's better than EKWB Special Edition FE block.
> 
> 
> 
> Amazing information sir, thank-you, Zotac is definitely a card to consider. I think FE availability will be better than Zotac unfortunately. I could be mistaken, if I can't get and FE at launch I will probably try to get Zotac and cancel my EKWB 4090 block pre-order. What Zotac is this for exactly?
> 
> 
> 
> DLSS3.0 is going to be a pretty big deal because it overcomes CPU limited performance scenario's as shown by DLSS 2.0 only able to improve performance by 14% over native during the Kingpin Construction Site "benchmark" with 12900k (presumably at stock clocks, Digital Foundry is Nvidia's PR liason after all) because the performance wasn't limited by the GPU. DLSS 3.0 was 180 FPS here, or 200% faster vs 14%, and the increased latency was only 10ms if I remember correctly (38ms vs 25ms). 3rd person action adventure games that have a lot going on CPU wise (i.e. Marvels Spiderman Remastered) will benefit heavily from DLSS3.0 support. You're not going to feel 10-20 even 30ms latency difference. There are 1000ms in a second. We are talking about nearly imperceptual latency difference. The latency will be a problem in E-Sports where every millisecond absolutely counts.
> 
> DLSS 3.0 is absolutely the real deal.
> 
> They've even managed to nearly completely reduce the artifacting inherent with current Frame Interpolation technology. This absolutely is technological magic.
> 
> Problem is, devs will only implement the technology in games with heavy funding / reward from Nvidia until the industry becomes spoiled and then demands widespread support / adoption, which may not happen because very few can afford an Nvidia GPU this generation, have already done the math and realized that they can have a faster card for less than the lowest tier Ada L. card.
> 
> DLSS3.0 is the real deal, but it will have limited support adoption early on. This is another technology that AMD has no counter to at the moment. RDNA3 has no optical flow processors. Ada L is doing frame interpolation so much faster than is currently possible owing to Optical Flow. We are talking about real time frame interpolation. When I press the forward seek button in Smooth Video Project player there is considerable delay (feels like 500ms) between my input and the frame seek finishing.
> 
> 
> 
> More power will provide benefit to extreme overclocking, people aren't dailying 1kw vbios on Ampere. Over 600w is chiller and LN2 territory.
> 
> 
> NV may respond with both 4080 Ti and 4090 Ti after RDNA3 launches to the public and AMD's mid tier card is as fast as the 4080 for $900 and their top tier is 20% faster than the 4090 at the same price.
> 
> AMD went all in on rasterization this time around and may offer a faster card for the same price or less. NVLSS 3.0 will show Nvidia massively faster but dev integration will depend on how much of the consumer-base can even use / appreciate Optical Flow technology.
> 
> It's going to be interesting. AMD stands a good chance of getting on top via an angry consumer-base that is tired of NV shenanigans.
> 
> No-one will buy the "12GB 4080" over a $900 card as fast as the 16GB 3080. Those who will buy the 90 card find it a better value proposition vs the outgoing 90 class card and are intrigued by the technology, want a better product, can afford to do so.
> 
> DLSS is better than FSR. AMD has no Optical Flow. This is why I'm going NV again, even though I should buy AMD on principle. The masses will by AMD who will win this battle and assume discrete GPU market dominance. NV will probably bow out, they have cosy contracts with DOD and the automotive industry (i.e. Nancy Pelosi PR stunt visit to Taiawanese Semi Conductor factory during heightened tension with China. Guess which manufacturer she visited? She has stock in Tesla. Tesla GPU is NV.).
> 
> Nvidia lost the goodwill of the consumer base. They don't care anymore. They can't compete with AMD based APU systems where adoption of cheaper, inferior technologies is commonplace (FSR).
> 
> 
> 
> 
> 
> They have it already made, that and the 4080 Ti, 100% guaranteed. They only need as much as AMD has at the very upper tier, and NV is beating them to the punch.
> 
> 
> 
> 
> I would get 4x PCIE to be safe. I too made the mistake of ordering 3x8 to 16pin micro at first because I hastily wanted to get my order in ahead of the crowd and early on I didn't get the bit that Nvidia gave out in regards to the adapter "and a 4th 8 pin cable adapter is included for overclocking" until after I placed my order (I went back and changed my post to 4x8 pin after realizing this). 3x8 pin is 150w per cable. 4x8 pin is needed for access to more than 100% PT / 450w with AD-102.
> 
> 
> 
> The FE air cooler looks amazing. I'm curious as to what kind of temps I will see at 100% PT / 450w. They said that core was running at 55C with DLSS 3.0 on in Cyberpunk 2077 and that it was only pulling 350w stock clocks (presumably 2500 MHz core).
> 
> I may keep my under the air cooler until 13900KS is out and I have a real excuse to drain my loop an additional time between now and 1st Quarter 2023.
> 
> The FE air cooler, dude it's rad, like seriously


A few counter points. 

600/450 = 1.333, so the FE's 600W limit would be a 133% power target, not 120%.

There are definitely people gaming at 600W with the KP 1kW BIOS. Not many, but it definitely is being done. A good loop can handle 600W easily: mine can hold about a 45C core at 600W with low-to-mid-20s ambient, so I could probably handle 700-800W without temps getting totally out of control. 

Zotac had more cards available during the beginning of the shortage than anyone. That's how I ended up with a Zotac 3090. They dropped cards every couple of days on their web store. This was November 2020.


----------



## Arizor

bmagnien said:


> ZOTAC GAMING RTX4090 Apocalypse OC
> 
> Alphacools Reference block now available on AliExpress in addition to their website…might be cheaper here:
> US $170.65 | Alphacool Eisblock Aurora Acrylic Water Block Compatible Nvidia Geforce RTX 4090 Reference Design Card Cooler With Backplate
> 
> 
> https://a.aliexpress.com/_mNJnOek


Wow, that looks almost exactly like Inno3D's iChill. Maybe Alphacool makes it for them?

Edit: And now I know what Alphacool's logo is, I see it's very clearly there on the bottom right corner


----------



## dr/owned

RetroWave78 said:


> I would get 4x PCIE to be safe. I too made the mistake of ordering 3x8 to 16pin micro at first because I hastily wanted to get my order in ahead of the crowd and early on I didn't get the bit that Nvidia gave out in regards to the adapter "and a 4th 8 pin cable adapter is included for overclocking" until after I placed my order (I went back and changed my post to 4x8 pin after realizing this). 3x8 pin is 150w per cable. 4x8 pin is needed for access to more than 100% PT / 450w with AD-102.


3x8-pin is even overkill for 600W. Corsair themselves sell a 2x8-pin 600W cable, because you only need 6 12V pins with 12VHPWR; 2x8-pin on the PSU end is 8 12V pins (their 8-pin PCIe cables only use 3 of the 4 12V pins).

There's no circumstance where you need a 4x8-pin adapter. I've run 800W through 6 12V pins and it was thermally fine. Molex rates the pins at 9A each.
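
Quick napkin math on that, assuming the ~9A-per-pin Molex figure above (actual contact ratings vary with terminal plating and wire gauge, so this is a sketch, not a spec):

```python
# Assumed per-pin rating from the post above; real Mini-Fit ratings
# depend on terminal type, circuit count, and wire gauge.
PIN_RATING_A = 9.0
RAIL_V = 12.0

def connector_capacity_w(live_12v_pins: int) -> float:
    """Rough thermal capacity given the number of current-carrying 12 V pins."""
    return live_12v_pins * PIN_RATING_A * RAIL_V

print(connector_capacity_w(6))  # 12VHPWR: 6 x 9 A x 12 V = 648 W
print(connector_capacity_w(3))  # one 8-pin PCIe carrying 3 live 12 V pins = 324 W
```

Which is why the 150W-per-8-pin figure people quote is a spec limit, not a thermal one, and why 6 pins can carry 600W+ fine.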


----------



## J7SC

yzonker said:


> A few counter points.
> 
> 600/450 =1.333
> 
> There are definitely people gaming at 600w with the KP 1kw bios. Not a bunch, but definitely is being done. A good loop can handle 600w easily. My loop can hold about 45C core at 600w with low to mid 20s ambient. So I could probably handle 700-800w without temps getting totally out of control.
> 
> Zotac had more cards available during the beginning of the shortage than anyone. That's how I ended up with a Zotac 3090. They dropped cards every couple of days on their web store. This was November 2020.


Good point about the rest of the loop also playing a big role - my 3090 Strix is in a loop w/ 1320 mm x 63 mm of rads (dual and triple core rads, multiple D5s, push-pull fans) and I could definitely re-use that loop 'as is' for a Strix 4090 / Ti. I even had TG-10 thermal putty saved up for the next-gen GPUs, though it's available again anyhow. BTW, temps in spoiler below for the ~450W vbios, but the 520W KPE is basically identical. Plenty of thermal putty on VRAM, VRM, and the back of the GPU die, plus extra heat sinks on the back.


Spoiler


----------



## yzonker

J7SC said:


> Good point about the rest of the loop also playing a big role - my 3090 Strix is in a loop w/1320 mmx 63 mm (dual and triple core rads, multiple D5s, push-pull fans) and I could definitely re-use that loop 'as is' for a Strix 4090 / Ti. I even had TG-10 thermal putty saved up for the next gen GPUs, but TG 10 thermal putty is available again anyhow. BTW, temps in spoiler below for ~ 450W vbios, but 520W KPE basically identical . Plenty of thermal putty on VRAM, VRM, back of GPU die plus extra heat sinks on the back.
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2573878


The bigger problem a lot of people have, including myself, is the room gets too friggin' hot with the GPU pumping out 500W+. I played most of the way through CP2077 at 500-550W before finally backing it off to around 450W because I was tired of the heat! Cooling problem between the chair and the keyboard. Lol.


----------



## Arizor

yzonker said:


> The bigger problem a lot of people have including myself is the room gets too friggin' hot with the gpu pumping out 500w+. I played most of the way through CP2077 at 500-550w before finally backing it off to around 450w because I was tired of the heat! Cooling problem between the chair and the keyboard. Lol.


You just need to uninstall the overclocked Kiroshi gorilla arms, choom; liquid-cooled Mantis blades will drop temps!


----------



## J7SC

yzonker said:


> The bigger problem a lot of people have including myself is the room gets too friggin' hot with the gpu pumping out 500w+. I played most of the way through CP2077 at 500-550w before finally backing it off to around 450w because I was tired of the heat! Cooling problem between the chair and the keyboard. Lol.


...well yeah - step 1 is to get rid of up to ~1200 W (W = heat energy) directly from the CPUs and GPUs (two systems in one case)...step 2 is to move the resulting heated air elsewhere: I built a separate cooling table that houses all the rads and pumps connected via QDs, and the 45+ 120 mm fans actually manage to move a lot of air quietly...the intake is near an open window and the air then moves towards another open window at the other end. The cooling table is hidden underneath a big table with glassware on top and the rads exhausts through on old metal wine rack - can't even hear it or see it. I just had to buy a new wine rack


----------



## Arizor

I like to imagine @J7SC has one of those spy rooms where he pulls on a book and all the furniture flips over to reveal a monstrous PC set up. But the twist is he’s not a spy, he just does it to hide his spending from the other half.


----------



## bmagnien

From Optimus:
We need to get final production sample from ASUS to confirm everything then we'll begin production, then open orders. Turnaround will be a few weeks after we get samples from ASUS, which is TBD.


----------



## J7SC

Arizor said:


> I like to imagine @J7SC has one of those spy rooms where he pulls on a book and all the furniture flips over to reveal a monstrous PC set up. But the twist is he’s not a spy, he just does it to hide his spending from the other half.


🥴


Spoiler


----------



## mirkendargen

yzonker said:


> The bigger problem a lot of people have including myself is the room gets too friggin' hot with the gpu pumping out 500w+. I played most of the way through CP2077 at 500-550w before finally backing it off to around 450w because I was tired of the heat! Cooling problem between the chair and the keyboard. Lol.


This is why I have tubing going into the wall down to the unfinished basement where pumps and radiators live


----------



## RetroWave78

dr/owned said:


> 3x8 pin is even overkill for 600W. Corsair themselves are selling a 2x8pin 600W cable, because you only need 6 12V pins with the 12VHPWR. 2x8pins on the PSU end is 8 12V pins (their 8 pin PCIe cables only use 3 of the 4 12V pins).
> 
> There's no circumstance where you need 4x8pins adapter. I've run 800W through 6 12V pins and it was thermally fine. Molex rates the pins at 9A each.


Great point, that was my understanding. It was (and is) so confusing: everywhere you look online it's stated that 150W is the limit per 8-pin PCIe plug, but I remember this controversy with the 30 series, and it was established that way more than 300W can safely be carried over 2x8-pin PCIe. 

Luckily I was able to get a refund for my 3x8-pin PCIe order from CableMod and replaced it with the 4x8-pin, even if it's completely unnecessary.


----------



## GRABibus

https://www.ginjfo.com/wp-content/uploads/2022/09/RTX4090_01-1.jpg


----------



## PLATOON TEKK

Arizor said:


> Yep @PLATOON TEKK . I really wish we knew the timeline for the inevitable 4090ti; if it was say, January-Feb, I'd probably be tempted.


Very true, Nvidia’s ambiguous release cycle keeps us all on edge ha. I definitely hope AMD is able to light the Ti fire under their asses.



RetroWave78 said:


> More power will provide benefit to extreme overclocking, people aren't dailying 1kw vbios on Ampere. Over 600w is chiller and LN2 territory.


Agreed 100%. I'm running a 6900W (23,543 BTU/hr) 6x chiller setup (one being 1.5HP) for benching, so the more watts the better in my case. All the other setups at the house are chilled and running the 1kW BIOS 24/7. Here's a further pic.
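
For anyone converting chiller capacities, watts to BTU/hr is just a constant factor (quick sketch; 3.412 BTU/hr per watt is the standard conversion):

```python
# Standard conversion: 1 W of continuous heat = ~3.412 BTU/hr.
BTU_HR_PER_WATT = 3.412

def watts_to_btu_hr(watts: float) -> float:
    """Convert a continuous heat load in watts to BTU per hour."""
    return watts * BTU_HR_PER_WATT

print(round(watts_to_btu_hr(6900)))  # ~23543 BTU/hr for the 6900 W setup
```

Handy when sizing chillers, since most of them are rated in BTU/hr or HP rather than watts.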


Spoiler: 6900w setup

















yzonker said:


> The bigger problem a lot of people have including myself is the room gets too friggin' hot with the gpu pumping out 500w+. I played most of the way through CP2077 at 500-550w before finally backing it off to around 450w because I was tired of the heat! Cooling problem between the chair and the keyboard. Lol.


Haha I absolutely feel your pain, I used to be dripping sweat when running benches, lookin like the pilot from Airplane. If I’m not mistaken, didn’t you run a chiller from another room at one point? I did that for all of my setups here and it’s been pretty effective (apart from destroying every wall and floor of my house lol). No heat, sound or moving parts (when using FSP WC hydro PSUs). 24/7 chilled temps and 1k bios (also wireless power remotes to turn off pumps and chillers when systems aren’t in use).

If the 4090s are indeed limited by wattage, the chillers will still help by keeping card temps much lower, allowing higher sustained clocks (especially at less than the 1245W power draw of 2x 3090s plus an OC'd CPU).



Spoiler: Low Power (C&C voice) + 1 of 5 chiller setups


----------



## Arizor

Haha knew it was only a matter of time before @PLATOON TEKK flexed that setup, amazing, wish I could do even 1/3 of that…!


----------



## PLATOON TEKK

Arizor said:


> Haha knew it was only a matter of time before @PLATOON TEKK flexed that setup, amazing, wish I could do even 1/3 of that…!


haha you know I had to! You guys are prob the only people on earth who care or even understand it lol

also, genuinely let me know if you plan on it and if there’s anyway I can help (I promise I’m not sponsored by Koolance lol)


----------



## Arizor

Hahaha you'll be the first person to call buddy!


----------



## yzonker

PLATOON TEKK said:


> Very true, Nvidia’s ambiguous release cycle keeps us all on edge ha. I definitely hope AMD is able to light the Ti fire under their asses.
> 
> 
> 
> Agreed 100%. I’m running a 6900w (23,543 BTU) 6x chiller setup (one being 1.5HP) for benching , so the more watts the better in my case. All the other setups at the house are chilled and running 1k bios 24/7. Here’s a further pic.
> 
> 
> Spoiler: 6900w setup
> 
> 
> 
> 
> View attachment 2573920
> 
> 
> 
> 
> 
> 
> Haha I absolutely feel your pain, I used to be dripping sweat when running benches, lookin like the pilot from Airplane. If I’m not mistaken, didn’t you run a chiller from another room at one point? I did that for all of my setups here and it’s been pretty effective (apart from destroying every wall and floor of my house lol). No heat, sound or moving parts (when using FSP WC hydro PSUs). 24/7 chilled temps and 1k bios (also wireless power remotes to turn off pumps and chillers when systems aren’t in use).
> 
> If the 4090s are indeed limited by wattage, the chillers will still benefit in keeping the card temps much lower allowing for higher sustained clocks (especially with less than 1245w power draw from x2 3090s and OC cpu).
> 
> 
> 
> Spoiler: Low Power (C&C voice) + 1 of 5 chiller setups
> 
> 
> 
> 
> View attachment 2573932
> 
> View attachment 2573933


I still have the chiller in a separate room; I just don't use it all the time. A lot of times, if I'm only going to game for 30-60 min, I don't bother with it, but the room will still hit 26-27C in that short amount of time. 

I was partly just lamenting the fact that I need to use a remote cooler, whether that is a chiller or rads, just to be comfortable while gaming.


----------



## PLATOON TEKK

yzonker said:


> I still have the chiller in a separate room. I just don't use it all the time. A lot times if I'm going to just game for 30-60min I don't bother with it, but the room will still hit 26-27C in that short amount of time.
> 
> I was partly just lamenting the fact that I need to use a remote cooler, whether that is a chiller or rads, just to be comfortable while gaming.


Got you, I guess it depends on the setup and if ALL components are water cooled. The liquid stays ambient with the chillers/pcs off so when they are turned on for use, cooling down to a relatively lower temperature takes 5 mins max (from my experience with the chillers here). Also, the PCs can be used immediately as long as the pumps and chiller are on (even when not at target temp yet).

Furthermore, the PCs are literally "cool" to the touch (and 100% silent apart from coil whine), since everything including the motherboard (VRM etc.) and PSU is water cooled with monoblocks. The silence is so abnormal that with no LEDs running it's hard to tell the PCs are even powered on. I've noticed that even using a non-WC PSU or non-WC mobo (the only non-WC component in that scenario) the room temp can increase. Also, when a chiller is active, the loop and tubing never "heat up" after extended use since the liquid temp is maintained (at the cost of power and heat in a different room, obviously). But when there is not a single fan spinning (mobo, GPUs, PSU, RAM, PCI cards, SSDs, etc.) and everything is cooled by water, the minimal radiant heat can be absolutely ignored.

edit: wait, ignore the above if you meant the room heats up when NOT using a chiller lol. My bad I misunderstood that


----------



## J7SC

yzonker said:


> I still have the chiller in a separate room. I just don't use it all the time. A lot of times, if I'm just going to game for 30-60 min, I don't bother with it, but the room will still hit 26-27C in that short amount of time.
> 
> I was partly just lamenting the fact that I need to use a remote cooler, whether that is a chiller or rads, just to be comfortable while gaming.





PLATOON TEKK said:


> Got you, I guess it depends on the setup and if ALL components are water cooled. The liquid stays ambient with the chillers/pcs off so when they are turned on for use, cooling down to a relatively lower temperature takes 5 mins max (from my experience with the chillers here). Also, the PCs can be used immediately as long as the pumps and chiller are on (even when not at target temp yet).
> 
> Furthermore, the pcs are literally ”cool” to the touch (and 100% silent apart from coil whine) since everything including motherboard (vrm etc) and psu are water cooled with monoblocks. The silence is so abnormal that when running no LEDs it’s hard to tell if the pcs are powered on, when switched. I’ve noticed that even when using a non WC psu or non WC mobo (only non WC component in that scenario) that the room temp can increase. Also when a chiller is active, the loop or tubing never “heats up” after extended use since the liquid temp is maintained (at the cost of power and heat in a different room obv). However, when there is not a single fan spinning (including mobo, all gpus, psu, ram, pci cards, ssds etc) and everything is being cooled by water, the minimal radiant heat can be absolutely ignored.
> 
> edit: wait, ignore the above if you meant the room heats up when NOT using a chiller lol. My bad I misunderstood that


...you folks could run Linux...


Spoiler
...else, get a Roboclocker


----------



## RetroWave78

GRABibus said:


> https://www.ginjfo.com/wp-content/uploads/2022/09/RTX4090_01-1.jpg


$2k, for what? Plastic RGB cooler?

Also, on the comments about heat: to be honest, I fell in love with my Steam Deck over the summer here. The PC stayed off 95% of the time. Hot Wheels Unleashed, Cuphead, Wolfenstein: New Colossus, even Doom Eternal and other newer AAA titles run at 60 FPS. Power consumption is 30w. I can play from my bed, anywhere, even the bathroom. Recently picked up Octopath Traveller on sale; what a joy on this handheld.

I see trends fairly clearly. DLSS days are numbered unless someone counters with an NV APU device.

I'm absolutely not kidding here. We are talking 60 FPS AAA gaming anywhere, $600, 30w power/heat. The touch pads and gyro are amazing when you get used to them. The right touch pad is like a mouse; coupled with gyro aiming, the novelty of it in, say, Wolfenstein is unrivaled.

Oh I forgot Sunset Overdrive, what an absolute blast and looks beautiful on this display, also 60 FPS everywhere.

Also, you can suspend the system, it's like game saving on steroids. When you wake the system you are right where you left off, no waiting for Windows and the game to load.

It's also a Linux-based PC, which is cool in itself if you've only been around Windows PCs, and with the dock it can basically function as a PC with mouse, keyboard and monitor; just keep your expectations low with the monitor, as it does 1280x800 native.

Seriously, Nvidia is done. They overplayed their hand and now have to sell the TSMC inventory they bought (don't forget they actually tried to back out of the deal), and now they are pricing their products at "we don't give a **** about you because we are over this" prices. For them, clearing 30 series stock is taking precedence over maintaining trust, goodwill and interest from the PC gaming consumer base. Ethereum going PoS / crypto mining with GPUs ending is basically the final nail in the coffin for this industry. PC gaming will become incredibly niche as no one will be able to afford it. Like I said, I bought both my 2080 Ti and 3090 with the expectation of recouping the cost via mining (to heat my apartment), which I actually did and then some at $10 a day for 2.5 years. Now I'm looking at a $1600 GPU, no recouping of expense; I now use my Steam Deck 90% of the time, and I know I'm not alone in this conclusion.

We are being dragged kicking and screaming into Klaus Schwab's Great Reset "you will have nothing and be happy" where there is mass poverty. It's the whole point of the Fake Pandemic, the intentional destruction of the Global Economy. Nvidia knows that the near-term prospects of selling power-hungry $1k+ discrete GPUs in an economic environment that exceeds the Great Depression in privation terms are exceedingly grim, and they are shifting their focus to data center, AI and automotive (oh, the millionaire members of Congress working for Big Pharma will still be buying Teslas, for a little while at least).

I get into the Great Reset at great length below if anyone is interested. My PC can be seen in earlier videos.


----------



## bmagnien

B&H Photo:
Bryant: Ok. Do you have an eta as to when the cards will be on sale?
David M: October 12
Bryant: Thanks. Any specific time?
David M: 9AM EST
Bryant: Thank you very much. Have a great day David
------------End Transcript------------


----------



## yzonker

Speaking of Phanteks, 









Phanteks announces Glacier G40 water blocks for GeForce RTX 4090 GPUs - VideoCardz.com


PHANTEKS UNVEILS RTX 4000 WATER BLOCKS Rotterdam, The Netherlands – September 29th, 2022 – With the new generation of GeForce RTX 4000 cards announcement, Phanteks today unveils the upcoming Glacier G40 GPU water blocks for RTX 4000 Series cards. The Glacier G40 GPU blocks bring the ultimate...




videocardz.com


----------



## J7SC

yzonker said:


> Speaking of Phanteks,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Phanteks announces Glacier G40 water blocks for GeForce RTX 4090 GPUs - VideoCardz.com
> 
> 
> PHANTEKS UNVEILS RTX 4000 WATER BLOCKS Rotterdam, The Netherlands – September 29th, 2022 – With the new generation of GeForce RTX 4000 cards announcement, Phanteks today unveils the upcoming Glacier G40 GPU water blocks for RTX 4000 Series cards. The Glacier G40 GPU blocks bring the ultimate...
> 
> 
> 
> 
> videocardz.com


I like the 'side oiler' option for the fittings  ...should make options for system building even more numerous.


----------



## tubs2x4

RetroWave78 said:


> $2k, for what? Plastic RGB cooler?
> 
> Also, comments about heat, to be honest I fell in love with my Steam Deck over the summer here. Pc stayed off 95% of the time. Hot Wheels Unleashed, Cuphead, Wolfenstein: New Colossus even Doom Eternal and other newer AAA titles run at 60 FPS. Power consumption is 30w. I can play from my bed, anywhere, even bathroom. Recently picked up Octopath Traveller on sale, what a joy on this handheld.
> 
> I see trends fairly clearly. DLSS days are numbered unless someone counters with an NV APU device.
> 
> I'm absolutely not kidding here. We are talking 60 FPS AAA gaming anywhere, $600, 30w Power/ heat. The touch pads and gyro are amazing when you get used to them. Right touch pad is like a mouse, coupled with gyro aiming, the novelty of it in say Wolfenstein, its unrivaled.
> 
> Oh I forgot Sunset Overdrive, what an absolute blast and looks beautiful on this display, also 60 FPS everywhere.
> 
> Also, you can suspend the system, it's like game saving on steroids. When you wake the system you are right where you left off, no waiting for Windows and the game to load.
> 
> It's also Linux based PC, which is cool in itself if you've only been around Windows PC, and with dock can basically function as PC with mouse, keyboard and monitor, just keep your expectations low with monitor as it does 1280x800 native.
> 
> Seriously, Nvidia is done. They overplayed their hand and now have to sell the TSMC inventory they bought (don't forget they actually tried to back out of the deal), and now they are pricing their products at "we don't give a **** about you because we are over this" prices. For them, clearing 30 series stock is taking precedence over maintaining trust, goodwill and interest from the PC gaming consumer base. Ethereum going PoS / crypto mining with GPUs ending is basically the final nail in the coffin for this industry. PC gaming will become incredibly niche as no one will be able to afford it. Like I said, I bought both my 2080 Ti and 3090 with the expectation of recouping the cost via mining (to heat my apartment), which I actually did and then some at $10 a day for 2.5 years. Now I'm looking at a $1600 GPU, no recouping of expense; I now use my Steam Deck 90% of the time, and I know I'm not alone in this conclusion.
> 
> We are being dragged kicking and screaming into Klaus Schwab's Great Reset "you will have nothing and be happy" where there is mass poverty. It's the whole point of the Fake Pandemic, the intentional destruction of the Global Economy. Nvidia knows that the near-term prospects of selling power-hungry $1k+ discrete GPUs in an economic environment that exceeds the Great Depression in privation terms are exceedingly grim, and they are shifting their focus to data center, AI and automotive (oh, the millionaire members of Congress working for Big Pharma will still be buying Teslas, for a little while at least).
> 
> I get into the Great Reset at great length below if anyone is interested. My PC can be seen in earlier videos.


This current realm we are in runs on illusions.


----------



## WayWayUp

RetroWave78 said:


> Seriously, Nvidia is done. They overplayed their hand and now have to sell the TSMC inventory they bought (don't forget they actually tried to back out of the deal), and now they are pricing their products at "we don't give a **** about you because we are over this" prices. For them, clearing 30 series stock is taking precedence over maintaining trust, goodwill and interest from the PC gaming consumer base. Ethereum going PoS / crypto mining with GPUs ending is basically the final nail in the coffin for this industry. PC gaming will become incredibly niche as no one will be able to afford it. Like I said, I bought both my 2080 Ti and 3090 with the expectation of recouping the cost via mining (to heat my apartment), which I actually did and then some at $10 a day for 2.5 years. Now I'm looking at a $1600 GPU, no recouping of expense; I now use my Steam Deck 90% of the time, and I know I'm not alone in this conclusion.


I think you're confused.

You don't have to buy a brand-new GPU. If it were up to Nvidia, they wouldn't even release new GPUs right now. They can't price the 4080 to compete with the 3000 series; otherwise not only would they lose a boatload of money (as the 3000 series would stop selling completely), but their partners holding inventory would also take a hit.
The situation sucks, but I'm 95% certain 4080 prices will get slashed in the first half of 2023 as 30-series inventory clears.

_"clearing 30 series stock is taking precedence over maintaining trust, goodwill and interest from PC gaming consumer base"_
Absolutely false. 3080 owners are feeling great right now about their purchase. Most feel like they can skip a generation and also feel justified in their purchase.
And halo chasers will be thrilled with the 4090. It's a massive upgrade in all regards, and its price bump over the previous gen ($100) is less than inflation.

_PC gaming will become incredibly niche as noone will be able to afford it._

Right now high-end PC gaming is ridiculous in all regards: DDR5 RAM, new mobos, CPU prices, etc. This is just transitory.
Prices will come down in a big way, and just because someone can't afford a brand-new 4090 at day-1 prices doesn't mean PC gaming is unaffordable.
You NEED to play at 4K 120fps with max ray tracing to play a game?
Core gamers can pick up a 3060/Ti at decent prices and play even the most demanding games with very high settings. They can keep their DDR4, they can get amazing performance from a 12700K, etc.

The price hikes TSMC is passing on and the inventory situation with the 3xxx series are temporary. There are no long-term implications for PC gaming.


----------



## yzonker

FE finally listed, 






NVIDIA GeForce RTX 4090 24GB GDDR6X Graphics Card Titanium and black 900-1G136-2530-000 - Best Buy


Shop NVIDIA GeForce RTX 4090 24GB GDDR6X Graphics Card Titanium and black at Best Buy. Find low everyday prices and buy online for delivery or in-store pick-up. Price Match Guarantee.




www.bestbuy.com


----------



## bmagnien

Were FE cards ever sold in store without an online order at BB?


----------



## ArcticZero

yzonker said:


> Speaking of Phanteks,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Phanteks announces Glacier G40 water blocks for GeForce RTX 4090 GPUs - VideoCardz.com
> 
> 
> PHANTEKS UNVEILS RTX 4000 WATER BLOCKS Rotterdam, The Netherlands – September 29th, 2022 – With the new generation of GeForce RTX 4000 cards announcement, Phanteks today unveils the upcoming Glacier G40 GPU water blocks for RTX 4000 Series cards. The Glacier G40 GPU blocks bring the ultimate...
> 
> 
> 
> 
> videocardz.com


I actually like the look of these. Definitely considering them especially now that we won't be needing an active backplate like on the 3090. 😅


----------



## dr/owned

So looks like Best Buy has a few models listed now. I looked into the Gigabyte variants:

The Windforce and the OC Gaming seem like they're about the same design. The OC Gaming costs $100 more, has some RGB. Maybe an extra heatpipe. The OC Gaming maybe goes through a little more effort to put heatsink contact on the VRM instead of using thicker thermal pads.

They use an aluminum midplate to transfer heat from the VRAM into the heatsink. This is actually OK because it means thinner thermal pads and acts as a sort of "stock" copper shim mod. Gigabyte didn't do this on the 3000 series, but MSI did.

TLDR: I think the Windforce is probably the better buy. Especially if you're waterblocking the thing anyways.

Plus the Aorus Master, which isn't listed yet, but holy hell it's such a tall heatsink (way beyond the PCB height) that I'm not even sure a lot of cases can fit it:










The Strix looks more like the OC Gaming in terms of dimensions so maybe the Master is going to be better air cooling or maybe it's complete overkill.


Also of note, they don't use the spring-loaded backplate approach. The entire mounting pressure comes from the screw threads alone... so it's not necessarily a "problem," but a spring backplate makes it a lot more brain-dead-simple to tighten to the right mounting pressure.

I don't think the VRM design of Strix vs. Tuf vs. Master vs. whatever is really going to make a difference. Any VRM design that NV is approving is going to support well beyond 600W. Even the base model 3090 designs were good for 1000W with a waterblock.


----------



## dr/owned

Strix 4090: 12+4 phases
Tuf 4090: 10+4 phases

Both with 2 power stages per phase, so even the TUF is going to easily be able to handle 1000W+. They're also going high with a $400 tax on the Strix; if the Newegg price is real, there's no way anyone should pay that.


----------



## ArcticZero

I would be happy with any Asus, or the MSI Suprim, if the build quality was the same as the 3000 series. I know Galax is a good option too as far as component selection goes, but I'm not sure how good their RMA service generally is.


----------



## yzonker

Not sure what they are hinting at (I linked to a time stamp),






Maybe nothing different; they basically said as much, but I thought it was interesting that they hinted at something. (Yes, I'm bored just waiting for the release/reviews.)


----------



## bmagnien

yzonker said:


> Not sure what they are hinting at (I linked to a time stamp),
> 
> 
> 
> 
> 
> 
> Maybe nothing different, they basically said that but thought it was interesting they hinted at something. (yes, bored just waiting for the release/reviews)


Reserving binned FE chips?
More FE availability to cut into AIB marketshare?
Not much left really that NV hasn’t already done


----------



## mirkendargen

bmagnien said:


> Reserving binned FE chips?
> More FE availability to cut into AIB marketshare?
> Not much left really that NV hasn’t already done


There were already rumors Nvidia was binning FE chips based on clocks people were getting at lower voltage to stay inside the lower power limit on 3090FE's.


----------



## yzonker

bmagnien said:


> Reserving binned FE chips?
> More FE availability to cut into AIB marketshare?
> Not much left really that NV hasn’t already done


Did the FE reviews for the 3080 release a day earlier than the rest, being the only card with reviews before launch day? 

Looking at the TPU review database (scroll down). 









TechPowerUp


Our review database with over 35,000 product tests helps you find expert opinions on RTX 3080 and other hardware quickly.




www.techpowerup.com


----------



## tubs2x4

yzonker said:


> FE finally listed,
> 
> 
> 
> 
> 
> 
> NVIDIA GeForce RTX 4090 24GB GDDR6X Graphics Card Titanium and black 900-1G136-2530-000 - Best Buy
> 
> 
> Shop NVIDIA GeForce RTX 4090 24GB GDDR6X Graphics Card Titanium and black at Best Buy. Find low everyday prices and buy online for delivery or in-store pick-up. Price Match Guarantee.
> 
> 
> 
> 
> www.bestbuy.com


Geesh I need like $2200 Canadian pesos for that FE.


----------



## MrTOOSHORT

$2900 Canadian pesos for the Strix 4090. On my list, but not sure for that price when the TUF is $500 less.


----------



## J7SC

MrTOOSHORT said:


> $2900 Canadian pesos for the Strix 4090. On my list, but not sure for that price when the TUF is $500 less.


...bottom > Strix OC after exchange rate


----------



## Arizor

Expect us to get well and truly, royally bent over here in Australia too. Pass the lube round boys.


----------



## yzonker

Yeah, that's quite the early-adopter tax Asus decided to go with. Upside: should be easier to get one if that's what you want and don't mind dropping the soap. 😯


----------



## J7SC

Canadian prices shown above are actually more or less in line with US - after exchange, Nvidia 4090 @ US$1599 = CAN$2207. I just want AMD to hurry it up with the 7900XT and maybe spring a surprise with the 7900XTX, so that the Strix 4090 Ti gets released quicker and at, ahem, reasonable prices.


----------



## Arizor

Yeah pushing the Ti for January or near enough would be my ideal scenario.


----------



## dr/owned

Arizor said:


> Expect us to get well and truly, royally bent over here in Australia too. Pass the lube round boys.


----------



## Rbk_3

dr/owned said:


> So looks like Best Buy has a few models listed now. I looked into the Gigabyte variants:
> 
> The Windforce and the OC Gaming seem like they're about the same design. The OC Gaming costs $100 more, has some RGB. Maybe an extra heatpipe. The OC Gaming maybe goes through a little more effort to put heatsink contact on the VRM instead of using thicker thermal pads.
> 
> They use an aluminum midplate to transfer heat from the VRAM into the heatsink. This is actually OK because it means thinner thermal pads and acts as a sort of "stock" copper shim mod. Gigabyte didn't do this on the 3000 series, but MSI did.
> 
> TLDR: I think the Windforce is probably the better buy. Especially if you're waterblocking the thing anyways.
> 
> +The Aorus Master which isn't listed yet, but holy hell it's such a tall heatsink (way beyond the PCB height) that I'm not even sure a lot of cases can fit that:
> 
> View attachment 2573984
> 
> 
> The Strix looks more like the OC Gaming in terms of dimensions so maybe the Master is going to be better air cooling or maybe it's complete overkill.
> 
> 
> Also of note, they don't use the spring back plate thing. The entire mounting pressure comes from the screw threads alone...so it's not necessarily a "problem" but the spring backplate makes it a lot more brain dead to tighten it up to the right mounting pressure.
> 
> I don't think the VRM design of Strix vs. Tuf vs. Master vs. whatever is really going to make a difference. Any VRM design that NV is approving is going to support well beyond 600W. Even the base model 3090 designs were good for 1000W with a waterblock.



I got a preorder in for the Windforce when it was up on Canada Computers Friday. Not my first choice, but it seems solid. After some research, it appears the cooler they're using is the exact same one used on the 3090 Ti Gaming OC. At about the same TDP, I expect temps in the high 60s under 100% load, based on reviews of that card.

The 4090 Gaming OC is a little larger and has 2 more heat pipes. How much that will improve cooling performance I'm not sure, but I missed the preorder on that one; it was up as well. It seems to have sold out super fast, so I wonder if inventory on the Windforce is going to be higher, since they look to be reusing the old coolers.


----------



## WayWayUp

The way I see it, DLSS 3 is like an improved version of SLI.

If Nvidia sold the 4090 as a multi-GPU on the same die at the same cost, it still wouldn't be as good as DLSS 3, so I don't exactly understand the hate it's been getting.

Neither would be perfect, but DLSS 3 would get more FPS than dual GPU, consume significantly less power (it actually reduces power consumption), and wouldn't be affected by microstutter or need to deal with driver support.
While it's true there is a latency effect, it's actually still less latency than native with no DLSS, as seen in the recent Digital Foundry release.

Imagine if Nvidia announced a dual-GPU version of the 4090 for the same cost with magical powers that don't increase TDP; everyone would go crazy for it and call it the greatest GPU release ever.
Yet with DLSS 3 everyone is like *** is this crap.

I don't get it.

It's a game changer. And if it gets enough game support, I can imagine more 8K monitors will hit the market. With the 4090 and upcoming Ti, 8K seems like an actual thing for all games that support DLSS 3.0, and DLSS Quality mode with 4K native upscaling to 8K will look amazing.


----------



## Rbk_3

Gaming OC up on Canada Computers for preorder. I called to confirm, and the stores actually already have it in stock in the back, ready to go for the 12th.






GIGABYTE GeForce RTX 4090 GAMING OC 24G


GIGABYTE GeForce RTX 4090 GAMING OC 24G Graphics Card, 3x WINDFORCE Fans, 24GB 384-bit GDDR6X, GV-N4090GAMING OC-24GD Video Card




www.canadacomputers.com


----------



## Krzych04650

WayWayUp said:


> The way I see it, DLSS 3 is like an improved version of SLI.
> 
> If Nvidia sold the 4090 as a multi-GPU on the same die at the same cost, it still wouldn't be as good as DLSS 3, so I don't exactly understand the hate it's been getting.
> 
> Neither would be perfect, but DLSS 3 would get more FPS than dual GPU, consume significantly less power (it actually reduces power consumption), and wouldn't be affected by microstutter or need to deal with driver support.
> While it's true there is a latency effect, it's actually still less latency than native with no DLSS, as seen in the recent Digital Foundry release.
> 
> Imagine if Nvidia announced a dual-GPU version of the 4090 for the same cost with magical powers that don't increase TDP; everyone would go crazy for it and call it the greatest GPU release ever.
> Yet with DLSS 3 everyone is like *** is this crap.
> 
> I don't get it.
> 
> It's a game changer. And if it gets enough game support, I can imagine more 8K monitors will hit the market. With the 4090 and upcoming Ti, 8K seems like an actual thing for all games that support DLSS 3.0, and DLSS Quality mode with 4K native upscaling to 8K will look amazing.


Nobody sane thinks it is crap. We haven't even had a chance to test it ourselves yet, so it is not even possible to have a definitive opinion. People who do have one already and hate on it right now do that as a compensation mechanism for the fact that they cannot afford those cards. Those are the exact same people who were talking about 900W GPUs, needing 3 power supplies, new power circuits for the house, or whatever other absurd things supposedly needed just to be able to have a new GPU. They were already trying to cope months before those cards were even announced and DLSS 3 unveiled. They would be doing that no matter what. The worse they can make those cards seem, the better.

In theory DLSS 3 is real mind-blowing stuff. For example, DLSS Performance already creates frames from just a quarter of the pixels, and now putting completely made-up frames in between basically means that only 1/8th of the picture is actually rendered and the remaining 7/8ths is made up. And while DLSS Performance is not that good, DLSS Quality is, and it consistently produces better and more stable images than native TAA at this point; even with DLSS 3, in Quality mode only about a quarter will be rendered and the rest made up. For something like this to even look presentable in real-time gaming is insane, let alone directly comparable. And if that wasn't enough, it works "outside of the engine," so you can get that frame doubling even in CPU-bound scenarios and potentially even in 60 FPS locked games.
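Those fractions can be sanity-checked with a bit of arithmetic. A minimal sketch, assuming the commonly cited DLSS render scales (1/2 per axis for Performance, roughly 2/3 per axis for Quality) and that frame generation synthesizes every other frame; it deliberately ignores that the "made-up" pixels are reconstructed from real motion data rather than conjured from nothing:

```python
# Rough sketch of how much of the displayed image comes from real
# rendering under DLSS upscaling plus frame generation. The render
# scales per mode are assumptions for illustration, not NVIDIA spec.

def rendered_fraction(scale_per_axis: float, frame_gen: bool) -> float:
    """Fraction of displayed pixels that are actually rendered."""
    pixels = scale_per_axis ** 2          # per-frame fraction of pixels rendered
    frames = 0.5 if frame_gen else 1.0    # half the frames are synthesized
    return pixels * frames

print(f"Performance + FG: {rendered_fraction(1 / 2, True):.3f}")  # 0.125, the 1/8 above
print(f"Quality + FG:     {rendered_fraction(2 / 3, True):.3f}")  # 0.222, about a quarter
```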

The potential is definitely immense, and while it will surely have some problems and won't be suitable for everything (for example, if you are after maximum speed and minimum latency it does nothing for you, as it does not improve latency despite higher framerates and may even increase it slightly in games where Reflex is not as effective), it will be a complete game changer for many games. Given what was already shown by Digital Foundry, I don't really see what problem you could have with running it in games like The Witcher 3, or any story-driven single-player game for that matter. Or why you would not use it in some very CPU-bound MMO that has terrible native performance to begin with. Or in a 60 FPS locked game.

We have been doing a lot crazier and more problematic things in the past to increase performance. Doubling the performance at similar latency without requiring second GPU and with some occasional motion artifacts instead of potential microstutter or flicker sounds reasonable to me.


----------



## mirkendargen

dr/owned said:


> Strix 4090: 12+4 phases
> Tuf 4090: 10+4 phases
> 
> Both with 2 power stages per phase soooooo even the TUF is going to easily be able to handle 1000W+. They're also high with a $400 tax on the Strix. If the Newegg price is real there's no way anyone should pay that.


Where'd you find the power stage info? Did they say anything else about the cards? All the AIB's are being super tight-lipped this time around.


----------



## dr/owned

mirkendargen said:


> Where'd you find the power stage info? Did they say anything else about the cards? All the AIB's are being super tight-lipped this time around.


PR: ASUS introduces GeForce RTX 4090/4080 ROG STRIX and TUF GPUs - VideoCardz.com

FE looks like it's probably 10+3 or something: NVIDIA GeForce News

"and upgrading to a 23-phase power supply"

Zotac Amp Extreme: 24+4 phase: Zotac Announces GeForce RTX 4090 and RTX 4080 in AMP Extreme AIRO, Trinity OC, and Trinity Editions

(also only the Strix says 70A power stages, but even if others are using 60A or 50A and operating them at 30A per stage, that's still >800W of power delivery easily. 22 70A stages is ridiculous overkill.)
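As a rough illustration of why those stage counts are overkill: deliverable power at the core rail is roughly stages x per-stage current x core voltage. A sketch, where the 22-stage and 70A figures come from the discussion above, and the ~1.1 V core voltage and conservative 35 A operating point are my assumptions, not AIB specs:

```python
# Back-of-the-envelope GPU VRM headroom: stage count x per-stage
# current x core-rail voltage. Stage count and 70 A rating are from
# the post above; ~1.1 V vcore and the derated 35 A operating point
# are illustrative assumptions.

def vrm_headroom_w(stages: int, amps_per_stage: float, vcore: float = 1.1) -> float:
    """Approximate deliverable power at the GPU core rail, in watts."""
    return stages * amps_per_stage * vcore

print(f"Conservative (35 A/stage): {vrm_headroom_w(22, 35):.0f} W")  # 847 W
print(f"Full rating  (70 A/stage): {vrm_headroom_w(22, 70):.0f} W")  # 1694 W
```

Even run at half their rating, 22 stages comfortably clear the 600w connector limit, which is the point: the VRM won't be the bottleneck on any of these cards.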

Doesn't look like NV is allowing any factory boosts above ~2580 either. I think every card is probably going to end up being similar OC performance.


----------



## yzonker

dr/owned said:


> PR: ASUS introduces GeForce RTX 4090/4080 ROG STRIX and TUF GPUs - VideoCardz.com
> 
> FE looks like it's probably 10+3 or something: NVIDIA GeForce News
> 
> "and upgrading to a 23-phase power supply"
> 
> Zotac Amp Extreme: 24+4 phase: Zotac Announces GeForce RTX 4090 and RTX 4080 in AMP Extreme AIRO, Trinity OC, and Trinity Editions
> 
> (also only the Strix says 70A power stages, but even if others are using 60A or 50A and operating them at 30A per stage, that's still >800W of power delivery easily. 22 70A stages is ridiculous overkill.)
> 
> Doesn't look like NV is allowing any factory boosts above ~2580 either. I think every card is probably going to end up being similar OC performance.


I suspect Asus overbuilt it for the 4090 Ti with 2x12-pin and maybe a 1200w XOC bios.

Yeah, boost clock is mostly based on power limit, and that's going to be about the same on them all given the 600w limit on the connector.

I suspect they are all intentionally holding back 2x12-pin (or maybe Nvidia decreed it so) to increase the gap to the 4090 Ti. This time, rather than separating the current model by connector number and power limit, they are saving it for later.


----------



## dr/owned

CableMod 12VHPWR cable delivered today.

a) They are using 2 ground pins straight from the PSU for the sense lines to "enable" 600W (thin black wires in photo). I'm not sure why dedicated wires instead of tee-ing into existing ground wires.
b) There are several merges of 2x wires into a single wire to the GPU. So for point a), if they're doing merges already, then why not a split for the sense lines?
c) I'm not sure why they're using 3x connectors on the PSU side. They have 12 pins available when they only need 6. Subtracting the 2 grounds for the sense lines, they have 12 +12V and 10 GND lines.

d) Uh, the 12VHPWR connector is a lot dinkier than I thought it would be. I wouldn't be surprised if the per-pin current limit is lower than the old PCIe connector's because of less contact surface area.
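Point d) can be put into numbers. A quick sketch comparing per-pin current on the new connector vs. the old 8-pin, assuming the commonly quoted figures (6 +12V pins / 600W for 12VHPWR, 3 +12V pins / 150W for 8-pin PCIe) and ignoring derating and contact resistance:

```python
# Per-pin current estimate: total power / 12 V / number of +12V pins.
# Pin counts and wattages are the commonly quoted figures, not
# measured values; real connectors are spec'd with safety margin.

def amps_per_pin(watts: float, power_pins: int, volts: float = 12.0) -> float:
    """Average current through each +12V pin, in amps."""
    return watts / volts / power_pins

print(f"12VHPWR @ 600 W:    {amps_per_pin(600, 6):.2f} A/pin")  # 8.33 A
print(f"8-pin PCIe @ 150 W: {amps_per_pin(150, 3):.2f} A/pin")  # 4.17 A
```

So the dinky little connector really is pushing roughly twice the current through each (smaller) pin, which is exactly why the contact area matters.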


----------



## dr/owned

yzonker said:


> I suspect Asus overbuilt it for the 4090 Ti with 2x12-pin and maybe a 1200w XOC bios.
> 
> Yea boost clock is mostly based on power limit and that's going to be about the same on them all given the 600w limit on the connector.
> 
> I suspect they are all intentionally holding back 2x12pin (or maybe Nvidia decreed it so) to increase the gap to the 4090 Ti. This time rather than separating the current model by connector number and power limit, they are saving it for later.


Honestly, I think you're right. They'll put out a 4090 Ti in 6 months with 2 power connectors and an 800W TDP, and that would explain the rumor of "the 4090 is going to have an 800W TDP" that we saw a few months ago. It was probably someone leaking an engineering sample testing "what if...."


----------



## yzonker

dr/owned said:


> Honestly I think you're right. They'll put out a 4090Ti in 6 months with 2 power connectors and a 800W TDP, and that would explain the rumor of "the 4090 is going to have 800W TDP" that we saw a few months ago. It was probably someone leaking an engineering sample testing "what if...."


One caveat I'll add that came to mind: if AMD falls on their ass and doesn't deliver a really competitive product, and 4090 sales are not stellar, I could see them delaying or never making a 4090 Ti. It's really unclear how much demand there will be. I noticed this poll over on Reddit (just took a screenshot); I was one of the yes votes.


----------



## Arizor

Optimus have confirmed their waterblocks are compatible with both STRIX and TUF, and are ready to roll once testing is done. Looks like they'll be substantially smaller than 30 series blocks.

Definitely think I'll go TUF this time, STRIX seems overkill for all intents and purposes.


----------



## tubs2x4

I wonder if the 240 AIO is going to be enough; 360 should really be the minimum. I hope there are some reviews on it.


----------



## yzonker

Arizor said:


> Optimus have confirmed their waterblocks are compatible with both STRIX and TUF, and are ready to roll once testing is done. Looks like they'll be substantially smaller than 30 series blocks.
> 
> Definitely think I'll go TUF this time, STRIX seems overkill for all intents and purposes.
> 
> View attachment 2574378
> 
> View attachment 2574379


That's interesting. So same PCB, just more VRM on the Strix. Nothing that interferes with the block. Similar to what they did with the 3090 Ti.


----------



## yzonker

tubs2x4 said:


> I wonder if the 240 AIO is going to be enough. 360 should really be the minimum. I hope there are some reviews on it.


Depends on what you consider to be enough. A 240 can work with 450W. My 3080 Ti @ 450W with the hybrid kit would hold the core around 60C at a moderate noise level (low 20s ambient). Quiet fans would let it creep to the mid 60s. Obviously 600W will push that quite a bit higher unless you put a lot of fan on it. I suspect performance will be similar to the air-cooled cards since the coolers are so big now.


----------



## Arizor

yzonker said:


> That's interesting. So same PCB, just more VRM on the Strix. Nothing that interferes with the block. Similar to what they did with the 3090 Ti.


Yeah I think that's pretty much spot on. Definitely cemented my decision to go TUF. I'm sure in extreme OC scenarios you'll see a difference, but outside of that I think it's pretty much chip lottery to see any meaningful difference.


----------



## Tyl3n0L

Rbk_3 said:


> Gaming OC up on Canada Computers for Preorder. I called to confirm and the stores actually already have it in stock in the back ready to go for the 12th
> 
> 
> 
> 
> 
> 
> GIGABYTE GeForce RTX 4090 GAMING OC 24G
> 
> 
> GIGABYTE GeForce RTX 4090 GAMING OC 24G Graphics Card, 3x WINDFORCE Fans, 24GB 384-bit GDDR6X, GV-N4090GAMING OC-24GD Video Card
> 
> 
> 
> 
> www.canadacomputers.com


I keep getting redirected to the index page. I assume the pre-orders are all filled up?


----------



## yzonker

Interesting, I was doing some simple ratios the other day to try to estimate how much power it might take to hold the voltage limit on the 4090. 

Assuming Nvidia sets the reference PL to a similar level of performance vs efficiency (which could be flawed of course). 

And knowing it takes a 600w or a little more to hold the voltage limit on a 3090 in very demanding benchmarks like TSE. 

350w reference PL for 3090. 

(600/350) = 1.71

For 450w 4090 reference limit, 

450 * 1.71 = 770w 

So maybe coincidence, but interesting it's so close to the rumored 800w card. 

So 600w sounds like a lot, but it may only be equivalent to ~450w on a 3090. (350 * 1.333)

Hopefully I'm wrong...
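Spelling that estimate out as a quick script (all inputs are the rumored/observed figures from this thread, not confirmed specs):

```python
# Rough estimate of the power needed to hold the voltage limit on a 4090,
# scaled from 3090 behavior. All inputs are thread speculation, not specs.

REF_PL_3090 = 350      # W, 3090 reference power limit
VLIMIT_3090 = 600      # W, observed draw holding the voltage limit in TSE
REF_PL_4090 = 450      # W, 4090 reference power limit

# Ratio of "power needed at the voltage limit" to the reference limit
ratio = VLIMIT_3090 / REF_PL_3090          # ~1.71

# Apply the same ratio to the 4090's reference limit
est_4090_vlimit = REF_PL_4090 * ratio      # ~771 W

# Conversely, 600 W on a 4090 is only 600/450 = 1.33x its reference limit,
# the equivalent of ~467 W on a 3090
equiv_3090 = REF_PL_3090 * (600 / REF_PL_4090)

print(f"estimated 4090 draw at voltage limit: {est_4090_vlimit:.0f} W")
print(f"600 W on a 4090 is roughly {equiv_3090:.0f} W in 3090 terms")
```

The whole thing hinges on Nvidia picking the reference limit at a similar point on the efficiency curve both gens, which is the flawed assumption called out above.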


----------



## Arizor

Dammit @yzonker my power bills won't like it if you're right


----------



## th3illusiveman

Anyone know how to get the Founders card in Canada? It's literally the only one that can fit in my case (12" max) - old CM690 Advance II.

My case limited me on 3080s to the FE/TUF/FTW3 (which I got) on the 3080 launch... with these beasts..... I'm screwed lol.


----------



## J7SC

yzonker said:


> Interesting, I was doing some simple ratios the other day to try to estimate how much power it might take to hold the voltage limit on the 4090.
> 
> Assuming Nvidia sets the reference PL to a similar level of performance vs efficiency (which could be flawed of course).
> 
> And knowing it takes a 600w or a little more to hold the voltage limit on a 3090 in very demanding benchmarks like TSE.
> 
> 350w reference PL for 3090.
> 
> (600/350) = 1.71
> 
> For 450w 4090 reference limit,
> 
> 450 * 1.71 = 770w
> 
> So maybe coincidence, but interesting it's so close to the rumored 800w card.
> 
> So 600w sounds like a lot, but it may only be equivalent to ~450w on a 3090. (350 * 1.333)
> 
> Hopefully I'm wrong...


I think 800W for max PL with the right custom-PCB card and the right conditions is certainly possible - future side-bar on transient spikes is sure to follow.

*FYI, stock vbios* of the 3090 Strix OC was 450W, w/an indicated > 500W w/ max PL slider on air before I mounted a w-cooler and flashed a 'higher limit' vbios... Also, quad PCIe 8-pins on 2x w-cooled 2080 Tis have been pumping 760W peak since late '18 via a single 1300W platinum PSU. So, where's my 4090 Ti?


Spoiler



3090 Strix OC stock

@th3illusiveman ...Best Buy Canada carried the RTX3k FEs (in addition to Asus etc) so likely a good spot for you to check once they're released.


----------



## dr/owned

On the connector power subject, looks like the max is 9.5A per pin, but that's assuming 75C ambient and a +30C connector temperature rise to stay in the operating range:










So if it's 45C ambient or something reasonable, you can shove 19A through each pin for a grand total of 1368W.



https://cdn.amphenol-cs.com/media/wysiwyg/files/documentation/datasheet/boardwiretoboard/bwb_minitek_pwr_cem_5_pcie.pdf
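That derating math works out like this (a sketch; the 9.5A and 19A per-pin figures are read off the Amphenol chart, and the six +12V supply pins are the 12VHPWR layout):

```python
# Power through a 12VHPWR-style connector at a given per-pin current rating.
# Current figures come from the thread's reading of the Amphenol datasheet;
# the 6 supply pins and 12 V rail are the 12VHPWR layout.

V_RAIL = 12.0          # volts on the supply pins
SUPPLY_PINS = 6        # +12V pins in the connector (the other 6 are ground)

def connector_power(amps_per_pin: float) -> float:
    """Total deliverable power given a per-pin current limit."""
    return V_RAIL * SUPPLY_PINS * amps_per_pin

# Derated spec figure: 9.5 A/pin at 75C ambient with a +30C rise allowed
print(connector_power(9.5))    # 684 W

# The "reasonable ambient" reading above: 19 A/pin
print(connector_power(19.0))   # 1368 W
```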


----------



## mirkendargen

dr/owned said:


> CableMod 12VHPWR cable delivered today.
> 
> a) they are using 2 ground pins straight from the PSU for the Sense lines to "enable" 600W (thin black wires in photo). I'm not sure why dedicated instead of tee-ing it in to existing ground wires.
> b) there's several merges of 2x wires into a single wire to the GPU . So for point a) if they're doing merges already then why not a split for the sense lines?
> c) I'm not sure why they're using 3x connectors on the PSU side. They have 12 pins available when they only need 6. Subtracting 2 grounds for the Sense lines they have 12 +12V, and 10 GND lines.
> 
> d) Uh the 12VHPWR connector is a lot more dinky than I thought it would be. I wouldn't be surprised if the per-pin current limit is lower than the old PCIe connector because of less surface area.
> 
> View attachment 2574374
> 
> View attachment 2574375


I figured they'd splice the 3rd 8pin across both the other two to balance it, if I'm seeing that picture correctly they just spliced half of the pins at random lol.


----------



## Blameless

CableMod does some strange stuff. Ordered from them once; didn't get anything defective, but they took sloppy shortcuts--silly stuff like not using all the ground pins available PSU side to save on wire/braiding--and charged triple what ModDIY would have for a better product.


----------



## mirkendargen

Blameless said:


> CableMod does some strange stuff. Ordered from them once; didn't get anything defective, but they took sloppy shortcuts--silly stuff like not using all the ground pins available PSU side to save on wire/braiding--and charged triple what ModDIY would have for a better product.


The ModDIY one is...interesting. They offer a 2x8pin and 3x8pin cable, but no pins are spliced in parallel, the 3x8pin one just doesn't use all the pins on the PSU connectors. I guess it's for people with rail load concerns or something and a PSU that actually has multiple rails to worry about.


----------



## cx-ray

Blameless said:


> CableMod does some strange stuff. Ordered from them once; didn't get anything defective, but they took sloppy shortcuts--silly stuff like not using all the ground pins available PSU side to save on wire/braiding--and charged triple what ModDIY would have for a better product.


When I first got Cablemod cables for an AX1600i I also noticed one missing ground connection on the PSU side for the 8-pin PCIE supply. After checking the original cables I saw that Corsair does the same. I think they might combine one of the sense pins with ground.


----------



## Rbk_3

Tyl3n0L said:


> I keep getting redirected to the index page. I assume the pre-orders are all filled up?


They took all their listings down for some reason.


----------



## ttnuagmada

Any news anywhere on how the FE is going to be sold? I see BB has it listed, but unsure if it's in-store only or not.


----------



## ArcticZero

Well this is good news. I recently purchased a Prime PX-1300 so this is certainly welcome.

Seasonic is giving free 12VHPWR cables to Prime and Focus PSU customers - KitGuru


----------



## WayWayUp

I have an EVGA platinum 1200w p2
Really don't want to upgrade, but I think eventually it will be the right move.
Annoyed by the change. It's one thing with constant upgrades to mobos, CPUs, GPUs, coolers, etc. But now we have DDR5 and a new PSU spec from scratch.

I might just do a full rebuild next year with 14900ks


----------



## dr/owned

mirkendargen said:


> The ModDIY one is...interesting. They offer a 2x8pin and 3x8pin cable, but no pins are spliced in parallel, the 3x8pin one just doesn't use all the pins on the PSU connectors. I guess it's for people with rail load concerns or something and a PSU that actually has multiple rails to worry about.


Seems ModDIY also uses 18AWG vs. CableMod's 16AWG. In the end it's something like 14A per wire vs. 18A, either of which is still a lot more than the GPU is going to pull per wire.


----------



## Rbk_3

WayWayUp said:


> I have an EVGA platinum 1200w p2
> really dont want to upgrade but think eventually it will be the right move
> annoyed by the change. It's one thing with constant upgrades to mobos, cpus, gpus, coolers, ext. But now we have ddr5 and new psu spec from scratch
> 
> I might just do a full rebuild next year with 14900ks


There is no reason for you to upgrade. I just grabbed the 1600 P2.


----------



## J7SC

ArcticZero said:


> Well this is good news. I recently purchased a Prime PX-1300 so this is certainly welcome.
> 
> Seasonic is giving free 12VHPWR cables to Prime and Focus PSU customers - KitGuru


Nice ! I bought two PX-1300 recently, so that move by Seasonic is good news.


----------



## mirkendargen

https://www.corsair.com/us/en/Categories/Products/Accessories-%7C-Parts/PC-Components/Power-Supplies/600W-PCIe-5-0-12VHPWR-Type-4-PSU-Power-Cable/p/CP-8920284



Corsair cable is in stock now.


----------



## yzonker

mirkendargen said:


> https://www.corsair.com/us/en/Categories/Products/Accessories-%7C-Parts/PC-Components/Power-Supplies/600W-PCIe-5-0-12VHPWR-Type-4-PSU-Power-Cable/p/CP-8920284
> 
> 
> 
> Corsair cable is in stock now.


Thanks for the heads up. Ordered.


----------



## Arizor

I've ordered the cablemod 3 x PCIe4 to PCIe5 already, do we think it'll pretty much be of the same standard as the official Corsair (i.e. good for 600W)?


----------



## mirkendargen

Arizor said:


> I've ordered the cablemod 3 x PCIe4 to PCIe5 already, do we think it'll pretty much be of the same standard as the official Corsair (i.e. good for 600W)?


I'm sure anything will be fine as long as the connection quality on the pins isn't bad.


----------



## Arizor

Thanks @mirkendargen


----------



## 8472

Are the cablemod 12+4 pin cables compatible with all corsair power supplies? They don't distinguish between black label and yellow label like they do for the rest of their cables. 



Arizor said:


> Optimus have confirmed their waterblocks are compatible with both STRIX and TUF, and are ready to roll once testing is done. Looks like they'll be substantially smaller than 30 series blocks.
> 
> Definitely think I'll go TUF this time, STRIX seems overkill for all intents and purposes.
> 
> View attachment 2574378
> 
> View attachment 2574379


Any idea about a release date?


----------



## Arizor

8472 said:


> Are the cablemod 12+4 pin cables compatible with all corsair power supplies? They don't distinguish between black label and yellow label like they do for the rest of their cables.
> 
> 
> 
> Any idea about a release date?


That's a good question, I submitted a ticket asking, so hopefully can update the thread soon.

Optimus reckons they'll be able to release the Waterblock 4-6 weeks after launch.

They're not taking preorders anymore (after previous generation debacles), so will email previous customers first before announcing anything. Looking forward to blocking up my TUF!


----------



## Malinkadink

J7SC said:


> Nice ! I bought two PX-1300 recently, so that move by Seasonic is good news.


Nice, I got a FOCUS-1000 last year so I'll be set. Will suck for 4090 buyers needing to wait on the cable after already getting the card at launch lol


----------



## dr/owned

8472 said:


> Are the cablemod 12+4 pin cables compatible with all corsair power supplies? They don't distinguish between black label and yellow label like they do for the rest of their cables.


Probably. There is no difference between type 3 and type 4 cables re: pcie. AFAIK the only difference is capacitors in the 24 pin so I think only the cable kits that come with a 24 pin have the Corsair label distinction.


----------



## bmagnien

Official FE adapter:


----------



## mirkendargen

bmagnien said:


> Official FE adapter:
> View attachment 2574536


The other end would be the more interesting part, like do they actually populate all the pins in all four 8pins.


----------



## Krzych04650

Unboxings went up today. FE looks nice but basically all other cards are absolutely preposterous, maybe except for AIO ones. Like *** is this









I'd like to get the FE even though I am going to go water anyway; it is a real piece of art and everything else just looks absolutely idiotic next to it. But waterblock compatibility may be a bit hit and miss, it does not have dual BIOS, and it may have lower power limits than others. Some cards also have 5 ports with 3x DP and 2x HDMI instead of 3x and 1x.

Reviews go up on the 11th at the usual time, btw. Although supposedly AIB reviews go up on the 13th, a day after availability.


----------



## Arizor

Yep link to a solid unboxing overview NVIDIA GeForce RTX 4090 FE + 5x Custom Design Unboxing


----------



## Rbk_3

I am getting the Gaming OC.


----------



## WayWayUp

Are the OC cards binned at all, even slightly?
For instance there will be a Strix and a Strix OC, which comes with 90MHz of additional boost. Is this just a scam for noobs who are scared to overclock themselves and will pay a premium for the higher boost? Or is there any merit to these cards?

If my memory serves me correctly from the past there was no difference


----------



## dr/owned

I realize I can't run any card that has a triple slot bracket (including FE) because my motherboard only has 2-slot spacing for the GPU and I need all my other slots for soundcard / NIC / ssd.



WayWayUp said:


> are the OC cards binned at all, even slightly?
> For instance there will be a strix and a strix OC which comes with 90 additional boost. Is this just a scam for the noobs who are scared to overclock by themselves and just pay a premium for the higher boost? Or is there any kind of merit to these cards?
> 
> If my memory serves me correctly from the past there was no difference


Probably, but they'll also likely never sell the non-OC version of the cards. It's a way for them to mark up $50-$100.


----------



## WayWayUp

dr/owned said:


> I realize I can't run any card that has a triple slot bracket (including FE) because my motherboard only has 2-slot spacing for the GPU and I need all my other slots for soundcard / NIC / ssd.


can you try using a riser and changing the gpu orientation ? or does your case not accommodate that
also have you considered an aio? i know you will need to deal with a radiator but the gpu footprint is much smaller


----------



## mirkendargen

WayWayUp said:


> are the OC cards binned at all, even slightly?
> For instance there will be a strix and a strix OC which comes with 90 additional boost. Is this just a scam for the noobs who are scared to overclock by themselves and just pay a premium for the higher boost? Or is there any kind of merit to these cards?
> 
> If my memory serves me correctly from the past there was no difference


They're binned in that they are validated to run at those higher frequencies.

They're not binned in that 99.999% of chips that can run at the lower frequencies can also run at those higher frequencies.

+1 for it just being a way to mark up the price. I guarantee you'll never see a non-OC Strix for sale anywhere.


----------



## WayWayUp

How reliable is the Geekbench score for estimating raster performance? I read that last gen it was spot on for predicting the gen-over-gen uplift.

If so, the 4090 is 67% faster than the 3090 Ti, so at least 80% faster than the 3090, the card it's replacing.
That would put the 4090 Ti at around 100% faster than the 3090, which is my current card.
What a remarkable gen-over-gen increase.
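Compounding those percentages looks like this (the ~10% gap between 3090 Ti and 3090 is my assumed figure, not from Geekbench):

```python
# Compounding the claimed uplifts. The 67% figure is the Geekbench number
# from the thread; the ~10% 3090 Ti over 3090 gap is an assumed typical value.

uplift_4090_vs_3090ti = 1.67
gap_3090ti_vs_3090 = 1.10     # assumption

uplift_4090_vs_3090 = uplift_4090_vs_3090ti * gap_3090ti_vs_3090
print(f"4090 vs 3090: {(uplift_4090_vs_3090 - 1) * 100:.0f}% faster")
```

Multiplying the ratios rather than adding the percentages is what pushes "67% over the 3090 Ti" past 80% over the 3090.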


----------



## WayWayUp

nvidia just released this

What a ridiculous gap between the 4080 and 4090. Almost 40% uplift.
I wonder if it's actually CPU bound.


----------



## 8472

WayWayUp said:


> View attachment 2574559
> 
> 
> nvidia just released this
> 
> what a ridiculous gap between the 4080 and 4090. almost 40% uplift
> I wonder if its cpu bound actually


I'm guessing that they left out the 3090ti because it would have been too close to the 4080 12gb.


----------



## Baasha

Did you guys see this:










The Asus is an absolute monster. I'm hoping it will fit my Corsair Carbide Air 540, which currently houses my RTX 3090 Ti FTW3 Ultra.


----------



## dr/owned

WayWayUp said:


> can you try using a riser and changing the gpu orientation ? or does your case not accommodate that
> also have you considered an aio? i know you will need to deal with a radiator but the gpu footprint is much smaller


I mean I need access to all of the slots on the case. I'm going to be on a waterblock anyways so the thing will be 1.5-slot but if it still needs 3 mounting slots for the IO bracket that's not going to work.

So far it only looks like the FE and Suprim and all Zotac (Airo + Trinity) use 3 slots. Asus and Gigabyte use 2 even for their Strix and Master versions.


----------



## 8472

Looks like early to mid November for the Optimus block. 


__ https://twitter.com/i/web/status/1577031958621671424


----------



## yzonker

dr/owned said:


> I mean I need access to all of the slots on the case. I'm going to be on a waterblock anyways so the thing will be 1.5-slot but if it still needs 3 mounting slots for the IO bracket that's not going to work.
> 
> So far it only looks like the FE and Suprim and all Zotac (Airo + Trinity) use 3 slots. Asus and Gigabyte use 2 even for their Strix and Master versions.


Thinking outside the box, would it be possible to use a different bracket? Either make one or modify one from another card.


----------



## ArcticZero

8472 said:


> Looks like early to mid November for the Optimus block.
> 
> 
> __ https://twitter.com/i/web/status/1577031958621671424


Nice. Hoping I manage to get one, not being a previous Optimus GPU block owner and all, so I will have to wait for the actual announcement.


----------



## dr/owned

yzonker said:


> Thinking outside the box, would it be possible to use a different bracket. Either make one or modify one from another card?


Yeah anything is possible with an angle grinder  , but I'll probably just narrow my selection down to a 2 slot bracket card. Zotac: no, MSI: no, FE: no.

In talking this out I also realized my "I'll just run on air until I get a waterblock" strategy is probably also "nope". The air coolers on all these cards are at least 3.5 slots wide, and that would really not work.

Pic for the boyz:



















I'm probably going to have to do some NVME->USB risers to move my sound card off the x1 slot, move my P5800X down to the lowest slot, and run 2.5gbe instead of 10gbe. I realllllly need to check when Sapphire Rapids is supposed to launch so I can get out of this mainstream "let's have 10 NVME ports but no actual PCIe slots" BS. (EDIT: for the few that actually read this far, it's not until like March 2023 with the w7-2495X)


----------



## 8472

ArcticZero said:


> Nice. Hoping I manage to get one, not being a previous Optimus GPU block owner and all so will have to wait for the actual announcement.


Do returning customers get first dibs? 

I'll be a first time customer as well. I currently have an EKWB and am definitely looking to move to a different brand.


----------



## dr/owned

8472 said:


> Do returning customers get first dibs?
> 
> I'll be a first time customer as well. I currently have an EKWB and am definitely looking to move to a different brand.


Alphacool, Heatkiller, Aquacomputer would be on my list ahead of paying $500 to Optimus. Second tier there's Bitspower, Phanteks, Corsair. (I'm waiting to see how fast Barrow is putting blocks out this gen).

The way Optimus describes their blocks pisses me off... like no, you guys aren't using some magical material, it's f-ing acrylic the same as everyone else. "We CNC cut our threads because it's the best way." Yeah, I've done over a thousand G1/4 fittings on every brand out there: never had a problem.


----------



## WayWayUp

Hmm, interesting.
I purchased an Optimus block for my 3090. Even shunted, the performance is amazing. I believe I used liquid metal, but I'm not sure how much difference it made.
I maxed out the Time Spy stress test at 41C on my core.

I don't remember the temps I would get when heavy overclocking for benchmarks; I'm sure they were higher since I was pushing the card hard, but if they were it wouldn't have been by much. Again, that's shunted with an altered BIOS. So safe to say Optimus is a cut above the rest.

My only issue is I don't want the 4090; I'm waiting for the Ti variant, which should hopefully be a few months out. Do you think they will share the same waterblock?


----------



## WayWayUp

Last gen they were all similar except for Optimus, which I believe Gamers Nexus said had about a 10C better delta. That was on his 3090.


----------



## ArcticZero

8472 said:


> Do returning customers get first dibs?
> 
> I'll be a first time customer as well. I currently have an EKWB and am definitely looking to move to a different brand.


Actually I am not even sure where I read that from anymore, but I would be glad to be wrong in this case. I would certainly love to be able to buy one on release. Also an EKWB refugee here.



dr/owned said:


> Alphacool, Heatkiller, Aquacomputer would be on my list ahead of paying $500 to Optimus. Second tier there's Bitspower, Phanteks, Corsair. (I'm waiting to see how fast Barrow is putting blocks out this gen).
> 
> The way optimus describes their blocks pisses me off...like no you guys aren't using some magical material it's f-ing acrylic the same as everyone else. "we CNC cut our threads because it's the best way". Yeah I've done over a thousand g1/4 fittings on every brand out there: never had a problem.


The Alphacool one indeed looks amazing as well. That is also on my shortlist if the Optimus block proves to be overly expensive. Granted I have zero complaints about their AM4 Foundation copper block. Absolutely spotless and performing excellently to this day.


----------



## mirkendargen

dr/owned said:


> Yeah anything is possible with an angle grinder  , but I'll probably just narrow my selection down to a 2 slot bracket card. Zotac: no, MSI: no, FE: no.
> 
> In talking this out I also realized my "I'll just run on air until I get a waterblock" strategy is probably also "nope". The air coolers on all these cards are at least 3.5 slot wide and that would really not work.
> 
> Pic for the boyz:
> 
> View attachment 2574590
> 
> 
> View attachment 2574592
> 
> 
> I'm probably going to have to do some NVME->USB risers to move my sound card off the x1 slot, move my P5800X down to the lowest slot, and run 2.5gbe instead of 10gbe. I realllllly need to check when Sapphire Rapids is supposed to launch so I can get out of this mainstream "let's have 10 NVME ports but no actual PCIe slots" BS. (EDIT: for the few that actually read this far, it's not until like March 2023 with the w7-2495X)


On 3090's, I think I recall the FE blocks came with double slot brackets to replace the triple slot one. I have a Strix that was already double slot so I didn't pay a ton of attention, but that may be a thing again this gen.


----------



## J7SC

Once I figure out what to get (Strix ? 4090 or Ti, 7900xt or 7990XT) I'll see what is actually available in terms of GPU blocks for the specific PCB, along w/pricing...My 3090 Strix / Phanteks has performed flawlessly so far (spoiler below), but I also like Alphacool, Aquacomputer and a few others. Oddly enough, the Optimus thread at OCN plus other 'verbiage' has turned me off that one, along w/ their pricing. 

I am also wondering what 'factory full waterblock models' will be available for the RTX4K lines (like several full w-block models were for the 3090s and 2080 Tis). 


Spoiler


----------



## Arizor

J7SC said:


> Once I figure out what to get (Strix ? 4090 or Ti, 7900xt or 7990XT)


Come on @J7SC , when it comes to you we all know the eventual answer will be just "all of the above"


----------



## J7SC

Arizor said:


> Come on @J7SC , when it comes to you we all know the eventual answer will be just "all of the above"


You're not helping my budget resolve at all...


----------



## Mad Pistol

I may just go ahead and bite the bullet on this one.


----------



## Arizor

Mad Pistol said:


> I may just go ahead and bite the bullet on this one.


I'll certainly be bending over and biting the pillow when prices are announced in Australia...


----------



## Antsu

Nothing like spending 2600€ on a card in the EU, just to get ****ed by the 4090 Ti a few months later. And if you think you're smart about it and just wait for the 4090 Ti, with my luck it will never be released. What a time to be alive. The 3090 Ti was kind of useless if you had a nicely set up 3090 without a power limit; it was only 2.5% more CUDA cores. Now they've conveniently left ~11% of the CUDA cores on the table and probably intentionally forced all AIBs to use 1x 16-pin only, so they can be sure they'll be able to milk you twice. Absolutely disgusting.


----------



## Mad Pistol

Now I've just got to figure out if my current trusty EVGA 750-watt G2 is enough to power a 4090 and 5800x3D.

When gaming, my battery backup says my current 5800X3D/3080 setup pulls around 480-500 watts. If that's the case, I have headroom for another 150 watts pretty easily.

The unintended advantage of a 5800x3D... it's actually pretty energy efficient for being such a little beast of a gaming CPU.
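One wrinkle in that headroom math: a UPS reads wall-side draw, which includes the PSU's conversion losses, so the actual DC load is a bit lower than the reading. A quick sketch (the ~90% efficiency is my assumption for a Gold unit at this load, not a measured value):

```python
# Back-of-envelope PSU headroom check. The wall reading from a UPS includes
# PSU conversion losses, so convert to DC load first. Efficiency is assumed.

PSU_RATING = 750        # W, DC output rating of the EVGA G2
WALL_DRAW = 500         # W, measured at the UPS while gaming
EFFICIENCY = 0.90       # assumed ~90% at this load for a Gold-rated unit

dc_load = WALL_DRAW * EFFICIENCY          # W actually drawn from the PSU
headroom = PSU_RATING - dc_load
print(f"current DC load: {dc_load:.0f} W, headroom: {headroom:.0f} W")
```

So the real headroom against the 750W rating is likely a little better than the wall number suggests, though transient spikes eat into it.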


----------



## dr/owned

Antsu said:


> Nothing like spending 2600€ on a card in the EU, just to get ****ed by the 4090 Ti a few months later. And if you think you're smart about it and just wait for the 4090 Ti, with my luck it will never be released. What a time to be alive. The 3090 Ti was kind of useless if you had a nicely set up 3090 without a power limit; it was only 2.5% more CUDA cores. Now they've conveniently left ~11% of the CUDA cores on the table and probably intentionally forced all AIBs to use 1x 16-pin only, so they can be sure they'll be able to milk you twice. Absolutely disgusting.


I'd be more concerned by the 4080Ti coming out in 6 months than a 4090Ti. The 4090Ti would only be marginal vs. the 4090 same as the 3090Ti vs 3090 because the die is already mostly full-fat. OTOH there's a huge gap between the 4080-16 and the 4090 the 4080Ti can fill in. I think it's a back-pocket item depending on AMD's results. If they're in a bad way they can drop MSRP of the 4080's by 200 and make a $1300 4080Ti with 90% of the 4090 performance.

If I'm being a speculator: I think the 4090 was supposed to be $1999 6 months ago and the 4080Ti was supposed to slot in at $1500. But they had to price-compress it and now a theoretical 4080Ti doesn't have much room to squeeze in without the 4080-16 getting a price cut.


----------



## dante`afk

dr/owned said:


> CableMod 12VHPWR cable delivered today.
> 
> a) they are using 2 ground pins straight from the PSU for the Sense lines to "enable" 600W (thin black wires in photo). I'm not sure why dedicated instead of tee-ing it in to existing ground wires.
> b) there's several merges of 2x wires into a single wire to the GPU . So for point a) if they're doing merges already then why not a split for the sense lines?
> c) I'm not sure why they're using 3x connectors on the PSU side. They have 12 pins available when they only need 6. Subtracting 2 grounds for the Sense lines they have 12 +12V, and 10 GND lines.
> 
> d) Uh the 12VHPWR connector is a lot more dinky than I thought it would be. I wouldn't be surprised if the per-pin current limit is lower than the old PCIe connector because of less surface area.
> 
> View attachment 2574374
> 
> View attachment 2574375



good luck









CableMod PCI-E cable connector melted in my RTX3090 :(


Pretty much like the title says... I was playing a game, and suddenly the TV cut off (no signal). After a reboot, I entered the game again, but nothing seemed off about reported video card power consumption. 330W didn't seem odd, but in retrospect I have to find it curious that it was no longer...




www.overclock.net


----------



## th3illusiveman

Mad Pistol said:


> I may just go ahead and bite the bullet on this one.


Going to be interesting to see where the X3D sits when reviewers redo their CPU tests with the 4090 at 1080p lol. It's our little champ next to these monstrously clocked DDR5-platform beast CPUs.


----------



## Antsu

dr/owned said:


> I'd be more concerned by the 4080Ti coming out in 6 months than a 4090Ti. The 4090Ti would only be marginal vs. the 4090 same as the 3090Ti vs 3090 because the die is already mostly full-fat. OTOH there's a huge gap between the 4080-16 and the 4090 the 4080Ti can fill in. I think it's a back-pocket item depending on AMD's results. If they're in a bad way they can drop MSRP of the 4080's by 200 and make a $1300 4080Ti with 90% of the 4090 performance.
> 
> If I'm being a speculator: I think the 4090 was supposed to be $1999 6 months ago and the 4080Ti was supposed to slot in at $1500. But they had to price-compress it and now a theoretical 4080Ti doesn't have much room to squeeze in without the 4080-16 getting a price cut.


My point was indeed that it's no longer as marginal as the 3090 vs 3090 Ti, where only 2.5% of the SMs were disabled vs ~11% now. I guess I will just have to bite the bullet and hope the AMD cards aren't strong enough for Nvidia to panic and release the Ti.


----------



## J7SC

...what I am mostly interested in for now is DLSS 3. Perhaps there will yet be a version for RTX3K even if it is slower due to additional chip features in RTX4K, and/or AMD upgrades their 'open' versions of similar apps. 

Apart from that, I regret that they killed SLI/Crossfire - going from 1x to 2x GPU was always a nice upgrade w/o creating unused tech for the display $helf. I know about microstutter and dev requirements for profiles, but there were solutions such as SLI/NVLink CFR ('checkerboard drivers') which did DX12, RTX and all that w/o microstutter...

Anyway, time to wait and see which company comes up with what and when for how much...in the meantime, my 3090 actually hasn't got any slower since RTX4K announcements


----------



## Glottis




----------



## stefxyz

So according to Igor's Lab we will have cool and silent cards. The coolers were designed to cool 600 watts, but Nvidia switched to the more efficient 4nm TSMC process late in development and these cards only consume 450W. It was too late for the board partners to change.

He also said there's very little headroom for OC via power increases without a voltage mod.


----------



## yzonker

Here's the article, 









NVIDIA GeForce RTX 4090 - Wo der Irrtum mit den 600 Watt wirklich herkommt und warum die Karten so riesig sind | igor´sLAB


Gestern durften wir zwar die ersten eigenen Bilder von NVIDIAs kommender Grafikkarten-Generation veröffentlichen, müssen jedoch bis zum festgelegten Launchtag noch über weitere Details bis hin zur…




www.igorslab.de





This statement in particular, 

"You should very quickly come across the voltage limits for the VDDC here, which should actually make an increase to values over 500 watts superfluous." 

So this flies in the face of the notion that we would need 800W, unless I suppose Nvidia actually sets the voltage limit higher on a future card.


----------



## dante`afk

so, shuntmodding and voltmodding again to get 600+


got it


----------



## ArcticZero

yzonker said:


> Here's the article,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA GeForce RTX 4090 - Wo der Irrtum mit den 600 Watt wirklich herkommt und warum die Karten so riesig sind | igor´sLAB
> 
> 
> Yesterday we were allowed to publish the first pictures of NVIDIA's upcoming graphics card generation ourselves, but we still have to hold back further details until the official launch day...
> 
> 
> 
> 
> www.igorslab.de
> 
> 
> 
> 
> 
> This statement in particular,
> 
> "You should very quickly come across the voltage limits for the VDDC here, which should actually make an increase to values over 500 watts superfluous."
> 
> So this flies in the face of the notion we would need 800w unless I suppose if Nvidia actually set the voltage limit higher on a future card.


Looks like more shunt modding then. Perfect. At least this time I actually know how to solder and have proper equipment unlike on my 3090 which was literally the first thing I'd ever put an iron to.

And glad I can just skip the crappy silver paint this time. 😅



Spoiler


----------



## yzonker

ArcticZero said:


> Looks like more shunt modding then. Perfect. At least this time I actually know how to solder and have proper equipment unlike on my 3090 which was literally the first thing I'd ever put an iron to.
> 
> And glad I can just skip the crappy silver paint this time. 😅
> 
> 
> 
> Spoiler


I'm confused how that article changes what you're going to do. The only way you would need to shunt mod is if you volt mod, but that would have been even more true if the card needed 600w+ at the stock voltage limit.


----------



## RetroWave78

ArcticZero said:


> Looks like more shunt modding then. Perfect. At least this time I actually know how to solder and have proper equipment unlike on my 3090 which was literally the first thing I'd ever put an iron to.
> 
> And glad I can just skip the crappy silver paint this time. 😅
> 
> 
> 
> Spoiler


Bring temps down and undervolt. I'm running 2GHz core @ .938v on my 3090 FE under a water block, ~45C load. Less voltage = less wattage required to sustain a given frequency. I've never, ever been an advocate of increasing GPU core voltage. I tried it with the 780 Ti over a decade ago and the returns were not worth the increase in wattage or heat.
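
As a rough illustration of why undervolting pays off, here's a back-of-the-envelope sketch assuming dynamic power dominates (P ≈ C·V²·f). The 1.05 V @ 1.95 GHz reference point is an assumed stock-ish value for illustration, not a measured GA102 figure:

```python
# Toy CMOS dynamic-power model: P scales roughly with V^2 * f.
# The 1.05 V / 1.95 GHz reference is an assumed stock-ish operating
# point, not a measured value.
def relative_power(v, f, v_ref=1.05, f_ref=1.95):
    """Power draw relative to the reference operating point."""
    return (v / v_ref) ** 2 * (f / f_ref)

# The undervolt quoted above: 0.938 V @ 2.0 GHz.
ratio = relative_power(0.938, 2.0)
print(f"~{ratio:.0%} of reference power")  # roughly 82%
```

So even with a slight clock bump, the voltage drop alone cuts power draw by close to a fifth under this toy model.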


----------



## RetroWave78

mirkendargen said:


> On 3090's, I think I recall the FE blocks came with double slot brackets to replace the triple slot one. I have a Strix that was already double slot so I didn't pay a ton of attention, but that may be a thing again this gen.


I believe EKWB's currently available 4090 FE block is single slot. If you noticed EKWB has shifted their focus to SFF, their recent CPU block + reservoir + pump combo is neat.


----------



## coelacanth

Some of these 4090s are comically large.


----------



## ArcticZero

yzonker said:


> I'm confused on how that article changes what you are going to do? Only way you would need to shunt mod is if you volt mod, but that would have been even more true if the card needed 600w+ at the stock voltage limit.


Whoops, my bad, I swear I wrote I would be volt modding too but I guess I was reading the previous reply. 😅

On the shunt modding I just meant that this time I'm familiar with soldering, so it won't be an issue and I won't need to use silver paint, whereas with the 3090 I had never done it before.



RetroWave78 said:


> Bring temps down and undervolt. I'm running 2GHz core @ .938v on my 3090 FE under water block, ~45C load. Less voltage = less wattage required to sustain a given freq. I've never, ever been an advocate of increasing GPU core voltage. Tried it with 780 Ti over a decade ago and the returns were not worth the increase in wattage or heat.


Yep I already run a custom curve on my current GPU and know how effective it is for Ampere at least.


----------



## dr/owned

stefxyz said:


> So according to IhorsLab we will have cold and silent cards. The cooler were designed to cool 600 watts but Nvidia switched to more efficient 4nm tsmc process last minute and these cards only consume 450w . It was too late for the board partners to change.
> 
> also he said very small headroom for oc die to power increase without voltage mod.


AIBs can redesign coolers faster than a silicon design can be re-validated for a new process. You don't just switch over a weekend. There are also fab components that need to change that have many-month cycle times. IgorsLab is known for making **** up ever since the 3000 series...


----------



## Glottis

stefxyz said:


> So according to IhorsLab we will have cold and silent cards. The cooler were designed to cool 600 watts but Nvidia switched to more efficient 4nm tsmc process last minute and these cards only consume 450w . It was too late for the board partners to change.
> 
> also he said very small headroom for oc die to power increase without voltage mod.


All the memeing aside, cooler size is the one thing I don't mind about the new cards. It means they will run cool and quiet. I have a normal-sized mid tower case, so for me it's actually beneficial that the space inside my case is used up by a larger but quieter GPU. In the past we had 2-slot cards with tiny fans that emitted a high-pitched whine; it used to be so annoying.


----------



## mirkendargen

dr/owned said:


> AIB's can redesign coolers faster than a silicon design can be re-validated for a new process. You don't just switch over a weekend. There's also fab components that need to change to that have many-month cycle times. IgorsLab is know for making **** up after the 3000 series...


100% agreed, and in addition to that buying capacity at the fab isn't an overnight deal.

But all that said, Nvidia could have just not told board partners till the last minute, and that's why the FE cooler is a lot smaller than others (but still huge).


----------



## yzonker

dr/owned said:


> AIB's can redesign coolers faster than a silicon design can be re-validated for a new process. You don't just switch over a weekend. There's also fab components that need to change to that have many-month cycle times. IgorsLab is know for making **** up after the 3000 series...


Well, we'll find out Tuesday probably. I'm hoping he's right. There were endless people unhappy with the 30 series cards being power limited. It would be a nice change if they could actually hold the stock power limit, at least in most games/benches.


----------



## yzonker

Lol. This is why I said MOST games/benches in the last post. 









NVIDIA GeForce RTX 4090 spotted reaching 3.0 GHz and 616 Watts with GPU stress tool - VideoCardz.com


NVIDIA RTX 4090 at 600W and 3.0 GHz NVIDIA’s new flagship GPU has now been spotted running at 3.0 GHz, and that’s with GPU stress tool. The RTX 4090 featuring NVIDIA’s newest Ada Lovelace architecture can easily run at very high clocks, much higher than previous architecture. NVIDIA themselves...




videocardz.com


----------



## J7SC

It seems that the actual PCB length is between ~ 200mm and 220mm for RTX4K, much shorter than RTX3K. I still think (custom) water-cooling makes a lot of sense, especially for anything over 300W and *with a boost algorithm linked to temps*. Besides, if you build the w-cooling right, the sound of silence


----------



## dr/owned

J7SC said:


> It seems that the actual PCB length is between ~ 200mm and 220mm for RTX4K, much shorter than RTX3K. I still think (custom) water-cooling makes a lot of sense, especially for anything over 300W and *with a boost algorithm linked to temps*. Besides, if you build the w-cooling right, the sound of silence


Yes, I call the heatsink c*** extensions because the PCB is only a little bigger than the slot itself (although they have started to get ridiculous on the PCB height now, making it more of a square vs. rectangle) and the rest of the card is free-hanging fin stack. Ironically these cards might be decent in SFF cases with a waterblock.



yzonker said:


> Well we'll find out Tuesday probably. I'm hoping he's right. There was endless people unhappy with the 30 series cards being power limited. Would be a nice change if they could actually hold the stock power limit at least in most games/benches.


I just hope we get a power slider that actually does something this time around. It was a bit stupid that there were no stock 500W 3090's at all. "oh would you like 425W or the "extreme" 450W??"

Side bar I probably need to go dig for the shunt resistors and power resistors to bypass fuses. I don't remember how many leftovers I have...


----------



## dr/owned

Ordered a resupply, I'm assuming they're still going with 5mOhm shunt resistors and 1206 size fuses:
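
For anyone new to the mod, the arithmetic is just parallel resistance. A sketch, where the 5 mOhm stock value is the assumption from the post above:

```python
# Shunt mod math: the card measures current from the voltage drop across
# a tiny shunt resistor (I = V_drop / R). Soldering a second resistor in
# parallel lowers the effective resistance, so the card under-reports
# current and allows proportionally more real power.
def parallel(r1, r2):
    return r1 * r2 / (r1 + r2)

stock = 0.005                      # 5 mOhm stock shunt (assumed)
modded = parallel(stock, 0.005)    # stack another 5 mOhm on top
headroom = stock / modded          # real watts per reported watt
print(f"{modded*1000:.2f} mOhm effective -> {headroom:.1f}x power headroom")
```

Stacking an equal-value resistor halves the sensed resistance, so the card permits roughly double the real power at the same reported limit.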


----------



## yzonker

dr/owned said:


> Ordered a resupply, I'm assuming they're still going with 5mOhm shunt resistors and 1206 size fuses:
> 
> View attachment 2574698


Actually I think some may have switched to 2 mOhm if I'm looking in the right place. Can't tell on the Founders, but the EVGA card was the same when I looked. 









MSI GeForce RTX 3090 Ti Suprim X Review


The MSI GeForce RTX 3090 Ti is an impressive quad-slot graphics card. In our review, we found that it runs quieter than other RTX 3090 Ti tested today. MSI also included a large factory overclock with their card, which handles 4K60 with ease.




www.techpowerup.com


----------



## dr/owned

yzonker said:


> Actually I think some may have switched to 2 mOhm if I'm looking in the right place. Can't tell on the Founders, but the EVGA card was the same when I looked.


_eyeroll_ yeah looks like the 3090Ti FE used 2mOhm:










On the 4090 you can't make it out, but it looks like maybe 3 shunts instead of 6 (let's pray they calmed down on putting a shunt on every little rail):


----------



## WayWayUp

what do you guys think of the 4080 16gb benchmark leak?

I'm sure the drivers are not optimized; Nvidia usually releases day-1 drivers.
Anyway, with stock settings:





























These are some great results
For reference:

3080 scores around release:

port royal was around ~11k
timespy extreme graphics were around ~9k
Fire Strike Ultra was at or just under ~11k

Now we already know the 4090 has a huge performance gap over the 4080, so you can deduce a lot of info from this. Excited to see some overclocked results on good drivers.


----------



## Martin778

Not enough info, this could've even been an LN2 OC run.


----------



## Glottis

Corsair EU site has 600W PCIe 5.0 12VHPWR cable in stock.


----------



## bmagnien

Martin778 said:


> Not enough info, this could've even been an LN2 OC run.


It’s chilled water.


----------



## RetroWave78

yzonker said:


> Lol. This is why I said MOST games/benches in the last post.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA GeForce RTX 4090 spotted reaching 3.0 GHz and 616 Watts with GPU stress tool - VideoCardz.com
> 
> 
> NVIDIA RTX 4090 at 600W and 3.0 GHz NVIDIA’s new flagship GPU has now been spotted running at 3.0 GHz, and that’s with GPU stress tool. The RTX 4090 featuring NVIDIA’s newest Ada Lovelace architecture can easily run at very high clocks, much higher than previous architecture. NVIDIA themselves...
> 
> 
> 
> 
> videocardz.com


From the comment section, it looks like Timespy, 49C @ 3GHz @ 407w, but this isn't saying much since we have no idea whether the bench had just started, what the fan speed or ambient was, and the Timespy load varies; e.g. in the section of the bench that's like an aerospace museum, my clocks dip from 2000 MHz to 1960 MHz on my 3090 FE with an undervolt @ 400w.

I remember seeing elsewhere 4090 running Cyberpunk 2077 @ 56C core @ 2850 MHz with DLSS 3.0

Either way, if core stays under 65C @ 3GHz and the fan isn't obnoxious, I may cancel my EKWB water block pre-order and just keep it on air. The cooler is a work of art to top it off and GN's recent 4090 FE vapor chamber cross section analysis with NV engineer was impressive. Also, if I need to troubleshoot I don't need to drain the entire loop to pull the GPU.










Also, just as I predicted in an earlier post (nearly exactly) AD-102 is clocked right at the power efficiency curve (2.65 GHz pulling 425w and 3GHz requiring 616w). That's nearly 50% increase in power for 10% more freq. 

You watch, Lovelace is going to love undervolting, just like Ampere.
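
For the record, the exact numbers, taking the leaked 425 W @ 2.65 GHz and 616 W @ 3.0 GHz figures at face value:

```python
# How much extra power the leaked figures imply per unit of frequency.
p1, f1 = 425.0, 2.65   # watts, GHz (reported stock-ish point)
p2, f2 = 616.0, 3.00   # watts, GHz (reported stress-tool peak)

extra_power = p2 / p1 - 1
extra_freq = f2 / f1 - 1
print(f"+{extra_power:.0%} power for +{extra_freq:.0%} frequency")  # +45% / +13%
```

So it's closer to +45% power for +13% frequency, which still puts the stock clock right at the knee of the efficiency curve.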


----------



## yzonker

RetroWave78 said:


> From the comment section, looks like Timespy, 49C @ 3GHz @ 407w but this isn't saying much as we have no idea as to whether or not the bench just started, fan speed, ambient and Timespy load varies, i.e. the section of the bench where it's like an aerospace museum my clocks dip down from 2000 MHz to 1960 MHz with 3090 FE and undervolt @ 400w.
> 
> I remember seeing elsewhere 4090 running Cyberpunk 2077 @ 56C core @ 2850 MHz with DLSS 3.0
> 
> Either way, if core stays under 65C @ 3GHz and the fan isn't obnoxious, I may cancel my EKWB water block pre-order and just keep it on air. The cooler is a work of art to top it off and GN's recent 4090 FE vapor chamber cross section analysis with NV engineer was impressive. Also, if I need to troubleshoot I don't need to drain the entire loop to pull the GPU.
> 
> View attachment 2574719
> 
> 
> Also, just as I predicted in an earlier post (nearly exactly) AD-102 is clocked right at the power efficiency curve (2.65 GHz pulling 425w and 3GHz requiring 616w). That's nearly 50% increase in power for 10% more freq.
> 
> You watch, Lovelace is going to love undervolting, just like Ampere.


That's in the middle of GT1. GT2 does pull more power, so assuming the screenshot is the card actually running at the voltage limit, then GT2 would probably peak around 450w. TSE GT2 would be ~500w peak probably just based on the relative power draw I've seen on my 3080ti/3090. For my cards, TS GT2 will hit around 550w, maybe a little more. TSE GT2 is 600-630w.


----------



## J7SC

fyi - 4090 in Geekbench 5 Cuda, OpenCL

source


----------



## Glottis

> NVIDIA GeForce RTX 4090 has a default TDP of 450W, which is a default configuration for most custom models as well. However, the vast majority of cards will also feature ‘OC’ or ‘Gaming’ configurations, with TDP going well beyond 500W. Furthermore, GPUs such as Founder Edition reportedly has a power limit up to 600W, which may explain what card was used for this test.


Is this leak claiming what I think it's claiming? The FE will have a 600W power limit, but this is not guaranteed for partner cards, some of which may have a lower power limit? That would be outrageous, as partner cards already carry a pretty big premium. If Asus tries something shady with the 4090 TUF and limits it to 500W for no reason just to make the Strix look better, that would be terrible.


----------



## Netarangi

So I'm guessing the Ti versions will be shorter, hotter cards. Not sure if I can wait; grabbing a 4080 16GB.


----------



## ZealotKi11er

Glottis said:


> Is this leak claiming what I think it's claiming? FE will have 600W power limit, but this is not guaranteed for partner cards where some of them may have lower power limit? This would be outrageous, as partner cards already charge pretty big premium. If Asus tries something shady with 4090 TUF and limit it to 500W for no reason just to make Strix look better that would be terrible.


That is what I am waiting for personally. Also the TUF cooler looks small compared to most other 4090s.


----------



## WayWayUp

I don't know why they keep comparing the 4090 to the 3090 Ti.

The 3090 Ti was released just a few months ago and should be compared to the 4090 Ti.
I want to see apples to apples: 4090 vs. 3090.


----------



## Glottis

ZealotKi11er said:


> Also the TUF cooler looks small compared to most other 4090s.


Is it though? The TUF is gigantic like every other AIB 4090. Size and cooler volume are very similar to the Strix. If the much smaller FE card can handle 600W, there is no reason why the more expensive and larger TUF can't. The fact that we are even discussing this is funny; there should be no compromise with the TUF. It's more expensive than the FE and has a larger, heavier cooler with more fans.


----------



## ZealotKi11er

Glottis said:


> Is it though? TUF is gigantic like every other AIB 4090. Size and cooler volume is very similar to Strix. If much smaller FE card can handle 600W, there is no reason why more expensive and larger TUF can't. I mean the fact that we are even discussing this is funny, there should be no compromise with TUF. It's more expensive than FE, has larger, heavier cooler with more fans.


Look at the actual heatsink vs other 4090s. Also, the 3090 performed better than most bigger AIB cards apart from memory cooling, which should not be a problem for the 4090 since it's like the 3090 Ti in that regard. Also, it's not about handling 600W. The question is: can I game in silence at 450-600W with decent temps? Whichever card is best for that requirement. I could go water, but I also want to get RDNA3 and that becomes too complicated, so I'll probably switch to air for this gen.


----------



## WayWayUp

after watching the Gamers Nexus video from yesterday I'm thinking the FE will probably have the best air-cooling solution despite being smallest


----------



## yzonker

There are a couple of Asus videos where they talk about the differences between the cards. I suspect the giant coolers like on the Strix will perform better than the FE. 

Here's one of them.


----------



## Glottis

yzonker said:


> There are a couple of Asus videos where they talk about the differences between the cards. I suspect the giant coolers like on the Strix will perform better than the FE.
> 
> Here's one of them.


According to this video the TUF will run at the "default power limit", whatever that means in the context of the 4090. If it's locked at just 450W and not even the OC BIOS can go above that, this will be most disappointing. Maybe this will be the first time the FE noticeably outperforms "entry" cards from partners. Imagine if you need to buy a Strix just to have performance parity with the FE. This would further explain why EVGA exited.


----------



## yzonker

Glottis said:


> According to this video the TUF will run at "default power limit", whatever that means in the context of 4090. If it's locked at just 450W and not even OC bios can go above that, this will be most disappointing. Maybe this will be the first time FE noticeably outperforms "entry" cards from partners. Imgine if you need to buy Strix just to have performance parity with FE. This would further explain why EVGA exited.


It may be more than 450w, since it comes with the same 4x8pin adapter. But there's no way to know until the NDA lifts, probably. I know the low end Zotac card just comes with a 3x8pin adapter, so it very likely is 450w.


----------



## ZealotKi11er

yzonker said:


> It may be more than 450w. It comes with the same 4x8pin adapter. But no way to know until the NDA lifts probably. I know the low end Zotac card just comes with 3x8pin. It very likely is 450w.


The TUF had no power limit option for RTX 30. The 3080 Ti and 3090 were both locked at 350W. The 3080 could go from 320 to 350, but it made almost no difference.


----------



## Rbk_3

Should be easy to flash another higher power bios either way.


----------



## 8472

The MSI Gaming Trio 4090 also only had a 3x8-pin to 12+4-pin adapter. The entry level 4090s being power limited would definitely push me to the Founders Edition.

If true, I doubt Best Buy will have enough stock for everyone.


----------



## Benni231990

For now: I have a trusted source and he knows of 3 (maybe 4) cards that have a 600 watt power limit:

Strix (safe)
Aorus Master (safe)
Suprim X (safe)
(Holo Extreme) (maybe)


----------



## ZealotKi11er

Benni231990 said:


> Fow now i have a trusted source and he knows 3 (Maybe a 4) cards that have a 600 watt powerlimit
> 
> Strix (Save)
> Aorus Master (Save)
> Suprim X (Save)
> (Holo Extreme) (Maybe)


Strix is $500 CAD more expensive here in Canada than base skus. I might have to go gigabyte. I think stock will be fine personally since these are just too expensive.


----------



## Rbk_3

ZealotKi11er said:


> Strix is $500 CAD more expensive here in Canada than base skus. I might have to go gigabyte. I think stock will be fine personally since these are just too expensive.
> 
> View attachment 2574831


I got a preorder in for the Gaming OC and Canada Computers but I may get the Master if available on launch day for a couple hundred more


----------



## Tyl3n0L

Rbk_3 said:


> I got a preorder in for the Gaming OC and Canada Computers but I may get the Master if available on launch day for a couple hundred more


Did NVIDIA officially announce the Canadian MSRP yet? Just wondering since I'll be interested in a FE model and was hoping it'll be cheaper than 3rd party...


----------



## Benni231990

If you have the luck to get an FE, you have the cheapest card but with a 600 watt BIOS, so this is the best deal you can make.


----------



## Hulk1988

Benni231990 said:


> Ive you had luck to get a FE you have the cheapest but a 600watt Bios so this is the best deal you can make


Correct. FE has 600W. That was posted by IgorsLab but he deleted it after 20 minutes. I still got it.


----------



## Rbk_3

Tyl3n0L said:


> Did NVIDIA officially announce the Canadian MSRP yet? Just wondering since I'll be interested in a FE model and was hoping it'll be cheaper than 3rd party...


Will likely be $2199. That’s what the lowest AIB models are.


----------



## Rbk_3

So Gigabyte lists the Gaming OC with a 1000W recommended PSU along with the Master, but lists the Windforce at 850W. So perhaps the Gaming OC will have 600W also.


----------



## dr/owned

I'm figuring I'll get the Windforce and flash it to the Gaming OC... although I can't remember if nvflash was immediately compatible with the 3000 series when that came out. Plus just shunt the damn thing so it'll have as much room as it needs and hopefully not melt the 12VHPWR connector (although the math says it should be good for about 750W TDP, and 1300W? if the ambient is 30C or less).
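
For reference, the nominal pin-rating arithmetic behind that kind of math, assuming the commonly cited 9.2 A per pin across the six 12 V pins (the PCIe spec derates the whole connector to 600 W regardless):

```python
# Nominal 12VHPWR current budget. 9.2 A/pin is a commonly cited
# per-pin rating at moderate ambient; the spec caps the connector
# at 600 W with margin below this raw figure.
pins, amps_per_pin, volts = 6, 9.2, 12.0
raw_watts = pins * amps_per_pin * volts
print(f"raw pin rating: {raw_watts:.0f} W")  # ~662 W before derating
```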

The FE is more "ehhhh" because waterblock support isn't going to be as great and it doesn't have dual BIOS. And on the 3000 series it was an awful experience trying to shunt it.


----------



## ZealotKi11er

Tyl3n0L said:


> Did NVIDIA officially announce the Canadian MSRP yet? Just wondering since I'll be interested in a FE model and was hoping it'll be cheaper than 3rd party...


Should be $2199.


----------



## bmagnien

NVIDIA GeForce RTX 4090 3DMark scores leaked, at least 82% faster than RTX 3090 - VideoCardz.com


NVIDIA RTX 4090 tested in 3DMark We have some early benchmarks of the upcoming Ada Lovelace flagship gaming card, the RTX 4090. The leaked benchmark scores feature 3DMark FireStrike Ultra, a DirectX11 benchmark running at 4K resolution, TimeSpy Extreme using DirectX12 API and same resolution...




videocardz.com





2.04x 3090 in 4K dx11 raster. I’m big hype.


----------



## Benni231990

Rbk_3 said:


> So perhaps the Gaming OC will have 600W also


Maybe, yes, but we will see on 11.10. My source only knows for certain that the Master has a 600 watt BIOS.


----------



## yzonker

Benni231990 said:


> Maybe yes but we will see on the 11.10 my Source only know the Master has 100% 600watt bios


Will we know on the 11th? In looking at the 3080/3090 reviews, it appears the FE reviews were released the day before but AIBs on the day of launch. 

Anybody remember when those reviews actually hit? I've forgotten. Just going by the dates of the articles.


----------



## Benni231990

Yes, on the 11th we get reviews and the 12th is launch.


----------



## Glottis

Benni231990 said:


> Yes on 11th we get reviews *and 12 is launch*


And by launch you mean scalper and bot rat race. 🤣


----------



## Alemancio

I have access to preorder the Asus ROG Strix 4090 OC and the MSI Suprim Liquid X.

*Which one would you preorder???* Im inclined to the Liquid Cooled Suprim (lower temps most likely)


----------



## Rbk_3

So the Corsair 16 Pin adapter only uses 2 8 pins and can still do 600W?


----------



## WayWayUp

Damn son
Pushing 26k stock with bunk drivers in FSU
This is dx11 raster which makes it even more impressive
It’s more than double the 3090
Have we ever had such a gen over gen increase?
Honestly this is 2 generations worth of increase right here
Hopefully it translates well to games


----------



## Tyl3n0L

Rbk_3 said:


> So the Corsair 16 Pin adapter only uses 2 8 pins and can still do 600W?


To my understanding yes, but you need a 1200W PSU to be able to deliver 600W to the GPU.


----------



## Glottis

Rbk_3 said:


> So the Corsair 16 Pin adapter only uses 2 8 pins and can still do 600W?


The PSU side only looks like a PCIe 8-pin, but it's not. That connector is capable of at least 366 watts, which is why Corsair only needs 2. Also, it's a power cable only for Corsair PSUs, not an adapter.


----------



## RetroWave78

WayWayUp said:


> View attachment 2574841
> 
> Damn son
> Pushing 26k stock with bunk drivers in FSU
> This is dx11 raster which makes it even more impressive
> It’s more than double the 3090
> Have we ever had such a gen over gen increase?
> Honestly this is 2 generations worth of increase right here
> Hopefully it translates well to games


The 24-25k Port Royal is incredibly impressive; I'm around 14k with a 3090 FE @ 2GHz under a water block. The Firestrike, Port Royal and Timespy average is an 80% uplift over the 3090, so I think it's mostly safe to say the 4090 will be around 80% faster on average in games.

I don't believe we've ever seen this kind of uplift between generations. The biggest uplift I remember was from the 980 Ti to the 1080 Ti, and that was 60%. 1080 Ti to 2080 Ti was ~25%. 2080 Ti to 3090 was ~30%.
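
The uplift numbers quoted here, spelled out (using the thread's leaked score and my own 3090 result, so treat them as rough):

```python
# Percent uplift from the scores discussed in this thread.
def uplift(new, old):
    return new / old - 1

port_royal_4090 = 24500   # leaked, mid-range of the 24-25k quoted
port_royal_3090 = 14000   # a 2 GHz water-blocked 3090 FE
print(f"Port Royal: +{uplift(port_royal_4090, port_royal_3090):.0%}")  # +75%
```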

The cards are still overpriced though. The only way the 4090 appears to be a bargain at $1600-2000 is because the 4080 16GB is $1200, and the only reason these cards are still priced like this is because NV needs to clear 30 series overstock. Everything went downhill with the 2080 Ti. We should have stopped supporting NV right then and there when they tried to price the 80 Ti card at $1200.

Here we are 3 generations later staring down the barrel of a glorified $2k 80 Ti card (changing the nomenclature of the top-tier card from 80 or 80 Ti to 90 doesn't change the fact that you're asking double the historical price for it; these die SKU and GPU naming games unfortunately do work on the simpletons who simp for jacket man).

The 90 card is the only one worth getting. The 4080 16GB solely exists for marketing purposes to make the 4090 seem like a bargain. The $900 70 card will not sell well, if at all.

GTX 1080 = $600
GTX 980 = $600
GTX 780 = $600

GTX 1080 Ti = $700
GTX 980 Ti = $700 
GTX 780 Ti = $700

RTX 2080 Ti = $1200
RTX 3090 = $1500
RTX 4090 = $1600

GTX 1070 = $450
GTX 970 = $450

RTX 4070 12GB (ahem, RTX 4060 Ti 12GB) = $900.


----------



## Rbk_3

Glottis said:


> And by launch you mean scalper and bot rat race. 🤣





Glottis said:


> PSU side only looks like pcie 8 pin but it's not. That connector is capable of at least 366 Watts. That's why Corsair only needs 2. Also it's a power cable only for Corsair PSUs, not adapter.


So the Cablemods EVGA 3 8 pin to 16 pin should be fine to pull 600W also?


----------



## RetroWave78

Rbk_3 said:


> So the Cablemods EVGA 3 8 pin to 16 pin should be fine to pull 600W also?


8-pin PCIe is rated for 150w. It can carry more, but cable manufacturing isn't standardized and gauge thickness can vary (all made in China), so it's best to stick to this rule of thumb. 600w will require 4x 8-pin PCIe to the PSU; i.e., Nvidia's adapter is 4x8-pin PCIe, 3 for 450w, 4 for 600w. 75w can also be pulled over the PCIe slot.
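
That rule of thumb as arithmetic (spec ratings only; actual cables can carry more, as noted):

```python
# Conservative PCIe power budget: 150 W per 8-pin connector, plus
# an optional 75 W from the motherboard slot.
def power_budget(n_8pin, include_slot=True):
    return n_8pin * 150 + (75 if include_slot else 0)

print(power_budget(3, include_slot=False))  # 450 -> the adapter's 450 W mode
print(power_budget(4, include_slot=False))  # 600 -> the adapter's 600 W mode
print(power_budget(4))                      # 675 total with slot power
```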


----------



## ZealotKi11er

RetroWave78 said:


> The 24-25k Port Royal is incredibly impressive, I'm around 14k with 3090 FE @ 2GHz under water block. Firestrike, Port Royal and Timespy average is 80% uplift over 3090, I think it's mostly safe to say 4090 will be around 80% faster average in games.
> 
> I don't believe we've ever seen this kind of uplift between generations. The biggest uplift I remember was from 980 Ti to 1080 Ti and that was 60%. 1080 Ti to 2080 Ti was ~25%. 2080 Ti to 3090 was ~ 30%.
> 
> The cards are still overpriced though. The only way the 4090 appears to be a bargain at $1600-2000 is because the 4080 16GB is $1200 and the only reason these cards are still priced like this is because NV needs to clear 30 series overstock. Everything went downhill with the 2080 Ti. We should have stopped supporting NV right then and there when they tried to price the 80 Ti card at $1200.
> 
> Here we are 3 generations later staring down the barrel of a glorified $2k 80 Ti card (because you change the nomenclature of the top-tier card from the 80 or 80 Ti to 90 doesn't change the fact that your'e asking double the historical asking price for it. These die SKU and GPU name games do work on the simpletons unfortunately who simp for jacket man).
> 
> The 90 card is the only one worth getting. The 4080 16GB solely exists for marketing purposes to make the 4090 seem like a bargain. The $900 70 card will not sell well, if at all.
> 
> GTX 1080 = $600
> GTX 980 = $600
> GTX 780 = $600
> 
> GTX 1080 Ti = $700
> GTX 980 Ti = $700
> GTX 780 Ti = $700
> 
> RTX 2080 Ti = $1200
> RTX 3090 = $1500
> RTX 4090 = $1600
> 
> GTX 1070 = $450
> GTX 970 = $450
> 
> RTX 4070 12GB (ahem, RTX 4060 Ti 12GB) = $900.


GTX 580 to GTX 780 Ti was a very big jump.


----------



## Nizzen

RetroWave78 said:


> 8 pin PCI-E is rated for 150w. It can convey more. Problem is, cable manufacturing isn't standardized and gauge thickness can vary (all made in China) so it's best to stick by this rule of thumb. 600w will require 4x 8 pin PCI-E to PSU, i.e., Nvidia's adapter is 4x8 PCI-E, 3 for 450w, 4 for 600w. 75w is also pulled over PCI-E slot.


Yep! Evga 780ti classified with 2x 8pin was drawing 650w with evbot under water.

2x sli and total system draw from the wall was 1700w. Good times 😁


----------



## RetroWave78

ZealotKi11er said:


> GTX 580 to GTX 780 Ti is very big jump.


Yes, actually the 780 Ti may still be the biggest performance jump between generations (1:46 mark below), but we haven't even seen the 4090 Ti.


----------



## RetroWave78

Nizzen said:


> Yep! Evga 780ti classified with 2x 8pin was drawing 650w with evbot under water.
> 
> 2x sli and total system draw from the wall was 1700w. Good times 😁


Wow that is insane. I had 2x EVGA 780 Ti SC and thought 370w bios was ridiculous and didn't want to push them further.


----------



## RetroWave78

Alemancio said:


> I have access to preorder the Asus ROG Strix 4090 OC and the MSI Suprim Liquid X.
> 
> *Which one would you preorder???* Im inclined to the Liquid Cooled Suprim (lower temps most likely)


How and where? I would go Suprim over Strix; the AIO can move heat better and you can set it up to push the heat out of your case, whereas the Strix dumps heat inside the case to saturate your memory etc. The Strix is overpriced IMO; Asus is riding on brand name at this point, having established themselves with the 30 series Strix. MSI cards were and are as good as the Strix. MSI is completely underrated.


----------



## J7SC

While looking for some (any) factory full water-block models via Google Images, I came across this one below. While I enjoy the custom w-cooled Asus 3090 Strix, I also still have the 2080 Ti Aorus Waterforce WB, which has been high quality and very reliable since Dec. '18. I do wonder though whether Gigabyte will really bring out all these 4090 variations at once, and/or in all markets simultaneously.


----------



## Arizor

Optimus waterblock looking ready to roll


----------



## J7SC

Arizor said:


> Optimus waterblock looking ready to roll
> 
> View attachment 2574869


...else:


Spoiler


----------



## dr/owned

Arizor said:


> Optimus waterblock looking ready to roll


CAD is cheap. Delivering a tangible product in under 3 months has always been their weak point.


----------



## dr/owned

Alemancio said:


> I have access to preorder the Asus ROG Strix 4090 OC and the MSI Suprim Liquid X.
> 
> *Which one would you preorder???* Im inclined to the Liquid Cooled Suprim (lower temps most likely)


Strix. Long term, AIO cards are a tough resell, and AIOs inevitably fail from one issue or another.

(Really, I would wait to see how other models perform. The Strix is looking to be hugely overpriced.)


----------



## bmagnien

dr/owned said:


> CAD is cheap. Delivering a tangible product in under 3 months has always been their weak point.


Agreed. And that isn’t final render. They’re waiting for a strix board in hand to make final measurements and finalize the design.


----------



## Nono31

RetroWave78 said:


> Yes actually 780 Ti may still be the biggest performance jump in between generations (1:46 mark below) but we haven't even seen 4090 Ti.


Interesting! Biggest pain in the ass ever.
Thank you Nvidia


----------



## WayWayUp

now that there will be no Kingpin (unless he signs with another company), are any other extreme variants planned? MSI Lightning, etc.?

there is a market for this and I will pay more for a binned GPU


----------



## Benni231990

You don't need to pay extra. My source said that Nvidia has extremely tight binning on the 4090 for their FE, so he said the best cards will be FEs.


----------



## EastCoast

dr/owned said:


> I'm skipping Bykski and going Barrow this time around. I've had 2 nickel plating failures now with the 3090 on Bykski blocks. Whatever their plating is nowadays, it's really weak.
> 
> And this is with Mayhem X1 Clear premix so it's not like it's sketchy fluid.





ZealotKi11er said:


> Their blocks are garbage. No instruction, no proper mounting screws and same as you nickel plating failed. Worse of all performance was garbage got my 6900XT vs Heatkiler.


Isn't that the same block praised by frame chasers?




isamu said:


> I've got my 4090 in the mail today. It's......OK. Nothing mind blowing.


Cheer up, you've satisfied your FOMO. So, there's that, amiright?
So now you can deal with FOBO. Or you can sell it and get JOMO.


----------



## long2905

WayWayUp said:


> now that there will be no kingpin, (unless he signs with another company) any other extreme variants planned? msi lightning, ext..?
> 
> there is a market for this and i will pay more for a binned gpu


there's always the GALAX HOF. It trades blows with the Kingpin.


----------



## RetroWave78

WayWayUp said:


> now that there will be no kingpin, (unless he signs with another company) any other extreme variants planned? msi lightning, ext..?
> 
> there is a market for this and i will pay more for a binned gpu


Things have changed, now it's reverse binning (Nvidia keeping best dies for themselves, giving AIB's leftovers) and AD doesn't need more power. AIB's will still sell, why people will buy them is beyond me. You're getting an inferior cooler and board design and in many instances paying more for it (Strix).



J7SC said:


> While looking for some (any) factory full water-block models via Google Images, I came across this one below. While I enjoy the custom w-cooled Asus 3090 Strix, I also still have the 2080 Ti Aorus Waterforce WB which also have been high quality and very reliable since Dec. '18. I do wonder though if Giagabyte will really bring out all these 4090 variations at once, and/or in all markets simultaneously.
> View attachment 2574865


Sadly, Gigabyte didn't release the WB variants at launch with the 30 series; they trailed some 6 months after release. I'm not waiting that long. Another serious transgression by GB was that one could not disassemble said WB to clean, inspect, or repaste, a problem with both the 20- and 30-series WB designs. Love their aesthetic design though.


----------



## Rbk_3

RetroWave78 said:


> Things have changed, now it's reverse binning (Nvidia keeping best dies for themselves, giving AIB's leftovers) and AD doesn't need more power. AIB's will still sell, why people will buy them is beyond me. You're getting an inferior cooler and board design and in many instances paying more for it (Strix).
> 
> 
> 
> Sadly Gigabyte didn't release the WB variants at launch with 30 series, they trailed some 6 months after release. I'm not waiting that long. Another serious transgression by GB was the fact that one could not disassemble said WB to clean or inspect or repaste. Problem with 20 and 30 series WB design. Love their aesthetic design though.


There is no way the Founders cools better than many of these AIB models.


----------



## Benni231990

No one cares about the stock cooling  we want the best-binned GPU, and that will be a FE

for cooling, most of us (me included) will put a WB on and be happy


----------



## EastCoast

RetroWave78 said:


> Things have changed, now it's reverse binning (Nvidia keeping best dies for themselves, giving AIB's leftovers) and AD doesn't need more power. AIB's will still sell, why people will buy them is beyond me. You're getting an inferior cooler and board design and in many instances paying more for it (Strix).


I remember when this was rumored to be the case. But is there any proof Nvidia is doing this specifically?
Oh, don't get me wrong, I do believe they are. I just never found any evidence that they were.
What I also find intriguing is that people ignore this and pay more for AIB variants anyway.


----------



## Glottis

RetroWave78 said:


> Things have changed, now it's reverse binning (Nvidia keeping best dies for themselves, giving AIB's leftovers) and AD doesn't need more power. AIB's will still sell, why people will buy them is beyond me. You're getting an inferior cooler and board design and in many instances paying more for it (Strix).


We'll see if FE cooler is better than AIBs, it would certainly be the first time for Nvidia. As for Asus cards, they have huge advantage over FE and other AIBs, they offer 2x HDMI ports. This is a must for people who may run multiple LG OLEDs and such. I do agree that things are getting very very strange with AIB vs FE in general. It looks like you'll need to get AIB flagship cards like Strix just to get bin and power target parity with FE.


----------



## Rbk_3

Glottis said:


> It looks like you'll need to get AIB flagship cards like Strix just to get bin and power target parity with FE.


Where are you people coming up with this?


----------



## WayWayUp

so the raw hardware suggests an even bigger increase than what we saw in those leaked benchmarks.
I'm thinking Nvidia is saving a driver release to smack AMD with right before the new AMD cards come out. Despite the already-big increase, it seems like Nvidia is holding back.

raw math shows that a theoretical jump of up to 2.32x could happen based on shader and clock growth, among other improvements
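That 2.32x figure checks out as a spec-sheet calculation. A minimal sketch using the published shader counts and boost clocks; real-game scaling will be lower, since memory bandwidth, drivers, and CPU limits don't grow with the shaders:

```python
# Back-of-the-envelope scaling estimate: shader count x boost clock, nothing else.
# Numbers are the official spec-sheet values for each card.

def raw_throughput(shaders: int, boost_mhz: int) -> int:
    """Relative FP32 throughput proxy: shader count times boost clock (MHz)."""
    return shaders * boost_mhz

rtx_3090 = raw_throughput(10496, 1695)   # GA102, official boost clock
rtx_4090 = raw_throughput(16384, 2520)   # AD102, official boost clock

speedup = rtx_4090 / rtx_3090
print(f"theoretical raw speedup: {speedup:.2f}x")  # ~2.32x
```

This is an upper bound on shader-limited workloads only; it says nothing about RT or memory-bound cases.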


----------



## Glottis

Rbk_3 said:


> Where are you people coming up with this?


We have leaks that suggest FE will have 600W power limit (not 100% confirmed yet but leaks look pretty legit).
Asus on official stream confirmed Strix cooler is designed for 600W mode (also higher PSU requirement on website).
Asus on official stream confirmed that cheaper TUF will run at "default TGP", as in Nvidia reference design default (remember that FE isn't reference).

I mean info is already out there when you look for it.


----------



## yzonker

I don't recall seeing 3090 FE's being able to run significantly higher core offsets than other cards. As far as I can remember, there has not been any real data showing the FE had the better cores. The best cores just seemed to be scattered amongst various brands/cards. And the giant AIB coolers will definitely beat the FE cooler. Probably not by a lot, but they'll be better.

The 4090 FE cooler is just a modest upgrade over the 3090ti FE cooler and it got beat pretty bad in this test (of course ignoring the Strix LC),


NVIDIA GeForce RTX 3090 Ti Founders Edition Review


The GeForce RTX 3090 Ti Founders Edition is NVIDIA's mightiest card from the Ampere lineup. We previously looked at various custom designs. Today, we're checking out the Founders Edition to test how well it does in terms of heat and noise, and whether it's an alternative to the even more...




www.techpowerup.com





Don't take this as me hating on the FE, I'm actually considering it myself given the revelation that the card may not need more than 600w. I'll WC it so don't care about the cooler anyway.


----------



## yzonker

Glottis said:


> We have leaks that suggest FE will have 600W power limit (not 100% confirmed yet but leaks look pretty legit).
> Asus on official stream confirmed Strix cooler is designed for 600W mode (also higher PSU requirement on website).
> Asus on official stream confirmed that cheaper TUF will run at "default TGP", as in Nvidia reference design default (remember that FE isn't reference).
> 
> I mean info is already out there when you look for it.


It's not a leak, Nvidia stated it would have a 600w PL.

"While the RTX 4090 operates at 450 W by default, the power delivery capability allows you to increase the power limit up to 600 W for overclocking. "









NVIDIA Details GeForce RTX 4090 Founders Edition Cooler & PCB Design, new Power Spike Management


NVIDIA detailed the design of its GeForce RTX 4090 Founders Edition graphics card in a briefing with us today. While the new Founders Edition looks very similar to the RTX 3090 Ti Founders Edition, NVIDIA says it has made several changes to its design. The metal outer frame now comes with a...




www.techpowerup.com


----------



## mirkendargen

EastCoast said:


> I remembered when this was rumored to be the case. But is there any proof nvidia is doing this specifically?
> Oh, don't get me wrong, I do believe they are. I just never found any evidence they were.
> What I also find intriguing is that people ignore this and pay more for AIB variants anyway.


On 3090's it didn't matter if FE's were binned better, they had a power limit so much lower than the AIB cards that they still left performance on the table.

That could be different this time around if no card gets more than 600W, and FE gets 600w.


----------



## Rbk_3

Glottis said:


> We have leaks that suggest FE will have 600W power limit (not 100% confirmed yet but leaks look pretty legit).
> Asus on official stream confirmed Strix cooler is designed for 600W mode (also higher PSU requirement on website).
> Asus on official stream confirmed that cheaper TUF will run at "default TGP", as in Nvidia reference design default (remember that FE isn't reference).
> 
> I mean info is already out there when you look for it.


I think the models that are stating they require a 1000W PSU in the specs are going to have 600W bios options. 

1000 W listed in specs 

Strix
Suprim
Master
Gaming OC

850 W listed in specs

Windforce
Gaming Trio
TUF


----------



## J7SC

Benni231990 said:


> No one cares about cooling  we want the best binned gpu and this will be a FE
> 
> for cooling the most (me includet) put a WB on and is happy


With boost algorithms that include temps, I would like the best-binned _and_ best-cooled GPU right from the factory before I get to do my custom cooling setups. Typically, vendors delay factory full-WB models a bit to build up the highest-bin chips, as NVidia allegedly makes vendors buy a lot of the rest before they get the best (though there's Galax🥴). I understand this gen has a tighter bin range anyway, but still, the top 50% of the chips will be faster than the bottom 50%.


----------



## Daemon_xd

Do you guys think 1600 W power supply will be enough to power up overclocked (probably custom loop, nothing crazy) 4090 + 13900k system? Debating between purchasing 1600 and 2000 W variants of SUPER FLOWER Leadex 80+ Platinum at the moment. Difference in price is around 100$
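For a rough sanity check, here is a sketch only, with assumed draw figures (600W GPU power limit, ~300W for a loaded 13900K, ~100W for board/fans/pump/drives, and a 1.25x allowance for transient spikes); it is not an electrical recommendation:

```python
# Rough PSU head-room estimate for a 4090 + 13900K build.
# All draw figures below are assumptions, not measured values.

def psu_headroom(psu_watts: int, gpu_w: int = 600, cpu_w: int = 300,
                 rest_w: int = 100, spike_factor: float = 1.25) -> float:
    """Fraction of PSU capacity left after an assumed worst-case spike."""
    worst_case = (gpu_w + cpu_w + rest_w) * spike_factor
    return 1 - worst_case / psu_watts

for psu in (1200, 1600, 2000):
    print(f"{psu} W PSU: {psu_headroom(psu):.0%} headroom")
```

Under these assumptions the 1600W unit still has roughly a fifth of its capacity spare, so the extra $100 for 2000W buys margin rather than necessity.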


----------



## energie80

C'mon, 1200 is more than enough 😅


----------



## Rbk_3

Daemon_xd said:


> Do you guys think 1600 W power supply will be enough to power up overclocked (probably custom loop, nothing crazy) 4090 + 13900k system? Debating between purchasing 1600 and 2000 W variants of SUPER FLOWER Leadex 80+ Platinum at the moment. Difference in price is around 100$


Personally, I would be extra safe and get both and run the CPU on the 1600W and the GPU on the 2000W.


----------



## Daemon_xd

Thanks for replies! Honestly asking just to feel safe about purchase, currently have 850 w so need to upgrade anyway


----------



## 8472

If it can hit 3ghz under 500W, I'm really questioning how worthwhile going to 600W will be. Anyway, if I get the TUF, I'll just put a Strix bios on it to test out what it can do.


----------



## Benni231990

I got news from my source

The TUF has a 4x8-pin adapter, so it will get a 600-watt BIOS too, and for the Strix we will have a 600W+ TDP BIOS!!!!!!

So the Strix BIOS is "THE" BIOS everybody will want. Maybe only the HOF will have a bigger one, but the HOF has a 2x12-pin adapter, perhaps deliberately, to stand alone: the Kingpin card was its only rival, and now the HOF is the only extreme-OC card.


----------



## Glottis

Benni231990 said:


> I got news from my Source
> 
> The tuf had a 4x8pin adaper so it could be 600watt bios to and for the strix we will have a 600+ TDP!!!!!!


Yes you can see TUF has 4x8pin adapter on Asus website https://dlcdnwebimgs.asus.com/gain/881ec8bd-ffd3-4988-89e5-d9ffacb81944//fwebp
This doesn't confirm that the card will have 600W vBIOS though.


----------



## Benni231990

Here, I found a video. I hope it's real, but if it is real, holy ****, the 4090 destroys everything


----------



## 8472

Benni231990 said:


> Here i found a video i hope its real but when it is real holy **** the 4090 Destroys Everything


The results are as fake as a 3 dollar bill. 

That channel is an imposter of the below channel. Notice the same name and profile picture but vastly different number of subscribers. These people are desperate for clicks and views.


----------



## RetroWave78

Glottis said:


> We'll see if FE cooler is better than AIBs, it would certainly be the first time for Nvidia. As for Asus cards, they have huge advantage over FE and other AIBs, they offer 2x HDMI ports. This is a must for people who may run multiple LG OLEDs and such. I do agree that things are getting very very strange with AIB vs FE in general. It looks like you'll need to get AIB flagship cards like Strix just to get bin and power target parity with FE.


Great points, I'm not sure if Strix offers this but having a dual BIOS option with physical switch is really useful (for example, flashing XOC BIOS fails, instead of bricking your card you can fall back to redundant bios). EVGA used to offer this, and I think possibly Asus Strix as well. I like the cooler on Strix, I like the card, I just don't see where the $2k asking price is coming from when (IMO) the FE card has a better cooler, better board design, better power delivery (I believe the power surge mitigation chip is proprietary and AIB's are on their own) and FE will be one of the first, if not the first, with a WB offering (EKWB).


----------



## RetroWave78

Benni231990 said:


> Here i found a video i hope its real but when it is real holy **** the 4090 Destroys Everything


The video is probably fake: rough math puts the 4090 at ~50% faster in the video, but in reality it is ~80% faster in rasterization benches from real, confirmed leaks thus far (Time Spy, Firestrike, not to mention Port Royal, but that's RT).

The 4090 is even faster than this fake video shows.


----------



## RetroWave78

EastCoast said:


> I remembered when this was rumored to be the case. But is there any proof nvidia is doing this specifically?
> Oh, don't get me wrong, I do believe they are. I just never found any evidence they were.
> What I also find intriguing is that people ignore this and pay more for AIB variants anyway.


No proof that I know of, only conjecture, but I could be mistaken.


----------



## mirkendargen

RetroWave78 said:


> Great points, I'm not sure if Strix offers this but having a dual BIOS option with physical switch is really useful (for example, flashing XOC BIOS fails, instead of bricking your card you can fall back to redundant bios). EVGA used to offer this, and I think possibly Asus Strix as well. I like the cooler on Strix, I like the card, I just don't see where the $2k asking price is coming from when (IMO) the FE card has a better cooler, better board design, better power delivery (I believe the power surge mitigation chip is proprietary and AIB's are on their own) and FE will be one of the first, if not the first, with a WB offering (EKWB).


The good news is basically all current gen CPUs have an iGPU now, so having to do a recovery flash isn't as painful as it could be in the past.


----------



## RetroWave78

mirkendargen said:


> The good news is basically all current gen CPUs have an iGPU now, so having to do a recovery flash isn't as painful as it could be in the past.


I forgot about that, fortunately I've never had to deal with a bad flash. Last time I flashed vbios was the FTW3 bios to 2080 Ti XC2. Loved that card, the XC2 sensors were amazing, you could tell exactly how hot your memory was etc.


----------



## Bexak

8472 said:


> If it can hit 3ghz under 500W, I'm really questioning how worthwhile going to 600W will be. Anyway, if I get the TUF, I'll just put a Strix bios on it to test out what it can do.


Love how you just assume Asus was negligent and would allow you to do this, electrically or software-wise 

Almost 100% you'll brick the card trying this, based on Asus's comments.


----------



## mirkendargen

Bexak said:


> Love how you just assume Asus were negligent and would electrically or software wise allow you to do this
> 
> Almost 100% you'll brick the card trying to do this, based on Asus's comments.


It's a decently safe assumption, based on experience across multiple past generations, that it'll work completely fine. Asus is never going to tell you that; they want to sell Strixes. Is it possible they do something different this time around? Sure, but the default expectation is that crossflashing BIOSes will be a thing.


----------



## Krzych04650

NVIDIA GeForce RTX 4090 3DMark scores leaked, at least 82% faster than RTX 3090 - VideoCardz.com

~19K TSE stock, just as rumored, likely 21K+ overclocked. 24K TSE rumor for overclocked full AD102 must be true as well. Interesting how accurate all the twitter leaks were this time around, specs and performance are exactly on point.

For the rumors from reviewers: 1.7-2x gains in raster and 2-2.2x in RT. Power is not an issue; the card is supposedly much gentler on PSUs than the 3090 Ti was, with fewer spikes. It supposedly doesn't need much more power to overclock, so 600W should be sufficient, and it can be undervolted/power-limited to well below 450W without losing much performance.

This is just so much better than previous two generations, pricing aside this is Pascal 2.0 + additional innovations. I was convinced that we will never see anything close to this kind of generational leap ever again. And this isn't even a full die yet.


----------



## Rbk_3

Bexak said:


> Love how you just assume Asus were negligent and would electrically or software wise allow you to do this
> 
> Almost 100% you'll brick the card trying to do this, based on Asus's comments.


Tell me you have no idea what you're talking about without telling me you have no idea what you're talking about. You will likely be able to flash the Strix Bios to any 4090 if you want to.


----------



## Tyl3n0L

Really hope I can grab a FE on launch day in Canada. I really hate the aesthetics of every other AIB "gamer'' look they all have.


----------



## Rbk_3

Tyl3n0L said:


> Really hope I can grab a FE on launch day in Canada. I really hate the aesthetics of every other AIB "gamer'' look they all have.


Gaming OC doesn't have much of a "gamer" look if you turn off the RGB


----------



## EastCoast

That's certainly looks like it's built to cost over a grand. /s


----------



## 8472

Bexak said:


> Love how you just assume Asus were negligent and would electrically or software wise allow you to do this
> 
> Almost 100% you'll brick the card trying to do this, based on Asus's comments.


Lol.

Take a look at the 3090 owners thread. There are plenty of people flashing one card with another card's bios. The worst thing that happened was that the performance with the new bios didn't meet expectations.

Do you think people only use those bios switches to get a new fan profile?


----------



## yzonker

8472 said:


> Lol.
> 
> Take a look at the 3090 owners thread. There are plenty of people flashing one card with another card's bios. The worst thing that happened was that the performance with the new bios didn't meet expectations.
> 
> Do you think people only use those bios switches to get a new fan profile?


Yep, I've flashed probably a dozen different BIOSes on my 3080 Ti/3090. All posted and booted to Windows. Assuming nothing is different with the 40 series, it shouldn't be an issue. The main risk is if Nvidia decides to try to kill it. No idea if there is much chance of that or not.


----------



## ZealotKi11er

I am now trying to figure out if GIGABYTE AORUS GeForce RTX 4090 MASTER can fit in my case O11 XL.


----------



## yzonker

I know I'm going to have to move my friggin res again to get the Strix to fit. I sized it so that the biggest 3090ti would fit. Silly me.


----------



## RetroWave78

When exactly does ordering go live and does the Best Buy app refresh the listing page or must you close and re-open the app to get it to refresh? 

I'm hearing 8 am PST, CST, and EST.


----------



## lordkahless

I'm a Best Buy Total tech member and I asked Total Tech support if there was a 4090 benefit for the membership. They said yes that to stay tuned to our email that we would get a notice for members to get advanced purchase on the morning of the 12th.


----------



## bmagnien

RetroWave78 said:


> When exactly does ordering go live and does the Best Buy app refresh the listing page or must you close and re-open the app to get it to refresh?
> 
> I'm hearing 8 am PST, CST, and EST.


B&H said 9am est. But their Ryzen 5000 launch started way earlier than that. Start refreshing at midnight.


----------



## yzonker

RetroWave78 said:


> When exactly does ordering go live and does the Best Buy app refresh the listing page or must you close and re-open the app to get it to refresh?
> 
> I'm hearing 8 am PST, CST, and EST.


No clue on the app, but it's supposed to be 9:00am EST. So PST folks will need to set their alarms. LOL. Dunno outside the US.


----------



## bmagnien

lordkahless said:


> I'm a Best Buy Total tech member and I asked Total Tech support if there was a 4090 benefit for the membership. They said yes that to stay tuned to our email that we would get a notice for members to get advanced purchase on the morning of the 12th.


That’s….interesting. What contact method for total tech support if we wanted to verify?


----------



## yzonker

bmagnien said:


> B&H said 9am est. But their Ryzen 5000 launch started way earlier than that. Start refreshing at midnight.


Isn't BH just doing wait list? Pretty sure Newegg put up Zen4 right around 9:00am EST. Maybe someone that was actually buying one knows for sure (Zen4).


----------



## bmagnien

yzonker said:


> Isn't BH just doing wait list? Pretty sure Newegg put up Zen4 right around 9:00am EST. Maybe someone that was actually buying one knows for sure (Zen4).


B&H’s wait list is literally just an email notification for in stock alerts. They will have stock and start selling at 9am.


----------



## bmagnien

bmagnien said:


> B&H’s wait list is literally just an email notification for in stock alerts. They will have stock and start selling at 9am. Also I was talking about Zen3 launch. I got one of the first 5950xs because I randomly refreshed B&H’s site super early in the morning and got in.


----------



## lordkahless

bmagnien said:


> That’s….interesting. What contact method for total tech support if we wanted to verify?


When you have the Total Tech membership, there is a support chat specific to Total Tech when you log into your Best Buy account. It has no queue, so you don't have to wait in line. I just initiated that chat and asked the question directly. They are support people specific to Total Tech, or at least some extra tier of support specific to the membership.


----------



## bmagnien

x


----------



## yzonker

bmagnien said:


> Wow…
> View attachment 2575172


FU BestBuy.


----------



## RetroWave78

Thanks for the replies everyone, heads up B&H will take your order on the 12th but will not process or ship until the 19th in observance of the Jewish holiday Yom Kippur.

If B&H starts taking orders at 9 am EST (6 am PST), that is 2 hours before Best Buy is scheduled to begin taking orders (8 am PST).

Our best strategy would probably be to try and place an order with B&H, and if that is unsuccessful you will have another chance with Best Buy.

The top 3 cards I'm considering in order are:

1. FE (but must wait until Best Buy opens)
2. Asus Strix
3. MSI Suprim X

What time does Newegg open / start accepting orders?


----------



## RetroWave78

$200, I'm going to need something other than a text message from some random working at Best Buy. Also, how is this different from scalping? Didn't MSI get busted for something similar? Are there laws in place to dissuade this kind of predatory behavior? 

So instead of Bounce Alerts now it's straight up Best Buy scalping.

I have exactly zero need for a "Total Tech" membership. 



https://www.tomshardware.com/news/msi-subsidiary-caught-selling-rtx-30-series-gpus-at-scalper-pricing-on-amazon-ebay


----------



## bmagnien

RetroWave78 said:


> Thanks for the replies everyone, heads up B&H will take your order on the 12th but will not process or ship until the 19th in observance of the Jewish holiday Yom Kippur.
> 
> If B&H start taking orders at 9 am PST that is 2 hours before Best Buy is scheduled to begin taking orders (8 am PST).
> 
> Our best strategy would probably be to try and place and order with B&H and if that is unsuccessful you will have another chance with Best Buy.
> 
> The top 3 cards I'm considering in order are:
> 
> 1. FE (but must wait until Best Buy opens)
> 2. Asus Strix
> 3. MSI Suprim X
> 
> What time does Newegg open / start accepting orders?


FE cards might be sold on Nvidia’s website via their DigitalRiver storefront. Or they might just go 100% exclusive to Best Buy and just have a link to the Best Buy page from the Nvidia store.


----------



## yzonker

RetroWave78 said:


> $200, I'm going to need something other than a text message from some random working at Best Buy. Also, how is this different from scalping? Didn't MSI get busted for something similar? Are there laws in place to dissuade this kind of predatory behavior?
> 
> So instead of Bounce Alerts now it's straight up Best Buy scalping.
> 
> I have exactly zero need for a "Total Tech" membership.
> 
> 
> 
> https://www.tomshardware.com/news/msi-subsidiary-caught-selling-rtx-30-series-gpus-at-scalper-pricing-on-amazon-ebay


They were pulling that crap during the shortage too IIRC. There was some tie in to that worthless TT membership.


----------



## 8472

bmagnien said:


> B&H said 9am est. But their Ryzen 5000 launch started way earlier than that. Start refreshing at midnight.


They never "launched" them that day. They just allowed people to start placing preorders at midnight. Half a day later they were still taking preorders, meaning the ability to place a preorder was separate from actual stock levels.

If they "launch" the 4090 at midnight, good luck trying to get it that week.


----------



## ZealotKi11er

I don't think there is any rush to buy a 4090. Better to wait and get the version you want; what is 1-2 weeks of waiting going to do? Just remember this is a $1600+ card post-mining, post-pandemic. We could get plenty of 3090s at MSRP before mining took full effect in 2020.


----------



## th3illusiveman

Tyl3n0L said:


> Really hope I can grab a FE on launch day in Canada. I really hate the aesthetics of every other AIB "gamer'' look they all have.


How are you getting an FE in canada? Bestbuy? They don't even list 4090s right now.


----------



## Krzych04650

Yea all the rumors are pointing to unprecedented levels of supply for this tier of product and the demand is not exactly crazy. Not sure about the US, $1600 is not a lot for a card like this, but those 2000€+ price tags in the EU are absolutely massive, I don't think there are going to be any problems with availability unless supply here is delayed, which may happen in some countries.


----------



## Tyl3n0L

th3illusiveman said:


> How are you getting an FE in canada? Bestbuy? They don't even list 4090s right now.


I will try on Best Buy. Yes, I know they aren't listing them right now, but I talked to a customer support guy yesterday who said to look on the 12th. I think Best Buy has FE exclusivity along with the Nvidia store. It was the same for the RTX 30 series too, IIRC.


----------



## ZealotKi11er

How is warranty with FE card/ Bestbuy? Will Nvidia know if you resell it? Main reason I get ASUS is serial based warranty.


----------



## changboy

Hey guys :


----------



## Mad Pistol

I'm a bit perturbed that Best Buy is restricting first hits on the 4090 FE to Total Tech members. That's a $200/yr paywall. Talk about a rip off.


----------



## bmagnien

Mad Pistol said:


> I'm a bit perturbed that Best Buy is restricting first hits on the 4090 FE to Total Tech members. That's a $200/yr paywall. Talk about a rip off.


Nvidia sold direct from their website via DigitalRiver for the 3000 launch, I’ve heard nothing that says they won’t do that again this time.


----------



## dr/owned

bmagnien said:


> Nvidia sold direct from their website via DigitalRiver for the 3000 launch, I’ve heard nothing that says they won’t do that again this time.


They abandoned that after like a month and let BBY handle all of the FE sales. I don't see why they would want to "take over" just for launch.


----------



## bmagnien

dr/owned said:


> I don't see why they would want to "take over" just for launch.


To make more money. Best Buy takes a cut for their overhead, which DigitalRiver has significantly less of as a digital-only etailer without any brick and mortar costs. I guess we’ll see


----------



## PLATOON TEKK

pretty clear PCB shot/3090ti comparison


----------



## th3illusiveman

Tyl3n0L said:


> I will try on BestBuy, yes I know they're isn't listing them right now but I talked to a Customer Support guy yesterday and said to look on the 12th. I think Best buy has the FE exclusivity as well as the Nvidia store. It was the same for the rtx 30 series too iirc.


Ah, guess we will see. I hope the stock rumours were true. Would be nice to simply buy the actual GPU product you want with no hassle for once.


----------



## kaTus

Hello there,
does anyone know which cards are limited to 450W?


----------



## GosuPl

kaTus said:


> Hello there,
> do someone know which cards are limited to 450W?


Hello there, my friend 

Good question.


----------



## WayWayUp

NVIDIA GeForce RTX 4090 now spotted with max 3.2 GHz GPU and 25 Gbps memory overclock - VideoCardz.com (videocardz.com)

This is pretty cool. The 4090 was spotted doing 3,240 MHz on the core and +1,562 on the memory.

I would love to know the conditions: power draw and temps. Was this a golden sample? etc.


----------



## yzonker

Maybe this is already confirmed, but anyway Zotac just put this up. (no I'm not suggesting to buy Zotac, lol) 


__ https://twitter.com/i/web/status/1579464377547177985


----------



## WayWayUp

is there any point to DP 2.0 right now? i see a lot of ppl upset over this but if hdmi 2.1 can support 4k 240hz it should also support 8k 60hz, correct?
from my understanding however (for both DP and HDMI) they need to use DSC but I've read it's lossless


----------



## Mad Pistol

WayWayUp said:


> is there any point to DP 2.0 right now? i see a lot of ppl upset over this but if hdmi 2.1 can support 4k 240hz it should also support 8k 60hz, correct?


I thought HDMI 2.1 only supported 4k120 full bandwidth. Is the 4k240 compressed?


----------



## WayWayUp

Mad Pistol said:


> I thought HDMI 2.1 only supported 4k120 full bandwidth. Is the 4k240 compressed?


this is what I've been able to dig up
_"Monitors that use DSC claim there's no reduction in image quality. In fact, VESA says its compression technique is visually lossless. Most folks shouldn't be able to tell the difference."_

I believe it, because there are 4K 240Hz monitors out there such as the Odyssey Neo G8. Why would they release a monitor with a headline feature that can't actually be used?

HDMI 2.1 has a higher max transmission rate than DP 1.4, so I would feel more comfortable using HDMI for less compression.


----------



## bmagnien

3.1ghz at 550w at 99% utilization in 4K PUBG. First real game performance I’ve seen. Looks spicy


----------



## Mad Pistol

bmagnien said:


> View attachment 2575270
> 
> 3.1ghz at 550w at 99% utilization in 4K PUBG. First real game performance I’ve seen. Looks spicy


Not sure what "SPUER" means, but yea... looks tasty.


----------



## bmagnien

Mad Pistol said:


> Not sure what "SPUER" means, but yea... looks tasty.


Bro you're not familiar with the *RTX 3090TI SPUER PRO MAX PLUS*!? I'm picking one up on Wednesday.


----------



## GRABibus




----------



## ZealotKi11er

WayWayUp said:


> is there any point to DP 2.0 right now? i see a lot of ppl upset over this but if hdmi 2.1 can support 4k 240hz it should also support 8k 60hz, correct?
> from my understanding however (for both DP and HDMI) they need to use DSC but I've read it's lossless


DP 2.0 would be nice because you would start to see monitors come out for next-gen GPUs. Even if next-gen GPUs had DP 2.0, it would take another 3-4 years from now to get those monitors. I play at 4K and would not mind over 120Hz, since most games I play hit over that easily. Maybe we need HDMI 3.0.


----------



## changboy

Will the benchmarks be out tonight at 00h00?


----------



## bmagnien

changboy said:


> Is the benchmark will be tonight at 00h00 ?


9am EST on the 11th, just like the unboxings were last week.


----------



## changboy

bmagnien said:


> 9am EST on the 11th, just like the unboxings were last week.


Ok thank you.


----------



## WayWayUp

ZealotKi11er said:


> DPM2.0 would be nice because you will start to see monitors to come out for next gen GPUs. Now even if next gen GPUs are DP 2.0 it will take another 3-4 years from now to get those monitors. I play at 4K and would not mind to have over 120Hz since most games I play hit over that easily. Maybe we need HDMI 3.0.


I wonder what the actual limit is for HDMI 2.1. Is it only 120Hz @ 4K without DSC?

I would think it would be higher, since it has 48 Gbps of bandwidth.

DisplayPort 2.0 can handle 2x 4K @ 144Hz natively with only 80 Gbps, so HDMI 2.1 should be able to do 4K 144Hz native. If not, where did the bandwidth go?
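For a rough sanity check, here's the raw math (a sketch only: the "effective" payload figures for HDMI 2.1 FRL and DP 2.0 UHBR20 are approximations, and blanking overhead is ignored):

```python
def uncompressed_gbps(width, height, hz, bits_per_channel=10, channels=3):
    """Active-pixel data rate in Gbps, ignoring blanking/encoding overhead."""
    return width * height * hz * bits_per_channel * channels / 1e9

fourk_240 = uncompressed_gbps(3840, 2160, 240)  # ~59.7 Gbps
fourk_144 = uncompressed_gbps(3840, 2160, 144)  # ~35.8 Gbps

HDMI_2_1_EFFECTIVE = 42.6       # 48 Gbps FRL raw, roughly ~42.6 Gbps usable payload
DP_2_0_UHBR20_EFFECTIVE = 77.4  # 80 Gbps raw

print(fourk_240 > HDMI_2_1_EFFECTIVE)  # 4K 240Hz 10-bit needs DSC on HDMI 2.1
print(fourk_144 < HDMI_2_1_EFFECTIVE)  # 4K 144Hz 10-bit fits natively
```

So the bandwidth didn't go anywhere: 4K 144Hz 10-bit fits inside HDMI 2.1's payload rate, but 4K 240Hz does not, which is exactly where DSC comes in.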


----------



## dr/owned

WayWayUp said:


> is there any point to DP 2.0 right now? i see a lot of ppl upset over this but if hdmi 2.1 can support 4k 240hz it should also support 8k 60hz, correct?
> from my understanding however (for both DP and HDMI) they need to use DSC but I've read it's lossless


DSC is advertised as visually lossless, which is about the same as calling MP3 audibly lossless. I'm pretty sure my tab bar in Chrome induces artifacts with DSC, because the artifacting goes away at 120Hz.


----------



## dr/owned

PLATOON TEKK said:


> View attachment 2575225
> 
> 
> pretty clear PCB shot/3090ti comparison


Ayyyyy only 3 shunt resistors this time. At least on the front. 3090 Ti FE had 6 on the front and 1 on the back.
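For anyone new to shunts: they're the milliohm-scale sense resistors the board reads rail current through via Ohm's law, which is also why shunt mods work (a lower apparent voltage drop reads as less current). The numbers below are a generic illustration, not the 4090's actual shunt spec:

```python
def rail_current_a(v_drop_mv: float, r_shunt_mohm: float) -> float:
    """Current inferred from the voltage drop across a sense shunt: I = V / R."""
    return v_drop_mv / r_shunt_mohm  # mV / mOhm cancels out to amps

# Hypothetical example: 25 mV measured across a 5 mOhm shunt on a 12 V rail
current = rail_current_a(25.0, 5.0)  # 5.0 A
power_w = current * 12.0             # 60 W sensed on that rail
```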


----------



## yzonker

The rumor I heard, which came from a YouTuber so take that for what it's worth, was that RDNA3 will have DP 2.0.


----------



## WayWayUp

NVIDIA themselves released an 8K benchmark with a variety of games. Where the 4090 couldn't brute-force its way to 60fps, it got there with DLSS, but all the games they showed ran 8K at over 60fps.

But there isn't even a monitor that can do 8K 120... and people seem to be fine with DSC.

If I really wanted to game at 4K 240 or 8K 60, I'm most likely using DLSS 2 quality mode for a performance boost anyway, and at that point I would still be under the native bandwidth threshold.

But even without DLSS, and completely ignoring Display Stream Compression, there are still ways to get where you want natively: the Dell UltraSharp UP3218K, an 8K 60Hz monitor, ships with 2 DisplayPort inputs so you can drive 8K 60 natively.

My point is the lack of DP 2.0 doesn't really matter. It's a nice added feature but doesn't change anything.


----------



## ZealotKi11er

yzonker said:


> The rumor I heard, which came from a YouTuber so take that for what it's worth, was that RDNA3 will have DP 2.0.


It depends. DP 2.0 link rates can be UHBR 10, 13.5, or 20 Gbps per lane; 20 is what we really want. HDMI 2.1 is already 12 Gbps per lane. 13.5 is decent.


----------



## coelacanth

yzonker said:


> The rumor I heard, which came from a YouTuber so take that for what it's worth, was that RDNA3 will have DP 2.0.


It's not a rumor; it's been confirmed by AMD that RDNA3 will have DisplayPort 2.0.



https://www.pcgamer.com/amd-rdna-3-gpus-sport-next-gen-ports-for-true-4k-gaming-at-240hz/

AMD Radeon RX 7000 Graphics Cards With Rearchitected RDNA 3 GPU Compute Units To Deliver Enhanced Ray Tracing Capabilities & Even Higher Clock Speeds (wccftech.com)
----------



## yzonker

Interesting, as of right now MSI cards have disappeared from the BB and Newegg sites. Still up on BH and Microcenter as well as MSI's store.

Edit : Gigabyte gone too

Edit edit: Newegg's website still shows them all. I was looking at the app.


----------



## yzonker

And I still think only FE reviews tomorrow. 

"More details are to be shared tomorrow with the first independent reviews featuring NVIDIA’s own Founders Edition." 









NVIDIA GeForce RTX 4090 PCB shows minor changes from RTX 3090 Ti - VideoCardz.com (videocardz.com)





One of those times I would like to be wrong...


----------



## ZealotKi11er

yzonker said:


> And I still think only FE reviews tomorrow.
> 
> "More details are to be shared tomorrow with the first independent reviews featuring NVIDIA’s own Founders Edition."
> 
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA GeForce RTX 4090 PCB shows minor changes from RTX 3090 Ti - VideoCardz.com (videocardz.com)
> 
> 
> 
> 
> 
> One of those times I would like to be wrong...


That would suck. We all know it will perform well. We care more about cooling/OC/power/price.


----------



## Arizor

Yeah if the leaked benchmarks are real, the AIBs are doing about 2-3% better than stock, so looks like marginal gains from OC. I'd imagine, best case scenario, an extra 7-8% performance with big overclocks.

So it's all about cooling, power and price this gen. Looking at leaked prices for the top range (Strix, Suprim etc.), I think cards like the ASUS TUF are the way to go if you can't get FE (such as here in bloody Oz...).


----------



## yzonker

ZealotKi11er said:


> That would suck. We all know it will perform good. We care more about cooling/oc/power/price.


Well like I posted a few days ago, if you look at TPU's review database for 3080, the actual article dates for the FE are one day before release, AIB's are the day of release. 3090 appears to all be day of release (Sept. 24th).


----------



## ZealotKi11er

Arizor said:


> Yeah if the leaked benchmarks are real, the AIBs are doing about 2-3% better than stock, so looks like marginal gains from OC. I'd imagine, best case scenario, an extra 7-8% performance with big overclocks.
> 
> So it's all about cooling, power and price this gen. Looking at leaked prices for the top range (Strix, Suprim etc.), I think cards like the ASUS TUF are the way to go if you can't get FE (such as here in bloody Oz...).


There is no point in getting anything but the FE card if the rest are a lot more expensive. The FE card fits in way more cases, and we know the FE does 600W. Is there a point to more than 600W? If all of them are limited to 600W, what do you get then?


----------



## Arizor

ZealotKi11er said:


> There is no point to get anything but the FE card if the rest are a lot more expensive. FE card can fit in way more cases. We know FE does 600W. Is there a point to more than 600W? If all are limited to 600W, what u get then?


Yep the only point is for dual BIOS so you can mess around with a safety net. 

The other, bigger reason of course, is if you don't have FE sold in your country like us Aussies


----------



## dr/owned

Arizor said:


> Yep the only point is for dual BIOS so you can mess around with a safety net.
> 
> The other, bigger reason of course, is if you don't have FE sold in your country like us Aussies


Do you guys not bother with re-shipping services? I get stuff from Japan in the US regularly (especially now when the Yen has horrible value...dollar goes a lot further in Japan)


----------



## Arizor

dr/owned said:


> Do you guys not bother with re-shipping services? I get stuff from Japan in the US regularly (especially now when the Yen has horrible value...dollar goes a lot further in Japan)


Yeah I buy a lot (or used to) of my clothing from Japan and the reshipping is valuable. For whatever reason, reshipping from the USA is horrendous, priced extortionately and gets held up in customs.


----------



## dante`afk

who's gonna get scalped


----------



## Arizor

Haha I think there's about to be a few scalpers facing a rude awakening when they realise the economic and social conditions that gave rise to the insane prices for the 3-series are all drastically different this time round.

No crypto mining, no stimulus cheques, no lockdowns, strong supply of previous gens available, (apparent) oversupply of 4-series.

edit: Oh, and that little thing called inflation and recession.


----------



## Mad Pistol

dante`afk said:


> who's gonna get scalped
> 
> View attachment 2575328


That’s rather… optimistic. 😂


----------



## stefxyz

Of course there are arguments for going with a non-FE. It's quite simple-minded to present this as a general statement.

The most obvious reason, next to the already-mentioned dual BIOS, is that a much bigger third-party card with beefed-up cooling will stay significantly quieter under load.

For the non-custom-watercooling crowd, that's a huge benefit.


----------



## J7SC

dante`afk said:


> who's gonna get scalped
> 
> View attachment 2575328


...probably the chap who bought the *RTX 3090TI SPUER PRO MAX PLUS*!


----------



## Glottis

ZealotKi11er said:


> There is no point to get anything but the FE card if the rest are a lot more expensive. FE card can fit in way more cases. We know FE does 600W. Is there a point to more than 600W? If all are limited to 600W, what u get then?


AIB cards do have advantages. Asus offers an extra HDMI port on their cards. I don't think the FE or any other card will have trouble with the stock 450W mode. However, I think we'll start seeing clear winners and losers when cards are stressed with a 600W load. I highly doubt the FE will be as good as something like the Strix, which simply has way more heatsink volume to work with (I mean, can you even think of an example where the FE had better cooling than the Strix? I can't).


----------



## sugi0lover

Is it confirmed that the 4090 FE BIOS allows 600W?
If yes, a link to the source would be appreciated.


----------



## mattxx88

sugi0lover said:


> Is it confirmed that 4090 FE bios allows 600W ?
> If yes, any link for the source will be appreciated.











GeForce RTX 4090 Founders Edition unboxing: la nuova top di gamma di NVIDIA dal vivo (www.hwupgrade.it)





This Italian site reports (translated): On the side there is a 16-pin connector (12VHPWR) that allows, with compatible power supplies or via an adapter, powering the card with a single cable. NVIDIA, in any case, gave us an adapter for connecting an 'old' power supply via 8-pin connectors: there are four connectors, but in reality *only three are needed (450W)*; the *fourth only needs to be connected in case of overclocking (600W)*.


----------



## sugi0lover

mattxx88 said:


> GeForce RTX 4090 Founders Edition unboxing (www.hwupgrade.it)
> 
> ...there are four connectors, but in reality *only three are needed (450w)*, the *fourth only needs to be connected in case of overclocking (600w)*.


Thanks for the reply! But I'm not talking about the 16-pin connector; I know it can provide 600W. What I want to know is whether the actual FE BIOS allows 600W. I think we will know soon.


----------



## mattxx88

sugi0lover said:


> Thanks for the reply! But I am not talking about the 16-pin connector. I know it can provide 600w. What I want to know is that whether the actual FE bios can allow to use 600W. I think we will know soon.


Yes, I understood, but mine was an inference from the cable instructions: Nvidia says you only need to connect 3 cables for 450W, with the fourth unlocking 600W for OC, so the BIOS should allow a 600W power draw (TBP, I assume).
We will know for sure in a few hours, right 😁


----------



## GosuPl

RTX 4090 FE has a 600W PT unlocked  Confirmed.


----------



## AvengedRobix

mattxx88 said:


> GeForce RTX 4090 Founders Edition unboxing (www.hwupgrade.it)
> 
> ...there are four connectors, but in reality *only three are needed (450w)*, the *fourth only needs to be connected in case of overclocking (600w)*.


Oh my god... hwupgrade... it's been more than ten years since I last heard that name 🤣😉


----------



## ArcticZero

Looks like reviews are out. The power efficiency compared to the 3090 Ti is very impressive.

And the stock power target, it seems, is well past the point of diminishing returns. Looks like I'll be running it at 50-60% PT this time. Forget shunt modding.

Source: Der8auer


----------



## Thanh Nguyen

Mining is dead, so why is it out of stock everywhere, guys?


----------



## lordkahless

It isn't for sale until tomorrow. The 12th.


----------



## Demonkevy666

Thanh Nguyen said:


> Mining is dead but why its out of stock everywhere guys.


Nvidia's mind share.


----------



## Glottis

So we will not have power limit table for AIB cards until the moment they are launched. Sigh, thanks for maximizing launch day chaos and frustration, Nvidia.


----------



## 8472

Thanh Nguyen said:


> Mining is dead but why its out of stock everywhere guys.


Sites like bestbuy and Newegg are showing it's out of stock because the cards don't go on sale until tomorrow morning.


----------



## dante`afk

incredible generational leap in performance.

absolutely impressive monster


----------



## ttnuagmada

Power seems to be in check honestly, pretty well matches a 3090. We're a long ways from the 800w rumors.


----------



## z390e

4090 looks to take a massive crap on 3 series cards in the same way 1080 gen did

666w at 33% OC?


----------



## WayWayUp

So according to Der8auer, in games like PUBG a 60% power target results in -5% fps but -33% power draw. Incredible. After looking at several games he came to the conclusion you can get 95% of the performance with 130-150 watts less power draw.

Good for the crybabies who go on and on about space heater this and nuclear reactor that: you can get massive performance out of this card with minimal power consumption. But I think those people are more concerned with mentally justifying not upgrading because of the price, so I don't think it changes much; they're priced out and looking for something to poo-poo.
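Plugging the reported numbers in shows why that trade looks so good (illustrative only, using Der8auer's ~95% performance at roughly 450 W minus 150 W):

```python
# Stock vs. 60% power target, using the figures quoted above
stock_perf, stock_watts = 1.00, 450.0
pt60_perf, pt60_watts = 0.95, 300.0  # ~95% of the fps at ~150 W less

# Relative performance-per-watt of the power-limited card vs. stock
perf_per_watt_gain = (pt60_perf / pt60_watts) / (stock_perf / stock_watts)
print(f"~{perf_per_watt_gain:.2f}x perf/W for a ~5% fps cost")
```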


----------



## z390e

Thanh Nguyen said:


> Mining is dead but why its out of stock everywhere guys.











Asus ROG STRIX GeForce RTX 4090 OC Edition Graphics Card PCIe 4.0 24GB - Newegg.com (www.newegg.com)





Right there, you can buy one right now.


----------



## sakete

Do we think this launch will be just as bad as the 30xx gen in terms of availability? Scalpers galore?

And what would be the best one to get? FE? Asus?


----------



## Blameless

WayWayUp said:


> so according to Der8auer. in games like PUBG
> 60% power target results in -5% fps but -33% power draw
> incredible
> after looking at some games he came to the conclusion you can get 95% of the performance with 130-150watts less power draw


This is not atypical for GPUs in general.

It's why I never fuss over the power consumption of new parts... outside of once-per-decade flukes, efficiency always goes up generationally, by huge margins.


----------



## dante`afk

sakete said:


> Do we think this launch will be just as bad as the 30xx gen in terms of availability? Scalpers galore?
> 
> And what would be the best one to get? FE? Asus?


Considering the marginal performance boost of AIB cards for exorbitantly more power consumption and heat dissipation, the FE seems the best bet.


----------



## ttnuagmada

It's crazy seeing this thing run Witcher 3 at 400+ FPS at 1080p ultra settings. I needed 2 780's to get a locked 60 when that game came out.


----------



## sakete

dante`afk said:


> considering the marginal performance boost for exorbitant more power consumption and heat dissipation, FE seems the best bet


Any knowledge on for which cards waterblocks will be made? I currently have a 3080 FTW3 with a full waterblock (have a custom loop). That will in the end be the main deciding factor for me I suppose, if I even decide to get the 4090. I rarely game as it is these days, partly because my room gets super hot after an hour with all the heat being dumped into the room. That would be even worse with a 4090, though the plan would be to seriously limit power consumption while still getting good framerates.


----------



## ZealotKi11er

What 4090 has shown is we need faster CPUs for lower resolutions. Also can't wait for UE5 to bring new GPUs down.


----------



## WayWayUp

It's scaling better at 8K. For example, in Forza he was seeing a 75% bump over the 3090 at 4K and a 110% bump at 8K.


----------



## Mad Pistol

If you ever needed an excuse to go buy an LG C2 42" OLED screen, the 4090 is it. 4K120 on a single GPU just became very real.


----------



## Mad Pistol

ZealotKi11er said:


> What 4090 has shown is we need faster CPUs for lower resolutions. Also can't wait for UE5 to bring new GPUs down.


I'm curious to see how my 5800x3D works with a 4090.


----------



## Luda

This is terrible! Not even a 200% increase? No one should buy this crap. Seriously, don't go get one; I have meetings all morning!


----------



## sakete

Man, even though I have an RTX 3080 and barely game anymore, I'm seriously tempted to go get this, lol. My local Micro Center looks like they will have stock tomorrow. I'd need to get my hands on that Asus TUF 4090, as I know of at least one manufacturer that will make full waterblocks for it. The Strix as well, but that's serious overkill - $400 more for what?


----------



## Krzych04650

Massive gains, so much better than previous gens, although unfortunately the inconsistency that Ampere had is still there so that ~70% average gain consists of a mix of 40% to 100% improvements depending on the game. There also doesn't seem to be much overclocking headroom, it doesn't look nearly as power choked as 2080 Ti and 3090 were, it actually often uses much less than 450W and generally looks like it should have been a 350W card. 

Funny how it went from rumors about 800W cards to a card so efficient that it doesn't even fully utilize its power limit at stock 

Got Corsair 12VHPWR cable shipped, now only need the waterblock situation to clear out. Not digging those short cassette tape style waterblocks at all, so that complicates stuff a bit. I really like those Phanteks G40 ones with ports on the side, but I am not sure if they are going to have FE one or not. Otherwise I will probably go TUF because of more ports. Doesn't look like high-end models will do anything at all this time around and they all look stupid to say the least.


----------



## Mad Pistol

sakete said:


> Man, eventhough I have a RTX 3080 and I barely game anymore, I'm seriously tempted to go get this, lol. My local Micro Center looks like they will have stock tomorrow. And need to get my hands on that Asus TUF 4090, as I know of at least one manufacturer that will make full waterblocks for those. The Strix as well, but that's serious overkill - $400 more for what?


My 3080 is fantastic, but the reality that 4K120 on my LG CX with a 4090 is possible is VERY tempting.


----------



## RetroWave78

Interesting, but I predicted that NV would put the clocks and PT of Ada Lovelace right at or past the efficiency knee. Der8auer concludes that if NV had simply given up 5-10% of the performance, the card would consume roughly 33% less power, i.e. ~280W vs 450W. Performance per watt peaks at around 50% PT and drops off from there.

14:34 mark






If I were to surmise as to why they did this it would be because ultimately the metric, with RDNA3 right around the corner, is still _which is the fastest graphics card in the world _not _which is the most power efficient graphics card in the world. _


----------



## sakete

Mad Pistol said:


> My 3080 is fantastic, but the reality that 4K120 on my LG CX with a 4090 is possible is VERY tempting.


I game on 1440p so I'd see some significant framerate gains on certain games, OR, I'd see lower power consumption as I do cap framerates at 140 (given that my screens top out at 144hz).

It would also open up the possibility for gaming on 3 screens in 1440p when sim-racing (currently have 2 screens).


----------



## EastCoast

This is a 4K card, and at 2K you need to use max IQ settings.
At 1080p (esports) it's not worth it at that price point. Even Steve from GN says as much towards the end of his review.


----------



## Sir Beregond

Mad Pistol said:


> My 3080 is fantastic, but the reality that 4K120 on my LG CX with a 4090 is possible is VERY tempting.


Meh. I am just fine with my 3080 Ti on my LG C1. When I actually need more performance, will see what prices look like. I don't play competitive shooters, or really any shooters much, so maybe I am a little more tolerant of not having locked 120.

Definitely a 4k card though. Just not a fan of $1600 price tag.


----------



## EastCoast

Der8auer is about to release this nifty device.

It's called WireView.


----------



## sakete

EastCoast said:


> This is a 4k card. And at 2k you need to use max iq settings.
> At 1080p (Esports) it's not worth it at that price point. Even Steve GN says as much towards the end.


Hmmm, almost makes more sense to get a 3090ti at this point.


----------



## EastCoast

sakete said:


> Hmmm, almost makes more sense to get a 3090ti at this point.


Didn't some pay as much as the 4090 when the 3090ti was released back in March/April 2022? How much has it dropped this month? A few hundred depending on who you buy from...


----------



## sakete

EastCoast said:


> Didn't some pay as much as the 4090 when the 3090ti was released back in March/April 2022? How much has it dropped this month? A few hundred depending on who you buy from...


My local Micro Center has an open-box EVGA FTW3 3090 Ti for $990. Not sure why it's open box (probably just a return, hopefully not defective). But given that it's EVGA, I wouldn't even need to buy a new waterblock; my 3080 block should fit the 3090 Ti as well. And then if I can sell my 3080 for $500, I'm paying $490 net. Hmmmm.


----------



## BigMack70

Any benchmarks show how much performance (if any) is lost at 4k on this card when running it on PCI-e 3.0 x16 instead of 4.0 x16?
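I haven't seen scaling tests yet either, but for reference, here's the raw link-bandwidth math; 4.0 simply doubles the per-lane rate (a sketch only; real-world throughput comes in a bit below this theoretical payload figure):

```python
def x16_payload_gb_s(gt_per_s: float) -> float:
    """Theoretical x16 payload bandwidth in GB/s after 128b/130b line encoding."""
    return gt_per_s * (128 / 130) / 8 * 16  # GT/s -> GB/s per lane, times 16 lanes

gen3 = x16_payload_gb_s(8.0)   # ~15.75 GB/s
gen4 = x16_payload_gb_s(16.0)  # ~31.51 GB/s
```

Whether a game at 4K actually saturates even the gen 3 figure is the open question the benchmarks would need to answer.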


----------



## Spiriva

Cant wait go get a RTX 4090!


----------



## sakete

After watching the GamersNexus review, I'm like ehhh... I'm gonna wait to see what the 4080 brings to the table before I even decide to ditch my 3080.


----------



## dante`afk

the only video you need to watch is this:


----------



## EastCoast

dante`afk said:


> the only video you need to watch is this:
> 
> 
> 
> 
> 
> 
> View attachment 2575400


Ah, no thanks; I like to know what the GPU does at 2K and 1080p as well. He only reviewed it at 4K, and since I'm not the only one who doesn't need to go 4K, that's not a very helpful review.


----------



## sakete

dante`afk said:


> the only video you need to watch is this:
> 
> 
> 
> 
> 
> 
> View attachment 2575400


Ok, very interesting regarding performance per watt. So I could run this at 60% power and it would consume about the same as my RTX 3080 while performing much better. Maybe even with better thermal efficiency, and thus less heat dumped into my room (my main issue with the RTX 3080 currently; it gets so frigging hot).


----------



## sakete

EastCoast said:


> Ah, no thanks I like to know what the gpu does at 2k and 1080p also. He only reviewed it at 4k. And since I am not the only one who doesn't have a need to go 4k that's not very helpful review.


Look at the Gamers Nexus review. Conclusion, at 1440p, it only makes sense when playing at max graphical settings with RT on. This really is a 4k card. At 1080p, just don't bother, you'll be severely CPU limited.


----------



## tubs2x4

Going from a 3070 like I have to a 4090 would make a big difference in some games at 1440p max settings, and GN using a 12700KF as a test bench showed that CPU can let the 4090 stretch its legs in a lot of games.


----------



## Glottis

dante`afk said:


> the only video you need to watch is this:
> 
> 
> 
> 
> 
> 
> View attachment 2575400


This video is kinda misleading and badly researched on der8auer's part. He tested a couple of scenarios and started accusing Nvidia of using absurd power limits, suggesting you get 95% performance at a 60% power limit? That is just not true across the board. For example, in Control the framerate nosedives if you lower the power limit by just 100 watts. Every game engine, game, and application behaves differently; der8auer of all people should know this. And now all the dummies will set a 60% power limit and think they are getting 95% performance in everything. * facepalm *


----------



## dr/owned

sakete said:


> Look at the Gamers Nexus review. Conclusion, at 1440p, it only makes sense when playing at max graphical settings with RT on. This really is a 4k card. At 1080p, just don't bother, you'll be severely CPU limited.


Honestly if you're buying a flagship card for 1080 gaming you're doing it wrong. "oh wow guyz I get 700 fps!"

I was on 1440 since 2016 and 4K since 2021 (and that 4K transition made me somewhat late). Actually 1440 since like 2012 or something but only at 60hz for productivity, cause 1080 is like working through a peep hole.


----------



## sakete

dr/owned said:


> Honestly if you're buying a flagship card for 1080 gaming you're doing it wrong. "oh wow guyz I get 700 fps!"
> 
> I was on 1440 since 2016 and 4K since 2021 (and that 4K transition made me somewhat late)


I cap all my games at 140fps anyway, as my monitors don't go beyond 144hz. Pointless to push more frames at that point, best to save the energy. But this 4090 would still provide benefits to 1440p, and then if you I decided to upgrade to 4k screens at some point (not happening anytime soon unless one of my monitors dies), I wouldn't need to right away buy a new GPU.


----------



## Benni231990

So you see, nobody needs the 660W BIOS from ASUS; even the FE never hits 600W xD

I never expected that


----------



## EastCoast

dr/owned said:


> Honestly if you're buying a flagship card for 1080 gaming you're doing it wrong. "oh wow guyz I get 700 fps!"
> 
> I was on 1440 since 2016 and 4K since 2021 (and that 4K transition made me somewhat late)


Once again, your assessment on this is wholly inaccurate (just like your "transient spikes don't matter" take after mocking people who took notice). Did you know people are making bank playing esports-type games on Twitch/YT at 2K and 1080p?

Buying the latest and greatest always gets them subs, so getting a 4090 is an investment in BR games like Warzone, etc., to pub stomp the lobby while claiming their 4090 giving them XXX FPS helped them dance around the plebs. That's regardless of whether it's actually better at 1080p or not. It's all marketing. And showing their followers they are "current" is what puts money in the bank, so to speak.

So, it's not doing it wrong if you are trying to make a buck.


----------



## dr/owned

EastCoast said:


> Did you know people are making bank playing esport type games on Twitch/YT at 2k and 1080p?


The point.
[A lot of sky]
Your understanding.


----------



## RetroWave78

Glottis said:


> This video is kinda misleading and badly researched on der8auer's part. He tested a couple of scenarios and started accusing Nvidia of using absurd power limits, suggesting that you get 95% performance at a 60% power limit? This is just not true across the board. For example, in Control the framerate nosedives if you lower the power limit by just 100 watts. Every game engine, game, and application behaves differently; der8auer of all people should know this. And now all the dummies will set a 60% power limit and think they are getting 95% performance in everything. * facepalm *


Yes tensor / RT cores being used will require more power. I believe the example he used was non RT.


----------



## MrTOOSHORT

Does anyone know if you can use the EVGA 12 pin adapter cable with the 4090? I have an EVGA 1600 T2.

EVGA PerFE 12 Cable, Individually Sleeved Cable, Built for NVIDIA GeForce RTX 30 Series Founders Edition, Included Cable Combs
www.evga.com


----------



## mirkendargen

RetroWave78 said:


> Yes tensor / RT cores being used will require more power. I believe the example he used was non RT.


On 3090s the opposite was generally true. The RT cores were a bottleneck that caused lower shader core utilization, so RT games often used less power. SER and beefier RT cores should help with that.


----------



## yzonker

Benni231990 said:


> So you see, nobody needs the 660W BIOS from ASUS; even the FE never hits 600W xD
> 
> I never expected that


Well, it also appears to go past 600W if you really push it with Furmark, which is unusual. Usually nothing will get a card much past the PL.


----------



## mirkendargen

MrTOOSHORT said:


> Does anyone know if you can use the EVGA 12 pin adapter cable with the 4090? I have an EVGA 1600 T2.
> 
> EVGA PerFE 12 Cable, Individually Sleeved Cable, Built for NVIDIA GeForce RTX 30 Series Founders Edition, Included Cable Combs
> www.evga.com


No, that's a 12pin cable that was used on nothing but 3080/90 FE cards and isn't any kind of standard. The 12VHPWR cable is different and they aren't compatible.


----------



## yzonker

-


----------



## MrTOOSHORT

Thanks.

It says on the EVGA site that the cable is compatible with the 3090 Ti FTW, which uses the 16-pin standard now. So I don't know what to think.


----------



## yzonker

MrTOOSHORT said:


> Thanks.
> 
> Says on EVGA site that the cable is compatible with the 3090ti FTW which uses the 16pin standard now. So I don't know what to think.


I can't get the link to work on my phone, but I don't think it has the 4 sense pins?


----------



## Glottis

MrTOOSHORT said:


> Thanks.
> 
> Says on EVGA site that the cable is compatible with the 3090ti FTW which uses the 16pin standard now. So I don't know what to think.


What are you confused about? Product page for that cable mentions 12 pin multiple times and you can see 12 pin connector in the photos. There is no mention of 16 pin whatsoever.


----------



## MrTOOSHORT

yzonker said:


> I cant get the link to work on my phone, but I don't think it has the 4 sense pins?



No 4 sense pins, but it says the cable can be used with the 3090ti FTW, which does have the 4 sense pins on the card.

The cable is a 12pin to two 8pin cable.


----------



## sakete

So the FE card is rated at 450W TDP. Is anything known yet on the board partner cards?


----------



## mirkendargen

sakete said:


> So the FE card is rated at 450W TDP. Is anything known yet on the board partner cards?


Correction, the FE card goes up to 600w, that's what ultimately matters. It's unknown what the AIB cards are. Some will do at least 600w, unclear if any will do more, or if doing more would even do anything because the voltage limit may be hit before that.


----------



## sakete

mirkendargen said:


> Correction, the FE card goes up to 600w, that's what ultimately matters. It's unknown what the AIB cards are. Some will do at least 600w, unclear if any will do more, or if doing more would even do anything because the voltage limit may be hit before that.


Ok thanks. Well, IF I decided to get this 4090 (honestly only started thinking about this today as I saw the reviews pop up, so it's all very impulsive), I'd probably get the Asus TUF, as I know that Optimus is making a block that will fit both the TUF and Strix. The Strix will be absolute overkill I think, so definitely not spending another $400 on that.


----------



## dr/owned

Looks like only 4 shunt resistors total:









Although it seems like maybe only Furmark is going to really stretch 600W anyways.

The PCB being (basically) the same as the 3090Ti with 2 power stages filled in would make me guess waterblocks are going to come quick for it. The inductors moved around by the power connector but that area is usually clearanced anyways since it's irrelevant for the block.


----------



## MrTOOSHORT

Gamers Nexus did a review of the 4090 FE. They talked about the included 16pin to four 8pin adapter. If you plug in only three 8pins, you can't move the power limit, card still works. Plug in only two 8pins, then the card won't boot at all.

I'll just use the included cable that comes with the 4090, until a specific one is made for my psu, or buy a whole new psu altogether.


----------



## Rbk_3

sakete said:


> My local micro center has an open box EVGA FTW3 3090 Ti for $990. Not sure why it's open box (probably just a return from someone, but hopefully not defective). But given that it's EVGA, I wouldn't even need to buy a new waterblock, my 3080 block should fit the 3090TI block as well. And then if I can sell my 3080 for $500, I'm paying $490 net. Hmmmm.


I doubt it will fit. The 3090 Hybrid kit didn't fit on the Ti.


----------



## Rbk_3

MrTOOSHORT said:


> Gamers Nexus did a review of the 4090 FE. They talked about the included 16pin to four 8pin adapter. If you plug in only three 8pins, you can't move the power limit, card still works. Plug in only two 8pins, then the card won't boot at all.
> 
> I'll just use the included cable that comes with the 4090, until a specific one is made for my psu, or buy a whole new psu altogether.


I have the T2 also and ordered the Cablemod adapter for our PSU






CableMod E-Series Pro ModFlex Sleeved 12VHPWR PCI-e Cable for EVGA G7 / G6 / G5 / G3 / G2 / P2 / T2 – CableMod Global Store
store.cablemod.com


----------



## energie80

Why don’t you just use the included cable?


----------



## dr/owned

energie80 said:


> Why don’t you just use the included cable?


Cause it's an adapter cable and that sucks compared to a native cable.


----------



## sakete

Rbk_3 said:


> I doubt it will fit. They 3090 Hybrid kit didn't fit on the ti.


Thanks. I have already decided to not go for the 3090ti. IF anything, I'd just get the 4090.


----------



## energie80

What is the difference?


----------



## mirkendargen

energie80 said:


> What is the difference?


Adapter = more connections, more connections = more places connections can fail.


----------



## Rbk_3

energie80 said:


> Why don’t you just use the included cable?


Because I don't want that tangled mess ruining the aesthetics of my case


----------



## energie80

you have to wait then


----------



## xarot

Strix 4090 non-oc is 2529,99 EUR in my country. Cheapest 4090 TUF is 1999,00€. 

Over 500€ difference for the Strix is just insane.


----------



## ZealotKi11er

Looking at how the 4090 does not use 450W all the time, what do we expect at 600W? Is it just going to be limited by voltage/core clock? This does not seem like a power-limited card.


----------



## Tyl3n0L

To my understanding you can use 3x 8-pins on a 4090 and run it perfectly fine at 450W; the 600W figure requires 4x 8-pins, and that's for overclocking.


----------



## Slaughtahouse

dante`afk said:


> View attachment 2575400


More reviewers need to do this: peak FPS @ X power target, to help explain the value of undervolting/power limiting. Knowing that you can obtain 95% performance (117 fps vs 123.5 fps) at 78% power (329W vs 422W) is critical on cards like these, in my opinion.

Thank you for sharing this
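A quick back-of-the-envelope check of those numbers (plain arithmetic on the figures quoted above, nothing more):

```python
# Numbers from the chart quoted above: 117 fps @ 329 W (power-limited)
# vs 123.5 fps @ 422 W (stock).
fps_capped, watts_capped = 117.0, 329.0
fps_stock, watts_stock = 123.5, 422.0

perf_kept = fps_capped / fps_stock       # fraction of stock fps retained (~0.95)
power_used = watts_capped / watts_stock  # fraction of stock power drawn (~0.78)
# Relative efficiency: fps-per-watt capped vs fps-per-watt stock.
eff_gain = (fps_capped / watts_capped) / (fps_stock / watts_stock) - 1

print(f"{perf_kept:.1%} of the performance at {power_used:.1%} of the power, "
      f"{eff_gain:+.1%} fps-per-watt")
```

So on these game-specific numbers the capped card is roughly 20% more efficient per watt; as pointed out elsewhere in the thread, the exact trade-off varies a lot by title.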


----------



## mirkendargen

For people wondering, it's easy to tell what Nvidia is doing with the adapter:










They are just connecting the ground on the 4th 8pin to Sense0, so that when it isn't connected the card is told 450w max.
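For context, here is my reading of the 12VHPWR sideband coding from the ATX 3.0 / PCIe CEM 5.0 specs, sketched in Python. Treat the exact wattage table as an assumption and verify against the spec before relying on it:

```python
# 12VHPWR sense-pin power coding (my reading of ATX 3.0 -- verify in the spec).
# A pin tied to ground reads 0; an open/floating pin reads 1.
SENSE_CODING = {
    # (SENSE0, SENSE1): (initial power at boot, sustained max) in watts
    (0, 0): (375, 600),
    (1, 0): (225, 450),
    (0, 1): (150, 300),
    (1, 1): (100, 150),
}

def max_power(sense0_grounded: bool, sense1_grounded: bool) -> int:
    """Sustained power limit the cable advertises to the card."""
    key = (0 if sense0_grounded else 1, 0 if sense1_grounded else 1)
    return SENSE_CODING[key][1]

# Per the adapter teardown above: SENSE1 stays grounded, and the 4th
# 8-pin's ground is wired to SENSE0.
print(max_power(sense0_grounded=True, sense1_grounded=True))   # 4 plugs -> 600
print(max_power(sense0_grounded=False, sense1_grounded=True))  # 3 plugs -> 450
```

That matches the reported behavior: with only three 8-pins connected, SENSE0 floats and the card caps itself at 450W.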


----------



## dante`afk

Rbk_3 said:


> I have the T2 also and ordered the Cablemod adapter for our PSU
> 
> 
> 
> 
> 
> 
> CableMod E-Series Pro ModFlex Sleeved 12VHPWR PCI-e Cable for EVGA G7 / G6 / G5 / G3 / G2 / P2 / T2 – CableMod Global Store
> store.cablemod.com


I wouldn't trust CableMod; a CableMod PCI-E cable connector melted in my RTX 3090 :(


----------



## dr/owned

dante`afk said:


> i wouldnt trust cablemod; CableMod PCI-E cable connector melted in my RTX3090 :(


I'm willing to let it ride. It was a new connector during covid supply-chain problems, so I'll allow that some bad cables got made. Have there been recent problems?


----------



## Slaughtahouse

I watched the der8auer video. I have to agree with his conclusion about the TDP / cooler design.

Sure, for consumers who know how to undervolt or adjust power limits, it's nice to get an oversized air cooler so you don't need to deshroud, but at the end of the day you're paying extra for it.

For businesses that use the latest in graphics (film post-production, game design, etc.), good luck fitting a 4090 in a standard workstation.


----------



## yzonker

Has anyone seen what the actual max voltage limit is on the 4090? 

Looks like it's around 1.05v default, but I don't think I've seen where it gets to with it maxed out (I'm assuming it can be increased obviously).


----------



## yzonker

yzonker said:


> Has anyone seen what the actual max voltage limit is on the 4090?
> 
> Looks like it's around 1.05v default, but I don't think I've seen where it gets to with it maxed out (I'm assuming it can be increased obviously).


Answer myself. 1100mv, same as 30 series.









Overclocking - Seite 33 - Hardwareluxx
GeForce RTX 4090 Founders Edition review.
www.hardwareluxx.de





Not sure why they used Firestorm, although I did see a shot of AB that had the voltage slider greyed out, so maybe it's not working yet.


----------



## dr/owned

Slaughtahouse said:


> For businesses that utilize latest in graphics (film post-production, game design etc), good luck fitting a 4090 in a standard workstation.


Businesses don't fit their own hardware; they buy a prebuilt from Dell/HP that already has the card fitted with a blower cooler or whatever (cause it has a warranty agreement attached where they'll swap the whole thing out, and troubleshooting becomes the OEM's problem, not the IT department's).


----------



## yzonker

And a mix of shunts.









NVIDIA GeForce RTX 4090 Founders Edition 24GB Review - Drink less, work more! | Page 3 | igor'sLAB
www.igorslab.de


----------



## RetroWave78

dr/owned said:


> I'm willing to let it ride. New connector during covid supply chain problems so I'll allow that some bad cables got made. Has there been recent problems?


No problem with CableMod here; I've been using their 12-pin cable with a 3090 FE for well over a year now.


----------



## ZealotKi11er

Mad Pistol said:


> I'm curious to see how my 5800x3D works with a 4090.


That is what I also have. 5950x or 5800x3d at 4K.


----------



## RetroWave78

xarot said:


> Strix 4090 non-oc is 2529,99 EUR in my country. Cheapest 4090 TUF is 1999,00€.
> 
> Over 500€ difference for the Strix is just insane.


This is why EVGA dipped out: AIBs are being asked to compete with NV and there are basically no margins left after procuring dies from NV. Notice how the AIB coolers are materially cheaper than reference? E.g. the Asus TUF, still going for $100 over reference. AIBs probably make $100 per GPU sold when all is said and done.

Nvidia has much higher profit margins even with the better-designed cooler, R&D, overhead, etc., because they are buying the dies directly from TSMC.

EVGA used to justify the markup because the cooler performed better than reference and the card had more overclocking headroom with a higher TDP. This is no longer the case. We will quickly discover that the Asus Strix doesn't run that much cooler than reference and Ada is not in need of more power. Asus will have to adjust the price to what the market will bear, meaning no margins. Asus will probably dip out as well. This is what the death of an industry looks like.

There are other factors: EVGA may have bought a bunch of 30-series cards earlier this year and can't offload them, and also the step-up and warranty programs are open to abuse by crypto miners who flogged their GPUs willy-nilly 24/7 on NiceHash with noob settings (i.e. 120% PT) "lik so cool dude".

Crypto mining killed the industry. This is what that looks like. Followed on by the Greatest Depression with $2k 80-series cards = RIP discrete GPU PC gaming.


----------



## dr/owned

yzonker said:


> And a mix of shunts.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA GeForce RTX 4090 Founders Edition 24GB Review - Drink less, work more! | Page 3 | igor'sLAB
> www.igorslab.de


Rambling: even when I shunted my 3090 TUF it never reached 1000W (I think 700W might have been the max), because there was (I think) some sort of internal-current-sense going on for minor rails. I hope NV didn't reduce the shunts externally and increase the aggression of internal power limiting.

I can also read it the other way where they're less concerned about operating the gpu perpetually at the power limit as was the 3090 story so they didn't bother going so hard with the shunt monitors.


----------



## RetroWave78

4090 @ 700W = 10% performance over 4090 @ 450W. Golf clap. No wonder Europeans hate Americans so much, especially right now when the cost of energy is 2-3x higher than average.

Next to no one in Europe is interested in a 400W+ discrete GPU at the moment. That's great for availability but rather ominous in terms of, long-term survival I suppose?

33% more PT, +40mV voltage = 5% increase in performance; isn't this exactly what I stated it would be like a week ago?









GeForce RTX 4090 Founders edition review
www.guru3d.com


----------



## dr/owned

RetroWave78 said:


> 4090 @ 700W = 10% performance over 4090 @ 450W. Golf clap. No wonder Europeans hate Americans so much, especially right now when the cost of energy is 2-3x higher than average.


I would tolerate 1000W for a 3% improvement in performance.

With mining dead, no one is really maxing their card 24/7 anyway, so the peak figure is less relevant in total cost of ownership.


----------



## bmagnien

Has anyone seen a full gpuz max sensor screenshot from any 3DMark tests?


----------



## ZealotKi11er

The OC model comes with a 4x PCIe adapter; the non-OC comes with a 3x PCIe adapter.

This is the cheapest model in Canada so far. Only a 3-slot cooler. I bet it will perform worse than the FE.


----------



## ZealotKi11er

RetroWave78 said:


> 4090 @ 700W = 10% performance over 4090 @ 450W. Golf clap. No wonder Europeans hate Americans so much, especially right now when the cost of energy is 2-3x higher than average.
> 
> Next-to-noone in Europe is interested in a 400w discrete GPU at the moment. That's great for availability but rather ominous in terms of, long term survival I suppose?
> 
> 33% more PT, +40mv voltage = 5% increase in performance, isn't this exactly what I stated it would be like a week ago?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GeForce RTX 4090 Founders edition review
> www.guru3d.com


That seems like a very low perf increase. Maybe this GPU is memory-bandwidth limited. We have a large L2 but still the same bandwidth as the 3090 Ti.


----------



## yzonker

ZealotKi11er said:


> That seems like a very low pef increase. Maybe this GPU is memory BW limited. We have large L2 but still same BW as 3090 Ti.


I bet it's not even 10% unless you add core/mem offsets too. And I'm not sure where 700w came from. No such thing. There's a lot wrong in that statement.


----------



## yzonker

dr/owned said:


> would tolerate 1000W for a 3% improvement in performance.
> 
> With mining dead anyways no one is really maxing their card 24/7 anyways so the peak figure is less relevant int total-cost-ownership.


Yea, I think it's awesome we finally got enough power headroom to not be limited by the PL. People can always reduce power; can't add it easily though. I really couldn't be happier with this. I mean, come on, people are moaning about power usage on a flagship, no-compromise card.


----------



## 8472

__ https://twitter.com/i/web/status/1579945471849955328


----------



## yzonker

Is Nvidia going to sell all of these on their store? 



https://store.nvidia.com/en-us/geforce/store/?page=2&limit=9&locale=en-us&search=4090


----------



## dr/owned

yzonker said:


> Is Nvidia going to sell all of these on their store?
> 
> 
> 
> https://store.nvidia.com/en-us/geforce/store/?page=2&limit=9&locale=en-us&search=4090


Probably not. They usually just link to like Newegg and Best Buy and stuff product pages.


----------



## warbucks

RetroWave78 said:


> This is why EVGA dipped out, AIB's are being asked to compete with NV and there are basically no margins after procuring dies from NV. Notice how the AIB coolers are materially cheaper than reference? I.E. Asus Tuff, still going for $100 over reference. AIB's probably make $100 per GPU sold when all is said and done.
> 
> Nvidia has much higher profit margins even with the better designed cooler, R&D, overhead etc because they are buying the dies directly from TSMC etc.
> 
> EVGA used to justify markup because the cooler performed better than reference and the card had more overclocking headroom with higher TDP. This is no longer the case. We will quickly discover that Asus Strix doesn't run that much cooler than reference and ADL is not in need of more power. Asus will have to adjust the price to what the market will bear, meaning, no margins. Asus will probably dip out as well. This is what the death of an industry looks like.
> 
> There are other factors, EVGA may have bought a bunch of 30 series cards earlier this year and can't offload them and also, the step up and warranty program is open to abuse by crypto miners who just flogged their GPU willy nilly 24/7 on Nicehash with noob settings (i.e. 120% PT) "lik so cool dude".
> 
> Crypto mining killed the industry. This is what that looks like. Followed on by Greatest Depression with $2k 80 series cards = RIP discrete GPU PC gaming.


You're exaggerating quite a bit with this "death of the industry" narrative. Flagship GPUs never accounted for more than 10-15% of overall GPU sales; the vast majority came from the mid-range models. There's a lot more at play with EVGA leaving the GPU business than any of the antics by Nvidia. The complaints from AIBs about having to compete with Nvidia are valid, though.


----------



## dr/owned

8472 said:


> __ https://twitter.com/i/web/status/1579945471849955328












No... just... no. Everyone else is going to make $180 blocks or less, average probably $150, Bykski probably $110. EK got it in their head that because they yeet a bunch of free product to techtubers they're the ultimate premium brand. Reality is that their fin structure has been unchanged for 10 years (very coarse) and the only improvement is some end cap glued onto the edge of the block. Ask them about their plain clear Cryofuel gumming up blocks and how "premium" that experience is.


----------



## yzonker

dr/owned said:


> Probably not. They usually just link to like Newegg and Best Buy and stuff product pages.


Yea I saw that when I clicked buy on some existing cards.


----------



## ZealotKi11er

Oh wow, FE is 2099 CAD. No way I am buying an AIB card if the FE is this much cheaper. If I need better cooling I can just do a custom loop, since I already have one.


----------



## Slaughtahouse

dr/owned said:


> Businessses don't fit their own hardware, they buy a prebuild from Dell/HP that already has the card fitted with a blower cooler or whatever. (cause it has a warranty agreement attached to it where they'll swap the whole thing out and troubleshooting becomes the OEM problem, not the IT department)


That’s not true. Where I work, we buy workstations and upgrade them (GPUs), as the systems easily last quite a few years.

4090s don’t fit. And the supplied PSU can't sustain one…


----------



## Arizor

Getting nervous over here in Aus - just got off the phone with a store confirming they're up for sale tonight midnight (it's already the 12th here), but still haven't listed any of the cards or prices.

Pass the lube lads.


----------



## ZealotKi11er

Please, people, don't buy the Strix. The price is an insult. It's $660 CAD over the FE. No way ASUS needs that much to make money on the GPU.


----------



## dr/owned

ZealotKi11er said:


> Please people dont buy the STIX. The price is an insult. Its $660 CAD over FE. No way ASUS needs that much to make money on the GPU.


Asus is being real lame this time around. $200 price premium for a basic factory overclock, and I'll guarantee they're not going to stock any of the $1599 one:


----------



## coelacanth

The 4090 performance and features seem really good. It also seems like a great value for a lot of content creation workloads. After watching der8auer's video it seems that MSI Afterburner curve editor could work wonders for the card, keeping most of the performance but decreasing the power draw significantly. That said, I am just not interested in this card at $1,600+. I play single player games on a 4K 120Hz screen and the 3080 Ti does that just fine. I'm OK lowering a few settings that I don't even notice anyway for a good frame rate.

For me there's just no point in spending well over double what the 3080 Ti cost me for basically the same experience. Once this card (or Navi 31, assuming performance is good) drops below $1,000, I may start to look into buying one.


----------



## tubs2x4

ZealotKi11er said:


> Oh wow, FE is 2099 CAD. No way I am buying AIB if FE is this much cheaper. If I need better cooling I can just do custom Loop since I already have.


It'll be cheaper in 6 months.


----------



## Arizor

Yep just silly for this gen.

Plus the 'OC' version, as we all know, just ships a very mild OC BIOS, which we can flash onto the non-OC card. So whether or not you get a good chip is still a lottery, not that I think a 'good' versus 'bad' chip means much this gen.



dr/owned said:


> Asus is being real lame this time around. $200 price premium for a basic factory overclock, and I'll guarantee they're not going to stock any of the $1599 one:
> 
> View attachment 2575499


----------



## ZealotKi11er

dr/owned said:


> Asus is being real lame this time around. $200 price premium for a basic factory overclock, and I'll guarantee they're not going to stock any of the $1599 one:
> 
> View attachment 2575499


I think the OC model can do 600W and comes with the 600W connector. These are things reviewers need to check. It's like there's no thinking about us enthusiasts; they review a $1600 GPU like a $200 one.


----------



## sakete

ZealotKi11er said:


> I think the OC model can do 600w and comes with 600w connector. These are things reviewer need to check. It like there is no thinking about us enthusiast and review 1600 gpu like a 200 one.


Has that been confirmed anywhere? Or tomorrow when all the specs are released?


----------



## ZealotKi11er

sakete said:


> Has that been confirmed anywhere? Or tomorrow when all the specs are released?


I saw it with another model. Also, on the FE the power slider is locked if you use the 3x PCIe adapter vs the 4x PCIe one, which is probably the same for the TUF.


----------



## dr/owned

ZealotKi11er said:


> I think the OC model can do 600w and comes with 600w connector. These are things reviewer need to check. It like there is no thinking about us enthusiast and review 1600 gpu like a 200 one.


The 3090 TUF OC was functionally the exact same as the non-OC. They put a more restrictive power limit in the BIOS for one of the sub-rails like Aux being 50W instead of 75W but you could flash the OC BIOS and away you go. Maybe they won't include the "magic" _cough 2 extra ground pins_ adapter cable in the box for 600W but if you're going $1600 deep on a GPU what's another $30 for a cable.

The 45 mhz OC difference is as insulting as a dead relative leaving you $1 while your bro gets $1M.

But as said previously, it's not going to matter because they're only going to stock the OC as a $200 tax if you want an Asus card or can't get the FE and are dying for a 4090. (And the Strix is a hard pass... no one should ever buy that card)


----------



## Arizor

Yep, this generation the Strix perhaps makes the least sense it ever has. Grab the TUF, flash the OC BIOS, done. WCCF Tech were able to hit 3GHz at around 450W and got about a 9% uplift over stock in synthetic benchmarks.


----------



## sakete

dr/owned said:


> The 3090 TUF OC was functionally the exact same as the non-OC. They put a more restrictive power limit in the BIOS for one of the sub-rails like Aux being 50W instead of 75W but you could flash the OC BIOS and away you go. Maybe they won't include the "magic" _cough 2 extra ground pins_ adapter cable in the box for 600W but if you're going $1600 deep on a GPU what's another $30 for a cable.
> 
> The 45 mhz OC difference is as insulting as a dead relative leaving you $1 while your bro gets $1M.
> 
> But as said previous it's not going to matter because they're only going to stock the OC as a $200 tax if you want an Asus card or can't get the FE and are dying for a 4090. (And the Strix is a hard pass...no one should ever buy that card)


My local Micro Center has a product page up of what they're supposed to have in store tomorrow when they open, and that includes the TUF, TUF OC and Strix. I'll see what they actually have, as there's a disclaimer that actual stock may vary. I'm only really interested in the TUF, if after a night's sleep I still feel like I should get one. I wasn't even paying attention to all this until today 

But if I am still interested, I'll swing by right before they open to see what they have. Don't really care about OC potential too much, if anything I'll be undervolting these to reduce temps. Maybe I can bring temps below my 3080 at full load, while still getting much more performance, that would be perfect.


----------



## sakete

Also, anyone know at what time these are supposed to drop at online stores?


----------



## Arizor

sakete said:


> Also, anyone know at what time these are supposed to drop at online stores?


In Australia it's midnight tonight, which would make it (I think) 6am pacific off the top of my head.


----------



## sakete

Man, I can't stress how bummed I am that EVGA dropped out of the GPU business. Their customer service is second to none. Have had to do exchanges a couple times for other products and they make it as easy and stress free as can be.

Asus I've heard can be a real pain, though I've never had an Asus product fail on me before (I typically get Asus mobos). This would be my first Asus card if anything, because of the waterblock support.


----------



## yzonker

Arizor said:


> In Australia it's midnight tonight, which would make it (I think) 6am pacific off the top of my head.


Yea here in central time US, the Microcenter in KC is opening 2 hours early just tomorrow (8:00 am CST) for the launch. I suspect that coincides with all the releases and AIB reviews just like the FE reviews coming today at 8:00 CST.


----------



## yzonker

Just saw this too. 


__
https://www.reddit.com/r/nvidia/comments/y1pxn9


----------



## Rei86

Holy crap the card isn't even out and this topic has surpassed the RTX 3090Ti Thread.

Subbed to see what you guys do with it.


----------



## dr/owned

I don’t get what the problem is launching at 9AM PST. 6AM is brutal cause that means getting up at 5:30 to have tabs and whatnot open. Like the east coast would not care at all 9am vs 12pm.

ugh I’m probably going to have to stay awake all night cause I sure as hell can’t wake up after 2 hours sleep.


----------



## Arizor

dr/owned said:


> I don’t get what the problem is launching at 9AM PST. 6AM is brutal cause that means getting up at 5:30 to have tabs and whatnot open. Like the east coast would not care at all 9am vs 12pm.
> 
> ugh I’m probably going to have to stay awake all night cause I sure as hell can’t wake up after 2 hours sleep.


Yeah it's a weird one, though not as bad as waiting up to midnight to start hammering that F5 key across various Aussie sites, knowing I'm going to be woken up at 6am regardless by my 6 year old


----------



## sakete

yzonker said:


> Yea here in central time US, the Microcenter in KC is opening 2 hours early just tomorrow (8:00 am CST) for the launch. I suspect that coincides with all the releases and AIB reviews just like the FE reviews coming today at 8:00 CST.


Hmm, the person I chatted with on the Microcenter site had said 9am local time, but I guess she meant 9am EST, as that explains why it's 7am local time. Yeah, I'm not going to make it to that  Will see if anything is available online if I still want it. And if not, well, too bad.


----------



## dr/owned

Rei86 said:


> Holy crap the card isn't even out and this topic has surpassed the RTX 3090Ti Thread.
> 
> Subbed to see what you guys do with it.


Cause the 3090Ti was the dumbest purchase anyone could possibly make  a) It was obvious we were 6 months away from the 4000 series. b) It was $2200 at launch. c) 3% performance boost over a 3090. d) everyone who would buy it already had a 3090.

It's some WSB level of "special" if you bought a version like the 3090Ti KPE. Or you hopefully have so much money that lighting $1500 on fire is like losing a nickel down the sewer. These people do exist...they'll pay someone to replace all the furniture in their house cause they get bored with it.


----------



## yzonker

sakete said:


> Hmm, the person I chatted with on the Microcenter site had said 9am local time, but I guess she meant 9am EST, as that explains why it's 7am local time. Yeah, I'm not going to make it to that  Will see if anything is available online if I still want it. And if not, well, too bad.


I just saw it on their website for the KC store.


----------



## dante`afk

To me, OC and AIB models seem basically dead this gen; it's all about finding that PT sweet spot for fps.


also, I don't think we'll see any shortages (fingers crossed)  and everyone will be able to get a card


----------



## changboy

I have several 3090s but never tried SLI. Do you think 3090 SLI makes buying a 4090 not worth it?

The NVLink bridge alone is $300, and I don't have one...


----------



## ZealotKi11er

dante`afk said:


> to me, OC and and AiB seems literally dead with this gen, its all about finding that PT sweetspot/fps
> 
> 
> also, I don't think we'll see any shortages (fingers crossed)  and everyone will be able to get a card


Most likely not. Day 1 or the first week it could be, but it should be normal after that. The 3090 was very easy to get, and the 3080 I was able to pre-order 2 weeks after launch and got it 2 months later. That was the worst case.


----------



## Arizor

Just talked to an Aussie store, and he reckons we've got _tonnes_ of stock compared to the 3-series. He said he doesn't think they'll run out for at least the whole first week, and even then more is on the way.

So I can only imagine North America is even better stocked.


----------



## changboy

From that test, there's not much difference between 3090 SLI and the 4090, but since not every game supports SLI, it can fall flat; not sure if I'd be better off just buying an NVLink bridge. I need a 3-slot one.

Can't post the YouTube link.


----------



## dr/owned

changboy said:


> I have many 3090 but never tried SLI, do you think sli of 3090 not worth buying a 4090 ?
> 
> Just the nvlink its 300$ and i dont have....


SLI always had issues, and few games support the DX12 multi-GPU path because it was a chicken-and-egg problem: almost no one had dual 3090s, so there was no point in adding support for it. You'd be throwing $300 at a dead end.


----------



## dr/owned

Arizor said:


> Just talked to an Aussie store, and he reckons we've got _tonnes_ of stock compared to the 3-series. He said he doesn't think they'll run out for at least the whole first week, and even then more is on the way.
> 
> So I can only imagine North America is even better stocked.


How was Aussie stock historically against US? Table scraps?


----------



## changboy

Hmm, yeah, that's why I haven't bought that bridge yet. It seems decent for 3 or 4 games, but that's it.


----------



## changboy

Check that crazy Canada scalper price : 













www.newegg.ca


----------



## Mad Pistol

changboy said:


> Check that crazy Canada scalper price :
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> www.newegg.ca


Eh?


----------



## changboy

Lol, the card is gone. Maybe he sold it at $4,699; some people are crazy.


----------



## Arizor

dr/owned said:


> How was Aussie stock historically against US? Table scraps?


Table scraps is kind; we never even get the FE version. Historically, very limited stock of AIBs for the first few weeks too.


----------



## dr/owned

changboy said:


> Lol the card is gone, maybe he sold it at 4699$, some are crazy


Retailers remove 3rd party listings all the time. Pre release product....absurd price...probably high chance of fraud or a drop shipper and they don't want that on their platform.


----------



## changboy

dr/owned said:


> Retailers remove 3rd party listings all the time. Pre release product....absurd price...probably high chance of fraud or a drop shipper and they don't want that on their platform.


Yeah, but when you check those ads they say delivery in 30 days; it's bad. Kind of a scam.


----------



## LunaP

SOON!~!!!

Is anyone going w/ the strix vs the FE version due to the dual hdmi 2.1 ports? And do you think that might cause any issues or nah? I'm thinking of going the strix cuz blocks are launching first for them w/ Optimus, but was originally going w/ the FE since Bestbuy near me would have it, kinda conflicted ( putting a waterblock on it so )


----------



## Arizor

LunaP said:


> SOON!~!!!
> 
> Is anyone going w/ the strix vs the FE version due to the dual hdmi 2.1 ports? And do you think that might cause any issues or nah? I'm thinking of going the strix cuz blocks are launching first for them w/ Optimus, but was originally going w/ the FE since Bestbuy near me would have it, kinda conflicted ( putting a waterblock on it so )


I'm just going TUF for the same reasons (plus no FE in Oz) - TUF has 2 x HDMI and is compatible with the Optimus block.


----------



## dr/owned

LunaP said:


> SOON!~!!!
> 
> Is anyone going w/ the strix vs the FE version due to the dual hdmi 2.1 ports? And do you think that might cause any issues or nah? I'm thinking of going the strix cuz blocks are launching first for them w/ Optimus, but was originally going w/ the FE since Bestbuy near me would have it, kinda conflicted ( putting a waterblock on it so )


Optimus has never been first to deliver anything. They'll talk it up like they're gonna ship next week, and the next post you'll see is "looking for a customer card to confirm fitment" because they don't get the CAD files from the AIB mfgs. So it'll take 2 weeks minimum for them to even get a card in hand, and then they have to manufacture a prototype. They didn't make a 3090Ti block, so they're starting further behind than, say, Alphacool, who only need to rout out an extra 1cm on the notch for the moved power connector on their 3090Ti block. Then they'll screw up production in some way anyway, and you'll see your block in January 2023, and it'll do the same as a $120 block.

I swear I do have an irrational hatred of Optimus. It took them like a week and getting publicly called out by me before they're like "oh yeah that reservoir that we say is "in stock" actually isn't". If they actually ran themselves like a top tier boutique company I'd probably be like "yeah eff it...$400 for a block why not" but they're a hobby shop / side hustle probably-one-man-show.

HDMI...meh you can passive adapt DP to HDMI. The Strix is a stupid card to buy if you're waterblocking....the cheapest base model is where it's at or FE.


----------



## Shaded War

Any guesses for when we can add to cart? Looking for FE on bestbuy or possibly something from Asus on amazon.




LunaP said:


> SOON!~!!!
> 
> Is anyone going w/ the strix vs the FE version due to the dual hdmi 2.1 ports? And do you think that might cause any issues or nah? I'm thinking of going the strix cuz blocks are launching first for them w/ Optimus, but was originally going w/ the FE since Bestbuy near me would have it, kinda conflicted ( putting a waterblock on it so )


I could see going TUF for the extra HDMI port. The Strix doesn't seem to offer much for the $400 premium besides more RGB crap that you're removing for a water block anyway.


----------



## dante`afk

LunaP said:


> SOON!~!!!
> 
> Is anyone going w/ the strix vs the FE version due to the dual hdmi 2.1 ports? And do you think that might cause any issues or nah? I'm thinking of going the strix cuz blocks are launching first for them w/ Optimus, but was originally going w/ the FE since Bestbuy near me would have it, kinda conflicted ( putting a waterblock on it so )


Cooling is not an issue here; it doesn't make sense to pay $400+ for Optimus blocks that you won't see for the next 5 weeks anyway.

Bykski/Bitspower will give equal performance for 1/4 the price.


----------



## sakete

dr/owned said:


> Optimus has never been first to deliver anything. They'll talk it up like they're gonna ship next week and the next post you'll see is "looking for a customer card to confirm fitment" because they don't get the CAD files from the AIB mfg's. So it'll take 2 weeks minimum for them to even get a card in hand and then they have to manufacture a prototype. They didn't make a 3090Ti block so they're starting further behind than say....Alphacool who only need to router out an extra 1cm on the notch for the moved power connector on their 3090Ti block. Then they'll screw up production in some way anyways and see your block in January 2023 and it'll do the same as a $120 block.
> 
> I swear I do have an irrational hatred of Optimus. It took them like a week and getting publicly called out by me before they're like "oh yeah that reservoir that we say is "in stock" actually isn't". If they actually ran themselves like a top tier boutique company I'd probably be like "yeah eff it...$400 for a block why not" but they're a hobby shop / side hustle probably-one-man-show.
> 
> HDMI...meh you can passive adapt DP to HDMI. The Strix is a stupid card to buy if you're waterblocking....the cheapest base model is where it's at or FE.


I remember pre-ordering my current block for my 3080 from them. It was a real hassle, as the timelines they kept giving were way too optimistic. Took forever. But it was also their first time making GPU blocks, and supply chains were extremely screwed back then.

Hope they've improved. They're at least no longer taking pre-orders.

But not sure yet what block to get. I'm not looking for the absolute best performance. EK has their blocks ready to go it seems like. And then there is Watercool, but so far they're only making blocks for Strix and FE. But I wonder if their Strix block will also fit the TUF, as Optimus claims their block is for both Strix and FE.


----------



## Malinkadink

Shaded War said:


> Any guesses for when we can add to cart? Looking for FE on bestbuy or possibly something from Asus on amazon.
> 
> 
> 
> 
> I could see going TUF for the extra HDMI port. The Strix doesn't seem to offer much for the $400 premium besides more RGB crap that you're removing for a water block anyway.


Asus was making a big deal about the cooler on the Strix. I'm not sure how different it actually is vs. the TUF; aside from that, it will be better binned, so you may get like 50 MHz higher clocks, which is definitely not worth the $400 premium over an MSRP TUF. I know the 30-series TUF models cooled quite well, so I'll be getting one of those; the extra HDMI port is a bonus.


----------



## Cobra26

sakete said:


> Man, I can't stress how bummed I am that EVGA dropped out of the GPU business. Their customer service is second to none. Have had to do exchanges a couple times for other products and they make it as easy and stress free as can be.
> 
> Asus I've heard can be a real pain, though I've never had an Asus product fail on me before (I typically get Asus mobos). This would be my first Asus card if anything, because of the waterblock support.


"This would be my first Asus card if anything, because of the waterblock support..."

Wait does Asus support putting a waterblock on their cards without voiding warranty?


----------



## dr/owned

Cobra26 said:


> "This would be my first Asus card if anything, because of the waterblock support..."
> 
> Wait does Asus support putting a waterblock on their cards without voiding warranty?


In the US they don't have a choice. "warranty void if removed" are meaningless.

For the record though I've only had one GPU fry and that was a Titan X on first power up....inductor exploded.


----------



## Arizor

Yep I think TUF is the clearest value proposition this gen (if we can talk of "value" once we've hit $1600...), though reviews tomorrow will probably confirm.


----------



## sakete

Cobra26 said:


> "This would be my first Asus card if anything, because of the waterblock support..."
> 
> Wait does Asus support putting a waterblock on their cards without voiding warranty?


I meant that most manufacturers are making blocks geared toward the Asus cards.


----------



## sakete

dr/owned said:


> In the US they don't have a choice. "warranty void if removed" are meaningless.
> 
> For the record though I've only had one GPU fry and that was a Titan X on first power up....inductor exploded.


Is that true? So they'll have to honor warranty either way? Or only if you take them to court, or they'd still try to screw you?


----------



## sakete

So what are good blocks to get? I know of Optimus and I have one now - other than the whole pre-ordering process having been super messy, it's been problem free for me otherwise. They are very expensive though, probably overpriced for the performance. 

There is obviously EK and Heatkiller (Watercool). EK you can apparently order one right now and it'll ship. But don't know how good they are, and how good their nickel quality is. Heatkiller there is no ETA, but I trust their quality.

What else is worth considering that's good quality and good performance?


----------



## Shaded War

Malinkadink said:


> Asus was making a big deal about the cooler on the strix, I'm not sure how different it actually is vs TUF aside from that it will be better binned so you may get like 50Mhz higher clocks definitely not worth the $400 premium over an MSRP TUF. I know the 30 series TUF models cooled quite well so I'll be getting one of those the extra hdmi port is a bonus.


I'm not sure how much better the cooling is going to be, or if it even matters. I saw an FE review that mentioned mid-60s (°C) while gaming on the stock fan curve, which leaves plenty of room for overclocking. It will be interesting to see more detailed reviews, including the custom AIB coolers.


----------



## dr/owned

sakete said:


> Is that true? So they'll have to honor warranty either way? Or only if you take them to court, or they'd still try to screw you?











We’re Afraid of Warranty Stickers, but Really, Manufacturers Should Be | iFixit News

According to a survey published today by consumer group U.S. PIRG, 45 out of 50 appliance manufacturers automatically void the warranty of a device if it has…
www.ifixit.com





But TLDR it's like everything else where if they want to fold their arms and say "nope" you have to take them to small claims so:

(speaking US only) a lot of credit cards come built-in with a 1 year warranty minimum type deal where if the manufacturer screws you and says "nope we're not honoring it" you can claim with the credit card and they'll pay out. But there's $ limits sometimes so read your card policy.

If it's within 6 months I would pressure the retailer to either take a return if the mfg. is denying warranty or hit em with a chargeback.

Or just box swap a new order and let the retailer / distributor deal with getting a credit from the AIB mfg that way. Few retailers track even serial numbered products and even fewer do anything if there is a mismatch. I've only personally seen it happen on Apple stuff.


----------



## dr/owned

sakete said:


> So what are good blocks to get? I know of Optimus and I have one now - other than the whole pre-ordering process having been super messy, it's been problem free for me otherwise. They are very expensive though, probably overpriced for the performance.
> 
> There is obviously EK and Heatkiller (Watercool). EK you can apparently order one right now and it'll ship. But don't know how good they are, and how good their nickel quality is. Heatkiller there is no ETA, but I trust their quality.
> 
> What else is worth considering that's good quality and good performance?


Watercool, Alphacool, Aquacomputer, Barrow, Bykski (warning on their nickel plating quality, but at least it's cheap), Corsair (usually a rebrand though), Phanteks, Bitspower.

EK is hard off my list because they're scummy as a company and overpriced for what you get. The first 3 I listed are made in Germany...and they do it cheaper than EK making theirs in friggin' Slovakia.

All waterblocks are more or less similar performance because it's direct-die. More important to manually measure out the thermal pad thickness and use your own IMO. Whichever one I get I'll post the board measurements and what pads to use....

Bonus is right now the euro is totally boned and has 1:1 parity with the dollar if you order direct from europe. And the yen too while we're at it is screwed so order your JDM anime stuff now


----------



## mirkendargen

dr/owned said:


> We’re Afraid of Warranty Stickers, but Really, Manufacturers Should Be | iFixit News
> 
> 
> According to a survey published today by consumer group U.S. PIRG, 45 out of 50 appliance manufacturers automatically void the warranty of a device if it has…
> 
> 
> 
> 
> www.ifixit.com
> 
> 
> 
> 
> 
> But TLDR it's like everything else where if they want to fold their arms and say "nope" you have to take them to small claims so:
> 
> (speaking US only) a lot of credit cards come built-in with a 1 year warranty minimum type deal where if the manufacturer screws you and says "nope we're not honoring it" you can claim with the credit card and they'll pay out. But there's $ limits sometimes so read your card policy.
> 
> If it's within 6 months I would pressure the retailer to either take a return if the mfg. is denying warranty or hit em with a chargeback.
> 
> Or just box swap a new order and let the retailer / distributor deal with getting a credit from the AIB mfg that way. Few retailers track even serial numbered products and even fewer do anything if there is a mismatch. I've only personally seen it happen on Apple stuff.


The good ol' Amazon warranty.


----------



## ZealotKi11er

Don’t bother with Bykski. I had them for both the Radeon VII and 6900 XT. The nickel plating failed on both, the mounting hardware is garbage, and performance is much worse than Heatkiller.


----------



## RetroWave78

yzonker said:


> I bet it's not even 10% unless you add core/mem offsets too. And I'm not sure where 700w came from. No such thing. There's a lot wrong in that statement.


4090 FE drew 666.6W in Gamer's Nexus review @ 33% PT increase:






Guru3D doesn't show the power draw increase in their review but increasing PT by 33% and 40mv yielded 5% improvement in performance:









GeForce RTX 4090 Founders edition review
www.guru3d.com





So an additional 5% of performance required an additional 200 W of power (a 33% PT increase).

I predicted this over a week ago.

Derbauer had the most relevant review video of all the ones I viewed this morning showing precisely where Lovelace is the most power efficient (50% PT) and how dropping PT down to 90% resulted in no noticeable loss in performance.

NV clocked Lovelace aggressively, because as I stated in my post before last "If I were to surmise as to why they did this it would be because ultimately the metric, with RDNA3 right around the corner, is still _which is the fastest graphics card in the world _not _which is the most power efficient graphics card in the world." _

This tells me that Nvidia is really actually threatened by AMD this time around. Unlike Nvidia who is saddled with 30 series overstock, AMD can price their products much lower. I'm hearing that their 4090 equivalent in their product stack will be $1199 and that their $1500 card will actually be some 20% faster in rasterization.

Think about it: Nvidia has no reason to release a card right now; they're trying to beat AMD to market. And not only that, they literally wrung the snot out of AD102, pushing it well beyond its power/efficiency sweet spot, to top it off.

If anyone reading this does not actually need a faster GPU, is sitting with a 4K display or Pimax 8KX or triple monitor set-up I absolutely recommend sitting tight and waiting to see what AMD will offer up in November.

Additionally, if you think NV doesn't have the 4090 Ti and 4080 Ti ready to go, right now, you're not paying attention. Not only will NV drop those in response to AMD's line-up, but NV can and may, with the stroke of a pen, rename the "4080 12 GB" to the 4070 and drop the price to $600 in response to an AMD card as fast as an RTX 3090 / AD-106 for $600, "with touted exclusive DLSS 3.0 features".

If you want full die AD-102, I would wait.

If you don't need a faster GPU right now, I would wait.

If you are incensed with NV shenanigans, creating the very situation that they are once again profiting from (handing 30 series cards over to crypto miners hand over fist), I would wait.

If you don't like where the PC component industry has headed ($2000 80 series cards), and you understand that we're here because we haven't supported the competition, I would wait.


----------



## dr/owned

RetroWave78 said:


> ...


You're not in the right thread for this....

This is the generic "complain about price / performance / whatever" thread: 4090 - 1599$ 4080 16G - 1199$ 4080 - 12G 899$


----------



## RetroWave78

dr/owned said:


> You're not in the right thread for this....
> 
> This is the generic "complain about price / performance / whatever" thread: 4090 - 1599$ 4080 16G - 1199$ 4080 - 12G 899$


Oh, I intend to overclock. I'm running 2 GHz @ 0.937v with an undervolted frequency curve on my 3090 FE. Bringing the voltage down means it only dips under 2 GHz briefly, to 1965 MHz, in Time Spy GT2, whereas cards on air at +30C on the core require more voltage and another 100 W to sustain the same performance.

I'm all about overclocking, _intelligently _overclocking.

Running 670w for 5% performance gain, bro you're dumb.
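The perf-per-watt argument is easy to put in numbers. A quick sketch using the figures quoted earlier in the thread (450 W stock power limit vs. the ~666 W Gamer's Nexus measured at +33% PT for roughly 5% more performance); the wattages are the reviewers' measurements, and the normalization of performance to 100 is my own assumption:

```python
# Perf-per-watt comparison: stock 4090 vs. maxed power limit.
# Performance is normalized so stock = 100; wattages are the review figures.
def perf_per_watt(perf: float, watts: float) -> float:
    return perf / watts

stock = perf_per_watt(100.0, 450.0)  # stock power limit
maxed = perf_per_watt(105.0, 666.0)  # +33% PT, ~5% more performance
loss_pct = (1 - maxed / stock) * 100

print(f"stock: {stock:.3f} perf/W, maxed: {maxed:.3f} perf/W")
print(f"efficiency lost by maxing the power limit: {loss_pct:.0f}%")
```

With these inputs, the maxed card delivers about 29% less performance per watt, which is exactly the "intelligent overclocking" point being made.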


----------



## mirkendargen

dr/owned said:


> You're not in the right thread for this....
> 
> This is the generic "complain about price / performance / whatever" thread: 4090 - 1599$ 4080 16G - 1199$ 4080 - 12G 899$


For real. Maybe I need glasses but it really looks like this is the (currently prospective) 4090 owners thread, not the convince people not to be 4090 owners thread.


----------



## Arizor

For what it's worth, Optimum Tech overclocked using 460-ish watts and achieved about 9-10% more performance.


----------



## dr/owned

mirkendargen said:


> For real. Maybe I need glasses but it really looks like this is the (currently prospective) 4090 owners thread, not the convince people not to be 4090 owners thread.


Money can't buy happiness but it can buy you not having to think about whether you should spend less money for less performance....and that makes me happy.

It's going to be a doo-doo-show in the threads figuring out 4070 vs. AMD vs. 4080-12


----------



## LunaP

dr/owned said:


> Optimus has never been first to deliver anything. They'll talk it up like they're gonna ship next week and the next post you'll see is "looking for a customer card to confirm fitment" because they don't get the CAD files from the AIB mfg's. So it'll take 2 weeks minimum for them to even get a card in hand and then they have to manufacture a prototype. They didn't make a 3090Ti block so they're starting further behind than say....Alphacool who only need to router out an extra 1cm on the notch for the moved power connector on their 3090Ti block. Then they'll screw up production in some way anyways and see your block in January 2023 and it'll do the same as a $120 block.
> 
> I swear I do have an irrational hatred of Optimus. It took them like a week and getting publicly called out by me before they're like "oh yeah that reservoir that we say is "in stock" actually isn't". If they actually ran themselves like a top tier boutique company I'd probably be like "yeah eff it...$400 for a block why not" but they're a hobby shop / side hustle probably-one-man-show.
> 
> HDMI...meh you can passive adapt DP to HDMI. The Strix is a stupid card to buy if you're waterblocking....the cheapest base model is where it's at or FE.


Omg, I'm sorry, I feel like I opened a wound there; I wasn't aware (I've been in situations like that, so I feel you). I have their CPU block on my 10980XE and love them for it. I heard Alphacool had issues with the 3000 series, but that was due to the manufacturer giving them the wrong specs or something, IIRC?



Shaded War said:


> I could see going TUF for the extra HDMI port. The Strix doesn't seem to offer much for the $400 premium besides more RGB crap that you're removing for a water block anyway.





dante`afk said:


> Cooling is not an issue here; it doesn't make sense to pay $400+ for Optimus blocks that you won't see for the next 5 weeks anyway.
> 
> Bykski/Bitspower will give equal performance for 1/4 the price.


Wait, it's $400+ over the FE? Yeah, that's not worth it for the extra port we probably can't even use (unless it allows 5 outputs vs. 4).

As for my loop, it's copper, so I'm trying to stick to that without mixing too many metals; I guess Alphacool would be the go-to then, since they offer chromium. Haven't seen any others yet.



dr/owned said:


> Watercool, Alphacool, Aquacomputer, Barrow, Bykski (warning on their nickel plating quality but at least it's cheap), Corsair (usually a rebrand though) Phanteks, Bitspower.
> 
> EK is hard off my list because they're scummy as a company and overpriced for what you do get. The first 3 I listed are made in Germany...and they do it cheaper than EK making it it friggin' Slovakia.
> 
> All waterblocks are more or less similar performance because it's direct-die. More important to manually measure out the thermal pad thickness and use your own IMO. Whichever one I get I'll post the board measurements and what pads to use....
> 
> Bonus is right now the euro is totally boned and has 1:1 parity with the dollar if you order direct from europe. And the yen too while we're at it is screwed so order your JDM anime stuff now


Yeah, I've been going nuts with Booth and other VRC/tech stuff because of this lol. As for Aquacomputer etc., do the others have copper/chromium options? And do we know which are hitting in the week of launch?

Tomorrow is preorder, right? And ship day is when again?

Coming from 2x 2080 Tis, so I'm either going to keep one for the spare monitors to save money on buying a 3060 for them, or I'm unsure at the moment... since adding a USB -> DP/HDMI monitor just kills Windows refresh due to MS screwing up WDDM in 2019... hopefully that doesn't happen with a secondary GPU.


----------



## RetroWave78




----------



## J7SC

4090 FE actually seems efficient in most regular apps from what I have read and seen today - unless the PL slider is cranked (for benching, mostly, I assume).







🥴

I guess we find out tomorrow what the partner models bring to the table (apart from RGB ) and draw from the wall socket.

EDIT: FYI, checking 'details' in the > 3DM HoF submissions such as PortRoyal, it looks like plenty of AIB partner cards (per vbios) running in the top 10 already...MSI is well represented


----------



## Malinkadink

Shaded War said:


> I'm not sure how much better the cooling is going to be, or if it even matters. I saw a FE review where they mentioned mid 60's while gaming on stock fan curve which leaves plenty of room for overclocking. It will be interesting to see more detailed reviews including the custom AIB coolers.


Keep in mind that's open-bench numbers, so best-case scenario; inside most people's cases it will likely be closer to 75C. GN's tests showed fans at 45% under maximum load on an open bench; to maintain good temps in a case, you may need to run the fans at 60%. I have no idea what the card's noise levels are like. I have a 3080 FTW3 Ultra at 1.9 GHz undervolted with 60% fans at all times, and it blends in with all the other case fans, which are also set to not be intrusive to my ears.

I think the FE this time around will be very comparable to AIB coolers, still need to wait for that embargo to lift to see those reviews.


----------



## menko2

I play at 4K on a 120 Hz monitor.

If I keep my 10900K, how much will it limit the 4090, given the CPU only supports PCIe 3.0 while the card is PCIe 4.0?

I could upgrade to the 11900K because I have a Z590 mobo, but I'd prefer to skip it if possible.
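For rough context on the interface question, the theoretical x16 link bandwidths can be computed from the per-lane signaling rates in the PCIe spec (8 GT/s for gen 3, 16 GT/s for gen 4, both using 128b/130b encoding). This is just the spec math; how much actual frame rate a 4090 loses in a gen 3 slot is a separate question the reviews will have to answer:

```python
# Theoretical PCIe x16 bandwidth from the per-lane signaling rate.
# Gen 3 and gen 4 both use 128b/130b line coding, so usable bits are 128/130.
def pcie_x16_bandwidth_gbs(gigatransfers_per_s: float) -> float:
    lanes = 16
    encoding = 128 / 130          # usable fraction after 128b/130b coding
    bits_per_byte = 8
    return gigatransfers_per_s * encoding * lanes / bits_per_byte

gen3 = pcie_x16_bandwidth_gbs(8.0)   # PCIe 3.0: 8 GT/s per lane
gen4 = pcie_x16_bandwidth_gbs(16.0)  # PCIe 4.0: 16 GT/s per lane
print(f"PCIe 3.0 x16: {gen3:.2f} GB/s")
print(f"PCIe 4.0 x16: {gen4:.2f} GB/s")
```

So a 10900K's gen 3 slot offers roughly 15.8 GB/s each direction versus ~31.5 GB/s on gen 4. Historically, the measured hit at 4K has been small because the card is rarely bus-bandwidth-bound there, but that's an expectation, not a 4090 measurement.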


----------



## Luda

Impossible to say, but holding out might be prudent with the 13900K coming. I'm in the same 10900K boat, though, and have an 11900K and some RAM picked out in case I find a bottleneck in my own personal usage.


----------



## Nizzen

ZealotKi11er said:


> Don’t bother with Bykski. Had for both Radeon 7 and 6900 XT. Both nickel plating failed, mounting hardware is garbage and vs HK much worse performance.


I had 3x EK blocks whose nickel plating failed... Still buying EK and Bykski. Bykski because they're almost always first to market, which means less waiting to overclock stuff on water.
PS: I preordered the EK Strix 4090 block, so I guess we'll have to wait and see when it's coming.


----------



## menko2

Luda said:


> Impossible to say but holding out might be prudent with 13900k coming, but im in the same 10900k boat and have a 11900k and some ram picked out incase i find a bottleneck in my own personal usage.


It would be a shame because my 10900K is great and already overclocked and memory-tweaked. The 11900K's temps and power draw are not worth upgrading for, and I definitely don't want to change motherboards.

It's impossible to know until someone tries the 4090 with a 10900K, I guess.


----------



## dr/owned

Nizzen said:


> I had 3x EK that nickel plating failed.... Still buying EK and Bykski. Bykski because it's almost allways first to the marked. That means less waiting overclocking stuff on water.
> PS: I preordered EK strix 4090 block, so I guess we have to wait and see when it's coming


If I get an FE in 5 hours, I'll probably just order a 3090Ti block from Alphacool and mod it. The only thing that needs fixing is the power connector cutout, and that's just plain acrylic. (EDIT: actually no, the removal of NVLink looks to have moved one screw hole a tiny bit at the top edge of the PCB...womp womp)


----------



## ArcticZero

Really wish the FE was available in more countries, as it really seems like the way to go this gen. The cooler handles it quite well, and it's a comparably "reasonable" size. I know I can ship it from overseas, but the import tax is a killer, especially at this card's price.


----------



## domdtxdissar

RetroWave78 said:


> 4090 FE drew 666.6W in Gamer's Nexus review @ 33% PT increase:
> 
> 
> 
> 
> 
> 
> Guru3D doesn't show the power draw increase in their review but increasing PT by 33% and 40mv yielded 5% improvement in performance:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GeForce RTX 4090 Founders edition review
> 
> 
> NVIDIA unveiled their much-anticipated Series 4000 graphics cards, beginning with the GeForce RTX 4090 founder edition. The new graphics card provides greater performance, more extensive raytracing fe... Overclocking
> 
> 
> 
> 
> www.guru3d.com
> 
> 
> 
> 
> 
> So an additional 5% performance required an additional 200w of power, 33% PT.
> 
> I predicted this over a week ago.
> 
> Derbauer had the most relevant review video of all the ones I viewed this morning showing precisely where Lovelace is the most power efficient (50% PT) and how dropping PT down to 90% resulted in no noticeable loss in performance.
> 
> NV clocked Lovelace aggressively, because as I stated in my post before last "If I were to surmise as to why they did this it would be because ultimately the metric, with RDNA3 right around the corner, is still _which is the fastest graphics card in the world _not _which is the most power efficient graphics card in the world." _
> 
> This tells me that Nvidia is really actually threatened by AMD this time around. Unlike Nvidia who is saddled with 30 series overstock, AMD can price their products much lower. I'm hearing that their 4090 equivalent in their product stack will be $1199 and that their $1500 card will actually be some 20% faster in rasterization.
> 
> Think about it, Nvidia has no reason to release a card right now; they are trying to beat AMD to the market, and not only that, they literally wrung the snot out of AD102, well beyond its power/efficiency sweet spot, to top it off.
> 
> If anyone reading this does not actually need a faster GPU, is sitting with a 4K display or Pimax 8KX or triple monitor set-up I absolutely recommend sitting tight and waiting to see what AMD will offer up in November.
> 
> Additionally, if you think that NV doesn't have the 4090 Ti and 4080 Ti ready to go, right now, you're not paying attention. Not only will NV drop those in response to AMD's line-up, but NV can and may, at the wave of a pen, change the nomenclature of the "4080 12 GB" to the 4070 and drop the price down to $600 in response to an AMD card as fast as an RTX 3090 / AD106 for $600, "with touted exclusive DLSS 3.0" features.
> 
> If you want full die AD-102, I would wait.
> 
> If you don't need a faster GPU right now, I would wait.
> 
> If you are incensed with NV shenanigans, creating the very situation that they are once again profiting from (handing 30 series cards over to crypto miners hand over fist), I would wait.
> 
> If you don't like where the PC component industry has headed ($2000 80 series cards), and you understand that we're here because we haven't supported the competition, I would wait.


The best dies are already being saved for a 2.75GHz boost (2.95 typ. gaming), 475W, 18176 CUDA cores, 96MB L2 cache, 24GB beast (to *try* to compete with RDNA3 in raster)
Performance 10~20% above the 4090

__ https://twitter.com/i/web/status/1580024711072739329


----------



## Nizzen

domdtxdissar said:


> The best dies are already being saved for a 2.75GHz boost (2.95 typ. gaming), 475W, 18176 CUDA cores, 96MB L2 cache, 24GB beast (to *try* to compete with RDNA3 in raster)
> Performance 10~20% above the 4090
> 
> __ https://twitter.com/i/web/status/1580024711072739329


Rumors about AMD GPUs are as bad as their drivers


----------



## owikh84

Nizzen said:


> I had *3x EK blocks where the nickel plating failed*.... Still buying EK and Bykski. Bykski because it's almost always first to market. That means less waiting to overclock stuff on water.
> PS: I preordered the EK Strix 4090 block, so I guess we have to wait and see when it's coming


I used to have nickel failure issues with EK nickel blocks, until I switched to Corsair XL5 clear coolant (made by Mayhems); so far so good.


----------



## Glottis

ArcticZero said:


> Really wish FE was available in more countries as it really seems like the way to go this gen. The cooler handles it quite well and is a comparably "reasonable" size. I know I can ship it from overseas but the import tax is a killer especially at this card's value.


Yeah, people keep talking about how Nvidia wants to get rid of AIBs and only sell FE cards, but they will never be able to do it if they only sell the FE in like 5 countries, exclusively on their website.


----------



## ArcticZero

owikh84 said:


> I used to have nickel failure issues with EK nickel blocks, until I switched to Corsair XL5 clear coolant (made by Mayhems); so far so good.


EK 3090 main block (nickel + plexi) had failed nickel plating on Mayhems X1 Clear for me, which has been perfectly fine with everything else I have used so far. It is hit or miss with EK I suppose since the active backplate's plating is still pristine.

But the rate of hit or miss at their exorbitant prices is something I don't want to take a chance on again.


----------



## dr/owned

ArcticZero said:


> EK 3090 main block (nickel + plexi) had failed nickel plating on Mayhems X1 Clear for me, which has been perfectly fine with everything else I have used so far. It is hit or miss with EK I suppose since the active backplate's plating is still pristine.
> 
> But the rate of hit or miss at their exorbitant prices is something I don't want to take a chance on again.


IMO just sell me a copper block. Easy to restore to pristine with a dunk in phosphoric acid, and there's no coating to wear or flake. 10 years ago I used to think "oh, I'll get the nickel version, that should be more bulletproof," but it just seems hit and miss. My Bykski 1080 Ti has been retired and the plating still looks perfectly fine. Then the 3090 TUF took a couple of years to fade to copper. Now on the 3090 FTW3 I can "see" the flow path after 5 months because it's worn down (I'll take a picture of it when I pull the card out for the 4090).



> The amount of phosphorous in the plating can impact its attributes. Low-phosphorus plating provides the hardest coating, while medium is softer but plates the fastest. High-phosphorus plating is the softest but is the best for corrosion protection. Baking all coatings can increase hardness, but this will reduce corrosion protection.


I found this bit fascinating because I can see a "cheap" brand like Bykski trying to push the cycle time down by saying "give us the fastest version of the process," and then you end up with wear. Or who knows, maybe their supplier decided to see if they'd notice an unrequested change in the process to pad their own profit. Because in China you have to watch for this sort of thing... they treat it like a national sport to screw you and see if you notice: "if they don't notice, then I guess it's their own fault for being less savvy than me!"

EK has just been bad forever with platings. And they never own up to it... they were talking crap about Mayhems X1 being the problem when, at the time a few years ago, Cryofuel was rebadged X1. I had a "gold plated" Supremacy block several years ago where the gold was so thin or porous that it absorbed liquid metal... that shouldn't be a thing with gold. And then the gold plating peeled off like a film from the nickel boundary layer on the copper.


----------



## domdtxdissar

Nizzen said:


> Rumors about Amd gpu's is as bad as their drivers


What does that have to do with the best AD102 dies being saved for a higher SKU that will be released shortly?

I see VideoCardz also made a "newspost" about it:








According to the rumor, NVIDIA is already saving AD102 GPUs for GeForce RTX 4090 Ti - VideoCardz.com


NVIDIA RTX 4090 Ti back in the rumor mill NVIDIA is allegedly putting aside the best AD102 chips for the new flagship model. With today’s release of RTX 4090, the focus will now shift towards the upgraded Ti model, which had been rumored for many months already. NVIDIA has indeed left a lot of...




videocardz.com


----------



## Glottis

Corsair 600W cable is here. Now I just need a GPU to plug this bad boy into  (it's so thick and heavy, I have no doubt it's easily capable of 600W, PSU permitting).


----------



## themad

Glottis said:


> Corsair 600W cable is here. Now I just need a GPU to plug this bad boy into  (it's so thick and heavy, I have no doubts it's easily capable of 600W, PSU permitting).
> 
> View attachment 2575569
> View attachment 2575568


Nice!
How many wires are there on the top part of the connector, right below the 600W marking? There are 4 small sense pins there, but how many wires?


----------



## Shaded War

Newegg already trying to push their stupid bundles. Hope this isn't a sign.


----------



## Glottis

themad said:


> Nice!
> How many cables are there on the top part of the pin, right below the 600w writing? There are 4 small pins there, but how many cables?


2 small wires, one for each active pin (2 pins are populated, the other 2 are empty).


----------



## themad

Glottis said:


> 2 small cables, one for each pin. (2 pins are active, other 2 are empty)


Thanks!
That's exactly how the cable on my Asus Thor 1200W is as well. I wonder if that is different on ATX 3.0 PSUs' cables, or even the adapter cable provided with the RTX 4090.
Any idea why only 2 pins are used and not all 4?


----------



## Glottis

themad said:


> Thanks!
> That's exactly how the cable on my Asus Thor 1200w is as well. I wonder if that is different on ATX 3.0 PSU's cables. Or even the adapter cable provided with the Rtx 4090.
> Any ideas why only 2 pins are used and not all 4?


I think it's because the Corsair cable is permanently configured for 600W mode, so it only needs 2 sense pins/wires.
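For what it's worth, the 12VHPWR sideband appears to encode the cable's power capability with just those two sense pins. A minimal sketch of the mapping, assuming my reading of the ATX 3.0 / PCIe CEM 5.0 tables is right (worth double-checking against the actual spec before trusting it):

```python
# Sketch of the 12VHPWR sense-pin encoding (my reading of ATX 3.0 /
# PCIe CEM 5.0 -- verify against the spec). Each sense pin is either
# tied to ground ("gnd") or left open ("open").
SENSE_TO_WATTS = {
    ("gnd", "gnd"): 600,    # both grounded -> full 600W
    ("open", "gnd"): 450,
    ("gnd", "open"): 300,
    ("open", "open"): 150,  # unterminated/absent -> minimum rating
}

def cable_power_limit(sense0: str, sense1: str) -> int:
    """Return the power limit (watts) the cable advertises to the GPU."""
    return SENSE_TO_WATTS[(sense0, sense1)]

# A cable hardwired for 600W only ever needs both sense pins grounded,
# which is why only 2 of the 4 sideband positions carry wires.
print(cable_power_limit("gnd", "gnd"))  # 600
```

So a fixed 600W cable can simply strap both sense pins to ground with two short wires and leave the other sideband positions unpopulated.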


----------



## LunaP

Kinda concerned about the Best Buy site: everything shows "soon" while the FE has just been unavailable since day 1 lol
Are any of these other brands even worth considering (ignoring the SUPRIM)?





----------



## Demonkevy666

dr/owned said:


> SLI always had issues, and few games support the DX12 multi-GPU thing because it was a chicken-and-egg problem: nobody had dual 3090s, so there was no point adding support for it. You'd be throwing $300 at a dead end.


AMD has much better support for mGPU compared to Nvidia.
Also, it's not that you'd be CPU limited by going SLI; you end up being game-engine limited.
If you need proof: GTA 5 supports SLI but has problems beyond 150-170fps. DLSS3 on the RTX 4090 just injects frames over the game engine's limit, with added latency, which is why DLSS3 requires Nvidia Reflex.


----------



## ttnuagmada

mGPU never really worked. Microstutter was always there; some people just refused to accept it. Took me a decade to finally admit it to myself.


----------



## Demonkevy666

ttnuagmada said:


> mGPU never really worked. microstutter was always there. some people just refused to accept it. Took me a decade to finally admit it to myself.


What mGPU games have you tried?
Nvidia's mGPU only works if both cards support SLI.
Microstutter doesn't happen with mGPU since it doesn't use AFR rendering; that happens in DX11 SLI with AFR rendering.


----------



## ZealotKi11er

In Canada the prices suck. Only the FE is MSRP, and that is the card to get. Base-model AIBs are still $150+ more.


----------



## LunaP

ZealotKi11er said:


> In canada the prices suck. Only FE is MSRP and that is the card to get. Base model AIBs are still $150+ more.


If the Strix is just $150 more I'll take it for the extra HDMI, assuming it can be used in tandem and not cause issues based on HOW they made it possible lol; else yeah, FE as well, and just a 2nd card for additional monitors. Def wanna move VR back to my main PC vs. its own.


----------



## ZealotKi11er

LunaP said:


> if the strix is just 150$ more I'll take it for the extra hdmi, assuming it can be used in tangent and not cause issues based on HOW they made it possible lol, else yeah FE as well and just 2nd card for additional monitors. Def wanna move VR back to my main PC vs its own.


Cool, $660 more for the Strix. Personally, if I can't get the FE I will just wait. Don't see why I need to get a model that I don't like, that's hard to fit in cases, and that costs a lot more. Apart from VR, why do so many people care about 2x HDMI?


----------



## LunaP

ZealotKi11er said:


> Cool $660 more for the Strix. Personally If I cant get the FE I will just wait. Dont see why I need to get a model that I dont like, hard to fit in cases and cost a lot more. Apart from VR why so many people care about 2x HDMI?


Yeah, that price hike is just dumb. As for the double HDMI, I'm mostly after the fact it has 5 ports over 4, which means I could avoid having a 2nd GPU in my system to drive my last monitor + additional displays, as I'd be content with just 5. My fear of having a 2nd GPU by itself is just the 2004 WDDM update from Windows back in 2020 that ruined refresh rates. I'm in the 1% though, so not much help for me there.


----------



## Glottis

ZealotKi11er said:


> Apart from VR why so many people care about 2x HDMI?


Because the majority of owners will be hooking this up to their 4K TVs via HDMI, and that's the only HDMI port used up on day 1. Chances are high that throughout the life of the GPU you'll need more HDMI ports rather than more DP.


----------



## ZealotKi11er

Glottis said:


> Because majority of owners will be hooking this up to their 4K TVs via HDMI and that's the only HDMI port used up on day 1. Chances are high throughout life of GPU you'll need more HDMI ports rather than more DP.


My monitor is a 4K TV, but apart from that I never use more than 1 display. Also, if I planned to use multiple monitors I would use DP only. And do we know for sure all 5 outputs work at the same time?


----------



## yzonker

I have two 4K TVs connected, so 2x HDMI would be really nice to have for me.


----------



## dr/owned

ZealotKi11er said:


> My monitor is a 4K TV but apart from that I never use more than 1 display. Also I planed to use multiple monitors I would use DP only. Also do we know for sure all 5 output work at the same time?


The driver limits you to 4 simultaneous displays max.


----------



## sakete

Well I guess I'll give this a shot, but the Asus 4090 is nowhere to be found except Microcenter, and I will not be able to be there until 1.5 hrs after opening.

Edit: okay, Newegg has it. Well, let's see if this will be a disaster like 2 years ago, or not.


----------



## LunaP

dr/owned said:


> The driver limits you to 4 max.


This is the thing I couldn't verify w/ Asus, but if it's true then that makes it entirely pointless for me at least, so appreciate that. Since I'm on 2080 Tis I had to get a DP -> HDMI 2.1 converter for my TV to do 4K 120Hz, which has been great, but def looking forward to going beyond that. Here's hoping my system will keep up w/ a spare 2080 Ti driving the additional displays (that or a good 3060 w/ the money I save from not going Strix...)

That aside, a lil over an hour left; it's killing me.

I saw the updates on the whole power cable debacle w/ the burned wires and stuff, so I'm guessing since that was isolated it's ok to trust the adapter cables that come w/ the cards, or should I still order the Corsair one (I have a 1600W) to be safe?


----------



## Arizor

Aaaand they're out in Australia boys, how many should I buy at this low cost?









*I grabbed the TUF at 2900.


----------



## dr/owned

Swear to god this candle better light in 3 minutes with Best Buy... I'm tired. A NowInStock comment has a guy camping Microcenter; an employee said they have a triple-digit number of cards.


----------



## LunaP

I'll be amazed if any of the Best Buys near me even carry one in stock today (vs. online only). Since we lost Fry's Electronics on the west coast there's been nothing else, and Best Buy doesn't really do tech.


----------



## Slaughtahouse

Edit: Had it in my cart for 3 mins, 9:01 EST to 9:04 EST. Just curious how fast it would sell out.


----------



## dr/owned

Newegg went live all cards.


----------



## ZealotKi11er

Best Buy Canada failing to add to cart. Giga WF3 for $2250 at CC.


----------



## dr/owned

BestBuy live for some MSI. In store pickup only.


----------



## sakete

Cool, I got the TUF OC, as the regular TUF sold out instantly on Newegg. $1900 for a frigging video card (with tax and shipping). Ugh. There's a decent chance I'll change my mind, lol.

I'll still swing by Microcenter later as well.


----------



## AveragePC

Would a 5600X bottleneck a 4090? Primary concern is my VR headset, an HP Reverb G2, essentially 2 4K screens at 90 fps. I couch-game at 4K 60, and at the desk 1440p 144Hz.


----------



## Arizor

Holy shiiiiiet glad I grabbed the TUF, sold out instantly here in Oz...


----------



## LunaP

damn, FE went from unavailable to sold out at best buy over the past minute


----------



## RetroWave78

6 am Best Buy no stock (FE).


----------



## RetroWave78

LunaP said:


> damn, FE went from unavailable to sold out at best buy over the past minute


I've been F5'ing the app and online, never saw it in stock.


----------



## 8472

Snagged a TUF non-OC. Surprised I was able to get it. Bestbuy looks like it's out of the founders edition.


----------



## sakete

TUF OC still showing stock on Newegg.


----------



## sakete

8472 said:


> Snagged a TUF non-OC. Surprised I was able to get it. Bestbuy looks like it's out of the founders edition.


Where did you get it?


----------



## heptilion

Can anyone explain to me why the Gigabyte Master asks for a 1000W PSU whereas the liquid-cooled Xtreme Waterforce asks for 850W? Shouldn't the liquid-cooled card need more power? How does this work?









AORUS GeForce RTX™ 4090 XTREME WATERFORCE 24G (rev. 1.0) Key Features | Graphics Card - GIGABYTE Global


Discover AORUS premium graphics cards, ft. WINDFORCE cooling, RGB lighting, PCB protection, and VR friendly features for the best gaming and VR experience!




www.gigabyte.com













AORUS GeForce RTX™ 4090 MASTER 24G Specification | Graphics Card - GIGABYTE Global


Discover AORUS premium graphics cards, ft. WINDFORCE cooling, RGB lighting, PCB protection, and VR friendly features for the best gaming and VR experience!




www.gigabyte.com


----------



## ttnuagmada

The FE seems to have sold out immediately, as near as I can tell.


----------



## LunaP

RetroWave78 said:


> I've been F5'ing the app and online, never saw it in stock.


Same


----------



## 8472

sakete said:


> Cool, I got the TUF OC. As the regular TUF sold out instantly on Newegg. $1900 for a frigging video card (with tax and shipping). Ugh. There's a decent chance I'll change my mind, lol.
> 
> I'll still swing by Microcenter later as well.


The Non-OC TUF still shows in stock on newegg.


----------



## yzonker

Total fail. Didn't get anything.


----------



## sakete

8472 said:


> The Non-OC TUF still shows in stock on newegg.


Yes, but when you try to check out it says sold out. Believe me, I tried, lol. I was only able to snag the OC version at $200 more.


----------



## 8472

sakete said:


> Where did you get it?


Newegg using their app.


----------



## Christopher2178

Nothing anywhere.. all out of stock.. bots?


----------



## sakete

8472 said:


> Newegg using their app.


Ah cool. I got it on the website. Clicking the checkout button wouldn't work, so I had to check out with PayPal instead, which did work.


----------



## LunaP

Seems the Gigabyte Windforce is still available on Best Buy; just uncertain if I should grab that or wait, since I don't know much about it other than what I've heard in the past.


----------



## sakete

Christopher2178 said:


> Nothing anywhere.. all out of stock.. bots?


Yep, looks that way. It's still a **** show.


----------



## sakete

delete


----------



## LunaP

And it just swapped to sold out wew


----------



## yzonker

2020 all over again. No change. I love all the Youtubers that were so sure people could get cards. Gonna have to suck that down now.


----------



## sakete

So things are still just as bad as 2020, lol. I guess it remains to be seen how soon they can restock. That might be the main differentiator. More opportunities to compete with bots. I'm surprised I even got one, though not the one I wanted.


----------



## dr/owned

FE coming online right now! (EDIT: and it's back to sold out. It was on "high availability" and "available for pickup at <store>".)


----------



## energie80

Sold out already


----------



## ttnuagmada

People already trying to scalp them on Ebay.


----------



## bmagnien

dr/owned said:


> FE coming online right now!


where


----------



## Arizor

Looks like, as we expected, the Strix is _complete_ overkill and is pretty much exactly the same performance in cooling and frames as the FE.


----------



## kairi_zeroblade

We have a good amount of supply here in Asia.. problem is, nobody can afford it at $2000++, even for the Zotac AIRO.. lol..

Maybe people can't really live w/o 2 kidneys..


----------



## Zurv

nvidia's site went boom boom


----------



## yzonker

Well I guess the scalpers did me a favor. Now I can read reviews and wait for AMD.


----------



## ttnuagmada

bmagnien said:


> where


Ebay for 2200 dollars


----------



## dr/owned

It went "available for pickup" on BBY and then back to sold out. It still showed the gray "unavailable" button for a minute, though.


----------



## kairi_zeroblade

Seems like another pandemic all over on ebay..


----------



## LunaP

Zurv said:


> nvidia's site went boom boom


I completely forgot that they even sold GPUs till you mentioned that lol.. now I feel dumb (always been just w/ EVGA)


----------



## sakete

Arizor said:


> Looks like, as we expected, the Strix is _complete_ overkill and is pretty much exactly the same performance in cooling and frames as the FE.


Nah, you're wrong. You get 1-2fps extra!


----------



## RetroWave78

I started F5'ing at 5:55, around every 3 seconds, both the app and online, plus a Strix on Newegg for good measure. I managed to add a Strix to my cart on Newegg but hesitated at the total purchase price ($2200); I have an FE EKWB block on pre-order and am not fond of power-hungry GPUs dumping heat into my case. Strix + waterblock is literally over a $2500 value proposition. No thanks.

I may just wait for the FE and cancel my waterblock pre-order; half the heat from the FE goes out the rear of the case. Run an undervolt and keep the card @ or under 400W for 2D gaming, done.
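Capping the card that way can also be scripted. A minimal sketch that just builds the `nvidia-smi` power-limit command as a dry run (assumes the standard `nvidia-smi` CLI and admin rights to actually apply it; the 150-600W guard range is my own assumption for a 4090-class card):

```python
# Build (but don't run) the nvidia-smi command that caps board power.
# "-pl" sets the power limit in watts, "-i" selects the GPU index.
def power_limit_cmd(watts: int, gpu_index: int = 0) -> list:
    # Guard range is an assumption for a 4090-class card.
    if not 150 <= watts <= 600:
        raise ValueError("power limit outside the assumed 4090 range")
    return ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)]

cmd = power_limit_cmd(400)
print(" ".join(cmd))  # nvidia-smi -i 0 -pl 400
# To actually apply it (needs admin/root on a machine with the driver):
#   import subprocess; subprocess.run(cmd, check=True)
```

Note a driver-level power limit is not the same as a curve undervolt (Afterburner-style); it just clamps board power, which is usually the lazier but reboot-proof way to hold 400W.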

I honestly don't even need this crap. I have exactly one title in need of more GPU compute, and that's Assetto Corsa Competizione in VR (Pimax 8KX). I'm running Marvel's Spider-Man @ 90 FPS with RT maxed and quality DLSS 90% of the time @ nearly 4K (with dips to 70 FPS in extreme scenarios; 5120x1440, Samsung Odyssey G9 Neo, FALD HDR looks amazing), and in half of those dips I'm CPU limited with a 12900K @ 5.1 GHz all P-cores. I don't need this, I don't need to obsess or stress about it, and I sure as hell don't need nor desire to support scalpers.

F5'ed for 15 minutes straight; this card was never in stock on Best Buy. What a farce. Stock can clear up or not, I don't give a crap. Honestly 90% of my gaming is spent in bed with my Steam Deck now that I can no longer heat my apartment mining Ethereum as we go into fall.

So I honestly don't care, just venting frustration. A 5:30 am wake-up to F5 for 15 minutes; I had everything lined up and done correctly. The system failed us; we didn't fail.

It's a broken purchase system rife with abuse. I can nearly guarantee you that Best Buy and Nvidia somehow back-door offloaded the initial shipments to scalpers on eBay, or scalped them themselves (via a "3rd party", plausible deniability).

Think about it, why sell a product for MSRP when you can get another 33% up and over that and be exempt from warranty liability to top it off?

Don't think this happens?


__
https://www.reddit.com/r/hardware/comments/j6idky









MSI Subsidiary Caught Scalping RTX 3080, 3090 Cards on eBay


Following an investigation, MSI 'found that an error' allowed company sales subsidiary Starlit Partner to access the RTX 3000 graphics cards, which it then sold on eBay at inflated prices.




www.pcmag.com


----------



## Baasha

I had the Asus RoG Strix in cart and the pos sold out before I could check out. What kind of BS is that? I thought once you add it to cart it's "yours" to check out - at least for a few minutes. Goddamn 2FA boinked me.


----------



## sakete

I'm still going to swing by Microcenter in an hour or so. Will see what they have left. If they have the regular TUF still (highly doubt it), I'll buy that and probably cancel my order with Newegg for the TUF OC. Or I could do someone a solid here and still buy it, and send it onwards, at cost of course. But not sure how that works with warranty claims.


----------



## nyk20z3

Plenty of stock on Newegg

NVM 😅


----------



## RetroWave78

Baasha said:


> I had the Asus RoG Strix in cart and the pos sold out before I could check out. What kind of BS is that? I thought once you add it to cart it's "yours" to check out - at least for a few minutes. Goddamn 2FA boinked me.


The card is overpriced; internet RNG (and Newegg, NV, and Best Buy, all most assuredly directly implicated in the scalping themselves) saved your wallet.

Wait until they find out that $1600 was the wall 90% of us were not comfortable with, and that these things won't fetch the $2500 they think they will on eBay.

RDNA3 is right around the corner; this nonsense won't last very long.


----------



## Christopher2178

Amazon just no shows.. nothing at all from them?


----------



## ZealotKi11er

Is the Windforce 3 the same as the Gaming?


----------



## lordkahless

I did a Best Buy chat to ask why the FE went from unavailable to out of stock instantly. They said they haven't released that one yet.


----------



## kairi_zeroblade

ZealotKi11er said:


> Is winforce 3 same as gaming


PCB-wise, yes; just the cooling and the vBIOS differ.


----------



## Christopher2178

Where's the guy with that $200-a-year Best Buy Total Tech club thing? He said they were gonna email members early access for the FE.. wonder if they did?


----------



## lordkahless

Christopher2178 said:


> Where's the guy with that $200-a-year Best Buy Total Tech club thing? He said they were gonna email members early access for the FE.. wonder if they did?


That would have been me, and they didn't send me squat. I complained, by opening a window to Total Tech support, that I spent $199 and there was no access: instant out of stock, nothing. Their response was idiotic. Said we haven't released it yet! I said, it hasn't been in stock yet? Then they said we are working with our partners to acquire more inventory, and when we get more, rest assured you will get exclusive access!

LOL ridiculous


----------



## RetroWave78

lordkahless said:


> I did a Best Buy chat to ask why the FE went from unavailable to out of stock instantly. They said they haven't released that one yet.


I mean it's possible; I was F5'ing the Nvidia web portal as well and it never changed from "coming soon".

Nvidia said availability at 6 am PST, and Best Buy handles their orders.

But I don't think this is an error or miscommunication. They 100% internally scalped the entire initial shipment: either Nvidia or Best Buy, and Newegg as well. See my post before last.

It's just really not ****ing cool to dick people around like this. These companies / corporations have zero respect for us.


----------



## 8472

Bestbuy definitely had non-FE cards with the add to cart button earlier. All gone now.


----------



## ttnuagmada

Yeah there are 3 up on Ebay already, one with a screenshot of the checkout


----------



## RetroWave78

8472 said:


> Bestbuy definitely had non-FE cards with the add to cart button earlier. All gone now.


What are the odds that no one here who participated in this charade was actually able to purchase a card? I see one member out of 10 who states they were able to buy the TUF variant.

I suspect internal scalping, 100% an inside job.


----------



## ttnuagmada

A couple on the Best Buy site went from sold out to "see details" to wait in line. They may just be trickling them out. This would be a great way to stick it to scalpers.


----------



## nyk20z3

I never understand why anyone would overpay for these cards on eBay. And don't the sellers have to pay a large fee? So where is the profit margin, really?


----------



## RetroWave78

ttnuagmada said:


> Yeah there are 3 up on Ebay already, one with a screenshot of the checkout


Don't support this! People, stand by your principles. Stock will clear up quickly with RDNA3 right around the corner. Nvidia has gobs of Lovelace dies. This is part strategic (artificial scarcity to drive up demand hysteria) and part internal scalping.

Just be patient. Everyone reading this: you will be able to buy the card you want, including the FE, within 2 weeks' time, 100% guaranteed.

Stand firm.


----------



## dr/owned

It may help, but here's the FE link that runs an extra "check inventory" step after the queue.






NVIDIA GeForce RTX 4090 24GB GDDR6X Graphics Card Titanium and black 900-1G136-2530-000 - Best Buy


Shop NVIDIA GeForce RTX 4090 24GB GDDR6X Graphics Card Titanium and black at Best Buy. Find low everyday prices and buy online for delivery or in-store pick-up. Price Match Guarantee.




www.bestbuy.com


----------



## lordkahless

Seems like Best Buy didn't have any anti-bot measures in place.


----------



## RetroWave78

ttnuagmada said:


> a couple on best buy site went from sold out to "see details" to wait in line. they may just be trickling them out. This would be a great way to stick it to scalpers.


It's 6:45, you still F5'ing my man? Talk about determination. "I will not be defeated so easily!"


----------



## RetroWave78

lordkahless said:


> Seems like Best Buy didnt't have any anti-bot measures in place


Why would they? That would destroy the plausible deniability that it wasn't they, Nvidia, and Newegg directly involved in the scalping themselves.









MSI Subsidiary Caught Scalping RTX 3080, 3090 Cards on eBay


Following an investigation, MSI 'found that an error' allowed company sales subsidiary Starlit Partner to access the RTX 3000 graphics cards, which it then sold on eBay at inflated prices.




www.pcmag.com


----------



## lordkahless

I did the Total Tech chat again and the person acted confused. Said their "experts" were trying to determine what happened. I said you ripped me off for $199; there was no exclusive access. "Rest assured we are trying to look into the matter."


----------



## 8472

RetroWave78 said:


> What are the odds that no-one here who participated in this charade was actually able to purchase a card? I see one member out of 10 who states they were able to buy Tuf variant.
> 
> I suspect internal scalping, 100% inside job.


Something's definitely fishy. Amazon and Nvidia's site apparently didn't even try; Best Buy never had the FE.

Newegg looks to still have the Gigabyte card with the 360mm AIO, though.


----------



## RetroWave78

8472 said:


> Something's definitely fishy. Amazon and Nvidia's site apparently didn't even try, Bestbuy never had the FE.
> 
> Newegg looks to still have the Gigabyte card with the 360mm AIO though.


"Didn't even try" 

Dude. 

Internal scalping 100%. 

All of us were F5'ing, the cards were never even available. 

They've already been caught internally scalping. 

Do the math.


----------



## Shaded War

After seeing the FE instantly sold out on Best Buy, I ended up with an MSI Gaming Trio from Newegg. Never owned an MSI card, but it was on my short list of choices.

I'm actually impressed by how long some of the $1599 cards were in stock.


----------



## LunaP

Shaded War said:


> After seeing the FE was instantly sold out on bestbuy, I ended up with an MSI Gaming Trio from newegg. Never owned an MSI card, but it was on my short list of choices.
> 
> I'm actually impressed how long some of $1599 cards were in stock.
> 
> View attachment 2575590


So far the FE never went up on Best Buy.

I'm staying up for another 5 minutes (should have just woken up early instead of staying up for it) to see if it goes up at 7am.


----------



## IronAge

The Founders Edition sold out within 2-3 mins in Germany; I got one ordered.


----------



## ttnuagmada

RetroWave78 said:


> It's 6:45, you still F5'ing my man? Talk about determination. "I will not be defeated so easily!"


I'm at work, so I've just been doing it at random to see what happens.


----------



## Baasha

LunaP said:


> So far the FE never went up on Bestbuy.
> 
> I'm staying up for another 5 minutes ( shoudl have just woken up early instead of staying up for it in the event of ) to see if it goes up for 7am


Yep, only saw it as "Sold Out" which is ridiculous. 

I thought there'd be "plenty of stock" as people were saying. I guess that was foolish.

Sigh... this is so frustrating.


----------



## lordkahless

I'm guessing bots scalped the FE on Best Buy instantly. Hopefully somebody at Best Buy can look at suspicious orders, cancel them, and release the cards back into the system.


----------



## LunaP

Baasha said:


> Yep, only saw it as "Sold Out" which is ridiculous.
> 
> I thought there'll be "plenty of stock" as people were saying. I guess that was foolish.
> 
> Sigh... this is so frustrating.


I'll wait for the "record sales" articles to populate some fake numbers lol on certain known paid websites (the ones that shill)


----------



## dr/owned

Amazon order placed for MSI one.


----------



## LunaP

Gigabytes back in stock on bestbuy ( windforce)


----------



## LunaP

Looks like there's a slim chance they might still be trickling in.


----------



## bmagnien

dr/owned said:


> Amazon order placed for MSI one.
> 
> View attachment 2575595


same. now how do you find out if this is reference or not to see if you can use Alphacool's block...


----------



## LunaP

bmagnien said:


> same. now how do you find out if this is reference or not to see if you can use Alphacool's block...


Ask on their Twitter; they respond pretty fast there if you can't get an answer or find it mentioned. Hopefully they have schematics already.


----------



## dr/owned

bmagnien said:


> same. now how do you find out if this is reference or not to see if you can use Alphacool's block...


They said a week ago they were under NDA and couldn't say which cards were reference. So maybe they'll update now, or we'll soon-ish get reviews with PCB shots.


----------



## Benni231990

I bought a 4090 MSI Suprim X and already have the shipping confirmation.


----------



## LunaP

dr/owned said:


> They said they had NDA that they couldn't say a week ago which cards were reference.


Weird that they'd be under NDA vs others. Optimus (granted, could just be saying things) stated that the Strix is a different PCB than the FE, so it'd be 2 separate blocks.




Benni231990 said:


> I bought a 4090 MSI Suprim X and already have the shipping confirmation.


Grats! Mine sent me an authorization code, then after confirming me I put it in the cart and clicked pay, only for it to say sold out lol.


----------



## alitayyab

I scored 26,300 in Port Royal


Intel Core i7-12700K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





Zotac Airo version.
BIOS is locked to 110% power limit for now (unless I'm missing something). This is the performance BIOS.
Runs cooler and quieter than my previous Palit 3090 Game Rock OC.


----------



## ttnuagmada

Nvidia store is finally saying something other than "coming soon". The FE is the only card they're listing as out of stock.


----------



## dr/owned

This is the render Alphacool has for the reference PCB. Looks like if it doesn't have 2 rows of polymer caps (the button ones), it's not reference. So the Gigabyte OC isn't reference either:










Also doesn't look like the MSI base model Gaming Trio is reference. Has way more screws on the back.


----------



## Feklar

RetroWave78 said:


> Don't support this! People stand by your principles. Stock will clear up quickly with RDNA3 right around the corner. Nvidia has gobs of Lovelace dies. This is part strategic (artificial scarcity to drive up demand hysteria) and part internal scalping.
> 
> Just be patient. Everyone reading this, you will be able to buy the card you want, including FE, within 2 weeks time, 100% guaranteed.
> 
> Stand firm.


2 weeks guaranteed for the FE? Nope. It sold out in seconds. You won't find one outside eBay this year. These are modern times: use bots, scripts, cheats, whatever it takes. Unfortunately true.


----------



## LunaP

MSI just went back in stock on BB; I added it and waited, then it went out of stock. No way to tell if it's scalper bots hammering it or people refreshing and fighting over it lol, it's a FFA currently.


----------



## dr/owned

LunaP said:


> MSI just went back in stock on BB; I added it and waited, then it went out of stock. No way to tell if it's scalper bots hammering it or people refreshing and fighting over it lol, it's a FFA currently.


It's the scraps of people whose payment methods had issues or whatever.


----------



## Baasha

The Asus Strix says it's in Stock in NewEgg but I can't click 'secure checkout' it keeps saying "out of stock" ***?!?!


----------



## nyk20z3

I will see what Micro Center has later; it's not really hard to get a card. They stock twice a week, but it's a matter of showing up an hour before they open.


----------



## sakete

OK, I had purchased the TUF OC on Newegg. Just walked into local micro center and they already sold out of the regular TUF, but plenty of the TUF OC in stock. Decided to purchase there as well and will cancel Newegg order (rather give MC my business).

If anyone wants the TUF OC that I ordered from Newegg, I can forward it to you, at cost + shipping. Send me a DM right away if interested, will otherwise cancel in the next hour or so.


----------



## 8472

RetroWave78 said:


> "Didn't even try"
> 
> Dude.
> 
> Internal scalping 100%.
> 
> All of us were F5'ing, the cards were never even available.
> 
> They've already been caught internally scalping.
> 
> Do the math.


That's what I'm saying. Lol. Didn't even try means that they didn't even try to sell it to customers trying to buy it.


----------



## LunaP

dr/owned said:


> It's the scraps of people whose payment methods had issues or whatever.


Yeah, it's odd it keeps swapping between Gigabyte and MSI. I wait in line, sometimes get a confirmation code, and bam, sold out. Bots be quick.


----------



## WayWayUp

I see the strix available on newegg...
if you want to dish out 2k lol


----------



## sirneb

WayWayUp said:


> I see the strix available on newegg...
> if you want to dish out 2k lol


I think it's fake, can't checkout with it.


----------



## ZealotKi11er

Pulled the trigger on CC for the GIGABYTE GeForce RTX 4090 GAMING OC for $2250 CAD, $2550 CAD after tax. The only other cards at the same price are the Zotac (3-slot) and MSI. Has anyone seen any reviews of the MSI card?


----------



## nyk20z3

Let’s be honest, who actually realistically needs this card ?


----------



## WayWayUp

asus tuf was just in stock for a minute there on newegg but can't checkout


----------



## LunaP

nyk20z3 said:


> Let’s be honest, who actually realistically needs this card ?


Me, for various reasons.

EZ arguments for others would be VR foremost
Gaming 2nd ( 4k+ / gshade, mods/any shader overlays that eat up VRAM and resources)
Production 3rd as well as getting off 2xxx SLI


----------



## lordkahless

I noticed on Newegg some of the order quantities were limited to 5. Why would they allow people to buy 5 of these things? That makes no sense.


----------



## WayWayUp

nyk20z3 said:


> Let’s be honest, who actually realistically needs this card ?


the price isn't bad when you already have a 3090 FTW3 with an Optimus block and can get $1k for it

just a few hundred bucks upgrade in reality


----------



## LunaP

lordkahless said:


> I noticed on Newegg some of the order quantities were limited to 5. Why would they allow people to buy 5 of these things? That makes no sense.


"Due to an oversight from our system the number 5 was portrayed vs 1 per user... we sincerely apologize and will work to ensure this doesn't happen again" kek


----------



## ttnuagmada

nyk20z3 said:


> Let’s be honest, who actually realistically needs this card ?


_Need? _


----------



## 8472

Msi gaming appears to be available on newegg. I can't add it to cart since I've already ordered a card.


----------



## coelacanth

Could have just bought a Zotac Trinity on Newegg for $1,599, but I'm going to skip these cards.


----------



## LunaP

You gotta be kidding me. I got it in my cart and it removed the store; now it says I can't ship to any store because there are no stores within 250 miles of me, when there are 20+.

THIS is why it keeps coming back in stock: the system's busted.

In stock but shipping not available, must pick up at a store. OK, purchase, great... there are no stores around you lol...


----------



## Gorod

Great, yet another paper launch. FE out of stock, Newegg sold out. This is hilarious; people have to fight for the privilege of dishing out $1.6K to $2K for a graphics card. Oh well, might as well wait and see what the RX 7950XT brings to the table. So much for the hype.


----------



## Shaded War

nyk20z3 said:


> Let’s be honest, who actually realistically needs this card ?


I have a 4K 120Hz TV and a 3440x1440 175Hz monitor, and both need more power.

I put off playing Cyberpunk because it ran under 30 FPS on my 3090. 4090 benchmarks show 90-ish FPS without DLSS.


----------



## zlatanselvic

I got the Zotac Amp Extreme - anyone know if it will take the 600w BIOS?


----------



## coelacanth

Gorod said:


> Great, yet another paper launch. FE out of stock, newegg sold out. This is hilarious, people have to fight for privilege  to dish out 1.6 to 2K for a graphics card  Oh well, might just as well wait and see what RX 7950XT brings up to the table then, so much for hype


Newegg keeps putting stuff back in stock. Could have just placed an order on one a moment ago.


----------



## WayWayUp

no more miners they said
5x more day 1 stock they said


----------



## LunaP

WayWayUp said:


> no more miners they said
> 5x more day 1 stock they said


"Social media hates him, find out how 1 company is disrupting the graphics card industry..... (not clickbait) "


----------



## lordkahless

I saw 4 Founders cards US on ebay just after Best Buy went to out of stock. Then nothing on Ebay. Where did all the Founders go? I would expect 10,000+ on Ebay already lol.


----------



## LunaP

lordkahless said:


> I saw 4 Founders cards US on ebay just after Best Buy went to out of stock. Then nothing on Ebay. Where did all the Founders go? I would expect 10,000+ on Ebay already lol.


eBay should still have a history of sales though. Granted, there are other sites/avenues now, as I think eBay takes nearly 40%? They up it every other year.


----------



## dr/owned

Eyeballing the MSI Gaming: it might use the same PCB as the Suprim X. The mounting holes line up, and reference details like the inductors, PCB cutout, and fan connectors are all the same.


----------



## AvengedRobix

zlatanselvic said:


> I got the Zotac Amp Extreme - anyone know if it will take the 600w BIOS?


Not for now... they'll probably add the 600W BIOS later. Worst case, bring out nvflash and flash another BIOS 😉


----------



## mirkendargen

FYI from the TechPowerUp reviews: BIOSes are PROBABLY cross-flashable, but yeah. Surprising.


----------



## morph.

Ordered a Strix to pickup when the store opens downunder!


----------



## Spicedaddy

Could've ordered a TUF from Newegg Canada and a Gigabyte from CC, but they don't fit in my case. 😂 

The FE on BB Canada never showed in stock for me, and they probably don't ship to Quebec because there's no French on the box. How do you buy FE cards in Quebec without buying a house in another province?


----------



## sakete

Spicedaddy said:


> Could've ordered a TUF from Newegg Canada and Gigabyte from CC, but they don't fit in my case. 😂
> 
> FE on BB Canada never showed in stock for me, and they probably don't ship to Quebec because there's no french on the box. How do you buy FE cards in Quebec without buying a house in another province?


Vertical mount, my friend. It's a beautiful thing, anything will fit.


----------



## Spicedaddy

sakete said:


> Vertical mount, my friend. It's a beautiful thing, anything will fit.


They're all too long for a Meshify 2 Compact unless I remove front fans. I'd need a diagonal mount!


----------



## sakete

Spicedaddy said:


> They're all too long for a Meshify 2 Compact unless I remove front fans. I'd need a diagonal mount!


Ah ok. I'm using a Phanteks Enthoo 719 case, it's huge. But I have a custom loop.


----------



## Baasha

WayWayUp said:


> I see the strix available on newegg...
> if you want to dish out 2k lol


It was "in stock" for over 5 minutes, but it wouldn't let me check out; it kept saying out of stock.

I added it to my cart like 20 times and nothing worked. I tried the website, the app on my phone, and the app on my iPad, and nothing worked!

This is so goddamn frustrating.


----------



## Krzych04650

Wow, availability in the EU is practically nonexistent. It's not even that things are out of stock; most shops don't have anything listed at all. I'm doing a full rebuild anyway, so I can wait until the 20th since I need to wait for RPL anyway, but man, this does not look good.


----------



## LunaP

At least the media should slam them for their day 1 availability antics; curious to see how they'll cover that up though.


----------



## WayWayUp

I'm allowed to continue to PayPal checkout on Newegg, but then it's removed before it can process.


----------



## Alelau18

Availability in Spain, bar the Strix whose supply was just nonexistent, seemed fine; the rest of the cards were in stock for a long time. Ordered my TUF card, supposedly getting it by Friday since today is a national holiday.


----------



## sakete

Anyone interested in a Asus TUF OC that I got on Newegg? Not scalping here, I was also able to get the same card at my local Micro Center, so before I cancel my Newegg order, thought I'd ask here. Would be at cost + shipping of course.


----------



## Hanks552

sakete said:


> Anyone interested in a Asus TUF OC that I got on Newegg? Not scalping here, I was also able to get the same card at my local Micro Center, so before I cancel my Newegg order, thought I'd ask here. Would be at cost + shipping of course.


Just cancel it, I'm ready to buy it as soon as it's available haha. Bought an MSI Trio and regret it; I want 4 connectors.


----------



## sakete

Hanks552 said:


> Just cancel it, I'm ready to buy it as soon as it's available haha. Bought an MSI Trio and regret it; I want 4 connectors.


Just canceled it.


----------



## LunaP

I don't get what's going on with Best Buy atm. Any card that comes back as available you can't ship, because it goes from shipping to you, to ship to store, to no store available to ship to; then time runs out because you can't proceed without selecting one, and it just dies.


----------



## Hanks552

I see one but it's $1,799.99. I'm not paying that much, only $1,599.


----------



## Hanks552

LunaP said:


> I don't get what's going on with Best Buy atm. Any card that comes back as available you can't ship, because it goes from shipping to you, to ship to store, to no store available to ship to; then time runs out because you can't proceed without selecting one, and it just dies.


I was unable to even see the FE card available; I don't think they had it at all.


----------



## lordkahless

Finally got in line at Best Buy for a purchase for the Founders. But when it was finally my turn it looks like they are not shipping these. Have to live near a store that actually has it. Which I of course don't. Screwed again.


----------



## LunaP

Hanks552 said:


> I was unable to even see the FE card available, i don’t think they had it at all


the Gigabyte OC and MSI trio keep coming back in stock for 5-10 minutes at a time.



lordkahless said:


> Finally got in line at Best Buy for a purchase. But when it was finally my turn it looks like they are not shipping these. Have to live near a store that actually has it. Which I of course don't. Screwed again.


Yes, this. It's absolute BS because there are none; it's just a façade.


----------



## sakete

Hanks552 said:


> I see one but 1799.99. I’m not paying this much, only 1599


That must be the one I canceled


----------



## sakete

Anyone know when 4080 reviews will drop? I'm going to wait at least a few days before I open the box, in case I reconsider, as it's a f u c k load of money for a graphics card.


----------



## Hanks552

sakete said:


> That must be the one I canceled


Yeah, I think so, but I'm not paying $200 more for "OC" haha.
But yeah, now it's gone. I ordered one through B&H; it's going to take a while because it's backordered, but let's see. I'm not in a rush, I still need to wait for the waterblock anyway.


----------



## sakete

Hanks552 said:


> Yeah, I think so, but I'm not paying $200 more for "OC" haha.
> But yeah, now it's gone. I ordered one through B&H; it's going to take a while because it's backordered, but let's see. I'm not in a rush, I still need to wait for the waterblock anyway.


I didn't really want to either, but I suspect the regular TUF had a minimal production run, and that Asus made a lot of TUF OCs and Strixes, assuming people would buy them anyway (and I'm one of the suckers that did). So the regular TUF will always sell out quickly, I think.


----------



## nycgtr

Anyone with the strix bios?


----------



## Mad Pistol

Bit late replying, but I was able to get a Gigabyte Windforce 4090 on Newegg for $1600. Not my first pick, but based on what I'm reading, the FE is only good if you really care about aesthetics or plan to try and overclock the snot out of it (which I'm not sure you can).

Should be here in about a week.


----------



## sakete

So are new power cables required to power these beasts? I have a Seasonic Platinum 1000W. Saw someone post something about 600W cables for their Corsair PSU.


----------



## kairi_zeroblade

sakete said:


> So are new power cables required to power these beasts? I have a Seasonic Platinum 1000W. Saw someone post something about 600W cables for their Corsair PSU.


Short answer is yes, if Seasonic has made one for your older PSU lineup.

Corsair posted a list of the products in its lineup that are compatible with the ATX 3.0 / PCIe 5.0 power connector they made.


----------



## ttnuagmada

Which one of you morons paid 3800 dollars for a Strix?


----------



## sakete

kairi_zeroblade said:


> Short answer is YES, if seasonic has made one for their older PSU lineup..
> 
> Corsair posted a list of its product lineup that is compatible with the atx v3 pcie 5.0 power connector they made.


Well, I have a Seasonic PX-1000, so it's their newer lineup. Bought it 2 years ago.

So I guess I'll need a cable from Seasonic to use it with my PSU? The Corsair cable wouldn't work?


----------



## kairi_zeroblade

sakete said:


> Well, I have a Seasonic PX-1000, so it's their newer line-up. Bought it 2 years ago.
> 
> So I guess I will need a cable from Seasonic to use it on my PSU? The Corsair cable wouldn't work?


Yes, if the 4090 you bought didn't come with the adapter (4x 8-pin power connectors to 1 Gen 5 power adapter).

The FE comes with the adapter; you may want to check yours when you have it, or check unboxing reviews.
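To put rough numbers on why the count of plugged-in 8-pins matters: each PCIe 8-pin is rated for 150 W, and the adapter's sense pins advertise a total budget to the card. A minimal sketch of the commonly reported tiers (illustrative only; the function and numbers here are my assumptions, check your card's manual, not this post):

```python
# Rough sketch of the power tiers commonly reported for the 4x8-pin
# 12VHPWR adapter: each 8-pin PCIe input is rated for 150 W, and the
# adapter signals a total limit to the card via its sense pins.
# (Illustrative only -- verify against your card's and PSU's docs.)

EIGHT_PIN_WATTS = 150  # PCIe spec rating per 8-pin power connector

def adapter_limit(plugged_8pins: int) -> int:
    """Total power the adapter can advertise with N 8-pin inputs connected."""
    if not 1 <= plugged_8pins <= 4:
        raise ValueError("adapter takes 1-4 8-pin inputs")
    return EIGHT_PIN_WATTS * plugged_8pins

for n in range(2, 5):
    print(f"{n}x 8-pin -> up to {adapter_limit(n)} W")
```

So a 450 W card wants at least three of the four 8-pins populated, and 600 W BIOSes want all four.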


----------



## jomama22

ttnuagmada said:


> Which one of you morons paid 3800 dollars for a Strix?
> View attachment 2575605


Don't worry, I have no doubt that the one who bought it will just scam the seller.


----------



## sakete

kairi_zeroblade said:


> YES if the 4090 you bought didn't came with the adaptor (4x8pin power connector to 1 gen 5 power adaptor)
> 
> The FE comes with the adaptor, probably you may want to check yours when you have it or yet check unboxing reviews


Oh, I have it in hand right now (picked up at Micro Center earlier this morning), but I'm not opening the box until I've fully decided to keep it. It's a ridiculous amount of money; I'm not sure yet I really want to keep it. I only recently learned the 4090 existed, so it's all been pretty impulsive.

But according to the Asus site, the TUF OC has an included adapter cable, so I'm good. To get one for free from Seasonic, I'd have to take pictures of the energy label and bla bla bla, which would require unmounting my PSU and disconnecting all the cables. Really not in the mood for that. And I bought mine more than a year ago, so I don't qualify for the free cable anyway.


----------



## dr/owned

jomama22 said:


> Don't worry, I have no doubt that the one who bought it will just scam the seller.


The ole return-a-brick play: eBay will auto-refund as soon as the return package shows delivered.

Seriously, people selling these things on eBay are f-ing stupid. eBay has *way* more buyer protection than seller protection.


----------



## ArcticZero

sakete said:


> Well, I have a Seasonic PX-1000, so it's their newer line-up. Bought it 2 years ago.
> 
> So I guess I will need a cable from Seasonic to use it on my PSU? The Corsair cable wouldn't work?


Seasonic has a 12VHPWR cable you can request for.

https://seasonic.com/cable-request/


----------



## sakete

ArcticZero said:


> Seasonic has a 12VHPWR cable you can request for.
> 
> https://seasonic.com/cable-request/


Thanks, I saw that through a Google search. I don't qualify however; it's only if you purchased the PSU within the past year. I bought mine 2 years ago, and I can't even find the receipt anymore 

But the GPU has an adapter cable, so no biggie. And I'm not too concerned about looks.


----------



## dr/owned

I'm in the BBY queue









(annnd sold out)


----------



## lordkahless

I have been in the queue 4 times for the Founders on BB. Even if you do manage to get in, it won't matter much, as they are not shipping these. You have to live near a store that actually has it, which I of course don't. Screwed again.


----------



## LunaP

dr/owned said:


> I'm in the BBY queue
> View attachment 2575607
> 
> 
> (annnd sold out)


Me x 50, but 20% of the time you get to payment and then have to select a store, because suddenly it's not shipping anywhere and has to be picked up at a store that has it. Like, *** kind of system is this? LOL


----------



## MrTOOSHORT

Plenty of Gigabyte models at Memory Express atm. I want a Strix or FE myself though.

Looks like they are going fast!


----------



## dr/owned

lordkahless said:


> I have been in the queue 4 times for the Founders on BB. If you do manage to get in won't matter much as they are not shipping these. Have to live somewhere that actually has it to be able to go pick up. Which will never be the case for me.


I have 14 stores in range of me so fingers crossed but it ain't looking good.


----------



## sakete

dr/owned said:


> I have 14 stores in range of me so fingers crossed but it ain't looking good.


Do you have a Micro center nearby? They had a lot of stock this morning.


----------



## mirkendargen

sakete said:


> Do you have a Micro center nearby? They had a lot of stock this morning.


The entire west coast except for LA is ****ed when it comes to Microcenter


----------



## Spiriva

I got a Strix 4090.










27,990 SEK ≈ €2,545 or $2,466

Gonna be fun to start testing it out!


----------



## mirkendargen

Has anyone east of PST tried walking into a Best Buy yet to see if they're holding stuff in store?


----------



## J7SC

After a quick review of a few tests of AIB cards, they seem to have good temps similar to the FE's, and the top-tier offerings (i.e. Aorus Master, Strix) also clock between 50-90 MHz higher; 3 GHz does indeed seem possible.

Anyway, as posted a while back, I hope that AMD's upcoming RDNA3 will light a fire under NVidia - and with AMD's chiplet design / related patents and history of multi-chiplets, that is entirely possible... my plan is still to wait until early next year when AMD's and NVidia's top cards (i.e. 4090 Ti) are released. More (get the salt) rumours about the 4090 Ti are also beginning to swirl, as was to be expected after the official release of the 4090. 

@PLATOON TEKK - DerBauer's vid on the 4090 Aorus Master uncovered a potential SLI/NVLink setup (see timestamped vid) - for the 4090 Ti, which might share PCBs?


----------



## Talon2016

Picked it up this morning at Microcenter.


----------



## dante`afk

what a ****show again


----------



## Avacado

Talon2016 said:


> View attachment 2575608
> 
> 
> Picked it up this morning at Microcenter.


Gross how chonky that cooler is.


----------



## Hanks552

mirkendargen said:


> Has anyone east of PST tried walking into a Best Buy yet to see if they're holding stuff in store?


I don’t think Best Buy is going to have RTX 4000 in store.


----------



## sakete

Not opening the box until I decide to keep it.


----------



## mirkendargen

Hanks552 said:


> I don’t think Best Buy is going to have in store Rtx 4000


People that have gotten in the queue there for various cards were store-pickup only. Might be ship-to-store, might be stocked in the store.


----------



## J7SC

Avacado said:


> Gross how chonky that cooler is.


...jumbo-sized


Spoiler


----------



## lordkahless

I am starting to see more and more Founders 4090s on eBay. Just saw one seller that has 5. Why would Best Buy allow 5 to one person?


----------



## mirkendargen

lordkahless said:


> I am starting to see more and more founders 4090 on Ebay. Just saw one seller that has 5. Why would Best Buy allow 5 to one person


Cause Best Buy got paid either way.


----------



## Baasha

Talon2016 said:


> View attachment 2575608
> 
> 
> Picked it up this morning at Microcenter.


Very nice. How I wish there was a Microcenter where I was. Sigh.. the closest one is ~ 3.5 hours away.


----------



## Sir Beregond

Avacado said:


> Gross how chonky that cooler is.


Wait till you see the RTX 5090 leaks.


----------



## alasdairvfr

Picked up a Gigabyte Gaming OC today. I was hoping for a Suprim X since they are much cheaper than the Strix, but this one was in stock this AM. I bumped it up a bit, core +195 / memory +1250, and landed these scores:

Port Royal: NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)











TimeSpy Extreme: NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)











Firestrike Ultra: NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)












Edits as I add more results


----------



## J7SC

...this one could be interesting for those who do not want to do a full custom block / loop setup - at least it's a 360 rad AIO (price in Can$).


----------



## Sir Beregond

alasdairvfr said:


> Picked up a Gigabyte Gaming OC today. I was hoping for a Suprim X since they are much cheaper than the Strix but this one was in stock this AM. I bumped it up a bit, core +195 + memory +1250 and landed this score on Port Royal: NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)


Very close to 3GHz, nice.

Guess no one wants the Zotacs at my local Micro Center. Their website is _usually_ accurate for stock, but I'm not entirely sure about launch day. The other 11 models are in the out-of-stock tab.


----------



## itzzgau

Hey, is it safe to connect two PCIe 8(6+2)-pin pigtail split connectors to the RTX 4090's 16-pin-to-8-pin adapter cable?
Should I buy extra PCIe 8(6+2)-pin cables, or should I buy a new PCIe 5.0 PSU?


----------



## marc0053

This video certainly aged well 




Hopeful to join the club here soon.


----------



## Benni231990

Today I was in contact with Alphacool, and I can confirm Alphacool is making a waterblock for the Suprim X. I'll send them my card so they can design the block, and I get my block for free. In maybe 4 weeks you'll be able to buy the block.


----------



## sakete

Sir Beregond said:


> Very close to 3GHz, nice.
> 
> Guess no one wants the Zotacs at my local Micro Center. Their website is _usually _accurate for stock, but not entirely sure for launch day. The other 11 models are in the out of stock tab.
> 
> View attachment 2575612


Yeah, you would've needed to be there earlier. The stock count seemed reasonably accurate at my location, though when the site was still showing the regular Asus TUF in stock, it was already gone in store. So I got the TUF OC instead.

Oh, and I see the Denver store. So we're in the same metro area.


----------



## Mad Pistol

itzzgau said:


> Hey is it safe to connact a 2 pci-e 8(6+2)-pin with pigitail split connectors to the RTX 4090 with 1 x 16-pin to 8-pin Adapter Cable?
> Should i buy an extra pci-e 8(6+2)-pin cable or should i buy a new pcie 5.0 PSU?


Do you remember those old 2x6-pin -> 8-pin adapters?

Hear me out.

8x6-pin -> 4x8-pin -> 1x16-pin connector.


----------



## WayWayUp

Looks like in Port Royal someone got 3,360 MHz @ -35°C.
For comparison, at the same temp a golden-chip 3090 was doing roughly 2,360 MHz, a full 1,000 MHz less.

So this GPU scales well with temp. For overclocking purposes, a waterblock will yield much, much better results than any BIOS or higher power draw. 

With LN2 and some time we will see 3,600+ MHz.


----------



## RetroWave78

Feklar said:


> 2 weeks time guaranteed for FE? Nope. It sold out in seconds. You won't find one except Ebay this year. This is modern times. Use bots scripts cheats whatever it takes. Unfortunately true.


This is day 1. Nvidia allowed the conditions for this scenario because it generates hysteria, on top of the near certainty that inside scalping was occurring. 

Watch how quickly stock manifests as we approach RDNA3. 

This situation will not hold. NV has gobs of dies. 

As for overclocking Lovelace, yep, I called it: undervolting is the way to go:


----------



## RetroWave78

J7SC said:


> After a quick review of a few tests of AIB cards, they seem to have good temps similar to the FE and also clock between 50 - 90 MHz higher with the top-tier offerings (ie. Aorus Master, Strix) for which 3GHz seems to be indeed possible.
> 
> Anyway, as posted a while back, I hope that AMD's upcoming RDNA3 will light a fire under NVidia - and with AMD's chiplet design / related patents and history of multi-chiplets, that is entirely possible...my plan is still to wait until early next year when AMD's and NVidia's top cards (ie. 4090 Ti) are released. More (get the salt) rumours are also beginning to swirl about 4090 Ti as was to be expected after official release of the 4090.
> 
> @PLATOON TEKK - DerBauer's vid on the 4090 Aorus Master uncovered a potential setup for SLI/NVLink (see timestamped vid) - for the 4090 Ti which likely might share PCBs ?


Great point, I might as well wait for the 4090 Ti too. 

People, getting this card right now by any means necessary and at any cost isn't going to make you happier. Get outside for sunlight, fresh air, and a nice walk or hike.


----------



## Emmanuel

Woke up at 5:30AM for the non-existent FE... Best Buy is a joke, tying up their stock to physical locations. The only one I could have successfully purchased was the TUF OC on Newegg, but with tax it was a ridiculous $1,850 or so, so I let my cart perish.

Before going back to sleep, I signed up for notifications on B&H for the regular TUF, and had a pre-order link emailed to me by the time I woke up again. Best part: no tax, $1,612 out the door. I don't mind waiting till Christmas to receive it, given the price and the fact that I didn't feel disrespected as a customer.

It's pathetic how it's become the norm to generate a ton of hype (and scalping) by not taking preorders and instead creating a f*fest on release day between real customers, bots, and scalpers. I guess that's what tech porn looks like.


----------



## LuckyImperial

dante`afk said:


> what a ****show again



It's annoying, man. I know I'll get a 4090 eventually (I was able to submit an order through my local PC hardware store), but watching Best Buy and Newegg sell out in under 60 seconds was really frustrating, considering the inventory rumors said there was plenty. 

I guess it's my fault for believing the rumor mill. Hopefully supply stabilizes over the next month.


----------



## sblantipodi

Hi all, I received my Suprim X, I have a Corsair HX1200i and the new cable from Corsair.

I installed the card and all is well until I install the Nvidia driver. When I install the Nvidia driver the PC reboots abruptly, then it loads Windows, shows the desktop, and reboots, again and again.

If I reboot in safe mode or if I uninstall Nvidia driver the pc works well.

Is my GPU faulty?


----------



## dr/owned

Tease me daddy:


----------



## Sir Beregond

sakete said:


> Yeah, you would've needed to be earlier. The stock seemed to be reasonably accurate at my location, though when the site was still showing the regular Asus TUF in stock, in store it was already gone. So I got the TUF OC instead.
> 
> Oh and I see Denver store. So we're in the same metro area.


Oh I am not trying to buy, was just curious.


----------



## WayWayUp

after watching the reviews, anyone else concerned that gpus are outpacing cpus in performance increases?

the 4090 maxes out 1440p and is often capped. These measly 15% gains in CPUs won't cut it....
4090 Ti is supposed to be 20% faster

but the real star coming is the 5090 and RDNA 4. I know this generation was a huge increase, but next gen with Hopper efficiency + chiplet design will be the real hero, and I expect an even larger gen-over-gen bump

Normally this would get normalized by game engines increasing fidelity with much higher system requirements, but with consoles being the main target and development costs so high for cutting-edge tech, nobody is really pushing boundaries besides ray tracing

seriously need next-gen Unreal Engine in new games


----------



## dr/owned

WayWayUp said:


> after watching the reviews, anyone else concerned that gpus are outpacing cpus in performance increases?


Unless something has changed, "CPU bound" has always been a function of framerate, not the underlying resolution. 1080p is "CPU bound" because the framerate is ridiculously high, not solely because it's 1080p.
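
One way to picture it, a toy model with made-up frametimes (the numbers below are illustrative, not measurements):

```python
# Toy model: whichever of CPU or GPU takes longer per frame sets the framerate.
# The frametimes are invented for illustration.
def effective_fps(cpu_ms: float, gpu_ms: float) -> float:
    """Framerate is limited by the slower of the two per-frame times."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# 4K-ish load: GPU needs 10 ms, CPU needs 5 ms -> GPU-bound, 100 fps
print(effective_fps(5.0, 10.0))

# Same CPU at 1080p: GPU drops to 2.5 ms -> now CPU-bound at 200 fps,
# no matter how fast the GPU is
print(effective_fps(5.0, 2.5))
```

Dropping the resolution only shrinks the GPU term, so past a point the CPU's frametime floor is the cap regardless of which card you run.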


----------



## RetroWave78

"Whoops," like we totally didn't do this ourselves, it was some other scalpers, we would never internally scalp and sell on eBay ourselves.

*Q. "I couldn't even add this to my cart or anything. It went from "unavailable" to "item not available for shipping" and "unavailable nearby" exactly at 9am. Now it just says Sold Out. How is this possible? I was refreshing the page once every second."

A. Best Buy had a design flaw with the website that allowed people to enter the queue prior to 6 am PST. Only the scalpers and botters knew this trick. You append &addchv=true to the end of the product URL and it allowed you to enter the queue multiple minutes before 6 am.*


----------



## Mad Pistol

WayWayUp said:


> after watching the reviews, anyone else concerned that gpus are outpacing cpus in performance increases?
> 
> the 4090 maxes out 1440p and is often capped. These measly 15% gains in cpus wont cut it....
> 4090ti is supposed to be 20% faster
> 
> but the real star coming is the 5090 and rdna 4. I know this generation was a huge increase but next gen with hopper efficiency + chiplet design will be the real hero and i expect an even larger gen over gen bump
> 
> Normally this would get normalized with game engines increasing fidelity with much higher system requirements but with consoles being the main target and with game development costs so high for new cutting tech, nobody is really pushing boundaries besides raytracing
> 
> seriously need next gen unreal engine in new games


Without a doubt the 4090 has outpaced current CPUs. AMD dropped a bomb with the 5800x3D, and even then, I fear it is not enough to push these cards to the max.


----------



## alitayyab

sblantipodi said:


> Hi all, I received my Suprim X, I have a Corsair HX1200i and the new cable from Corsair.
> 
> I installed the card and all is well until I install the Nvidia driver, when I install the Nvidia driver it reboots abruptly, than it loads windows, it shows the desktop and it reboot, and again and again.
> 
> If I reboot in safe mode or if I uninstall Nvidia driver the pc works well.
> 
> Is my GPU faulty?


Check with the bundled 12VHPWR-to-PCIe cable. If the behavior is the same, it may be the PSU. Also, I had issues with ReBAR being enabled in the BIOS; setting it to Auto fixed my rebooting issues. (Gigabyte Z690 Gaming X)


----------



## inedenimadam

brought home the 4090 Gaming X Trio

OC to 2940 core and +1850 mem,
Port Royal 27155. It's only 2000 points away from my best 3090 SLI results. INSANE generational jump.

Need a shunt mod or higher-TDP BIOS to go higher. And a waterblock.


----------



## ZealotKi11er

Got the Gaming OC. Tried a small OC. The card is basically voltage limited; more power does nothing. Even with FurMark it does not go over 520W. I also don't see the point in watercooling this card apart from fitting it in my case, and I don't know how that will happen with the stupid connector.


----------



## lordkahless

RetroWave78 said:


> "Whoops" like we totally didn't do this ourselves, it was some other scalpers, we would never internally scalp and sell on eBay ourselves"
> 
> *Q. "I couldn't even add this to my cart or anything. It went from "unavailable" to "item not available for shipping" and "unavailable nearby" exactly at 9am. Now it just says Sold Out. How is this possible? I was refreshing the page once every second."
> 
> A. Best Buy had a design flaw with the website that allowed people to enter the queue prior to 6 am PST. Only the scalpers and botters knew this trick. You append &addchv=true to the end of the product URL and it allowed you to enter the queue multiple minutes before 6 am.*


Would be nice if Best Buy found those orders and canceled them. At the very least ones prior to 6am.


----------



## kx11

I'm in 

Aorus Xtreme Waterforce. The guy who sold it to me thought it was the MASTER model  i got away with it


----------



## Emmanuel

lordkahless said:


> Would be nice if Best Buy found those orders and canceled them. At the very least ones prior to 6am.


As far as they're concerned, they made money either way. They don't care who they sold to.


----------



## heptilion

kx11 said:


> I'm in
> 
> Aorus Xtreme waterforce, the guy sold it me thought it was MASTER model  i got away with it


How is it? I am also thinking of getting this.


----------



## WayWayUp

kx11 said:


> I'm in
> 
> Aorus Xtreme waterforce, the guy sold it me thought it was MASTER model  i got away with it


can you share some temps and OCs with us when you get a chance? I want to see how much difference the AIO makes


----------



## RaMsiTo

It would be nice if different BIOSes were uploaded. I'm waiting on a Suprim X and its BIOS is 520W; flashing it will be the first thing I do once it can be changed.


----------



## KedarWolf

I'm buying a 7950X, motherboard and DDR5 in the next two months. I think that CPU can keep up with a 4090.

I won't be able to get a 4090 or 4090 Ti until March though when I get my tax refund.


----------



## Shaded War

inedenimadam said:


> brought home the 4090 gaming x trio
> 
> OC to 2940 core and +1850 mem,
> Port Royal 27155. Its only 2000 points away from my best 3090 SLI results. INSANE generational jump.
> 
> Need shunt mod or higher TDP bios to go higher. and a waterblock.


How is the cooling performance and fan noise level? Ordered the non-X model of this and don't see reviews yet.


----------



## sakete

So what's this talk of a 4090 Ti already?


----------



## KedarWolf

Shaded War said:


> How is the cooling performance and fan noise level? Ordered the non-X model of this and don't see reviews yet.


It's not a typical GPU AIO with fans on the memory and a block on the core, but a full waterblock and backplate attached to an AIO type rad and pump, which is cool. 

Edit: The AIO SUPRIM is the fan/core type, not a full water block, which is what I'm referring to as the difference.

Gigabyte AORUS GeForce RTX™ 4090 XTREME WATERFORCE 24GB PCI-E w/ HDMI, Triple DP - PCI-E Video Cards - Memory Express Inc. - www.memoryexpress.com

MSI GeForce RTX 4090 SUPRIM LIQUID X 24GB PCI-E w/ Triple DP, HDMI - PCI-E Video Cards - Memory Express Inc. - www.memoryexpress.com


----------



## kx11

I scored 10 541 in Speed Way


Intel Core i9-12900KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11 - www.3dmark.com


----------



## RetroWave78

lordkahless said:


> Would be nice if Best Buy found those orders and canceled them. At the very least ones prior to 6am.


They are the ones behind it. How did randoms know precisely what to add to the end of the URL? This gives them plausible deniability ("gee, we don't know what happened"), and it generates media publicity, free advertising, and hysteria.

Don't think this happens? I posted how MSI got caught scalping themselves multiple times here recently, you may have missed it.

This is 100% an inside job, do not be fooled.


----------



## RetroWave78

Emmanuel said:


> As far as they're concerned, they made money either way. They don't care who they sold to.


They make more when they scalp themselves, see MSI example in previous posts. 

This should be illegal but it isn't. "Whatever the market will bear" 

Don't like it? Don't support Best Buy or Nvidia. They 100% could have implemented a fair system to prevent this (i.e. EVGA's queue system), they didn't _because they are the ones behind it, scalping internally. _


----------



## EastCoast

Can someone explain to me why I keep seeing weaker 1080p and 1440p benchmark results against last-gen GPUs?
Sure, it's not in every title, but it's in enough titles to actually take notice that this is a trend. At 4K it's more of a clear winner.
But at lower resolutions it really struggles. This does not look like a CPU bottleneck issue. Something else is going on.


----------



## RetroWave78

RaMsiTo said:


> It would be nice if different bios were uploaded, I'm waiting for a suprim X and the bios is 520w, it will be the first thing I do when it can be changed.


It may be limited to 520W for safety reasons; scrutinizing the TPU review data, the memory temps were 10C higher than the FE. They may not want a repeat of the 3090 Micron memory overheating debacle.

The VRMs connect to the cold plate. More power = hotter everything, especially memory.


----------



## sakete

Anyone think a 1000W PSU is not sufficient for the 4090 + 5950X + a whole lot of fans and pumps (custom loop) and monitoring accessories, as well as a bunch of HDDs, SSDs, and NVMe SSDs?
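
For what it's worth, a rough budget sketch (every wattage below is a ballpark guess for illustration, not a measured figure) suggests 1000W still leaves headroom:

```python
# Rough power budget for a 4090 + 5950X custom-loop build.
# All component draws are ballpark guesses for illustration only.
loads_watts = {
    "RTX 4090 (stock power limit)": 450,
    "5950X (heavy all-core load)": 180,
    "Motherboard / RAM / drives": 60,
    "Fans, pumps, monitoring gear": 40,
}

total = sum(loads_watts.values())
psu_rating = 1000
print(f"Estimated sustained draw: {total} W of {psu_rating} W "
      f"({total / psu_rating:.0%})")
# Leaves ~270 W of margin; transient spikes are the real concern with
# Ada cards on older ATX 2.x supplies, so margin matters.
```

Sustained draw is only part of the picture; millisecond-scale excursions can briefly exceed it, which is why the margin above is worth keeping.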


----------



## sakete

EastCoast said:


> Can someone explain to me why I keep seeing weaker 1080p and 1440p benchmark results against last-gen GPUs?
> Sure, it's not in every title, but it's in enough titles to actually take notice that this is a trend. At 4K it's more of a clear winner.
> But at lower resolutions it really struggles. This does not look like a CPU bottleneck issue. Something else is going on.


It would be very interesting to see these graphs with power consumption on top. I wonder if the 4090 would have lower power usage than the previous gen cards at the same performance at those resolutions. I bet that the answer is yes.


----------



## sblantipodi

alitayyab said:


> Check with the bundled 12VHPWR-to-PCIe cable. If the behavior is the same, it may be the PSU. Also, I had issues with ReBAR being enabled in the BIOS; setting it to Auto fixed my rebooting issues. (Gigabyte Z690 Gaming X)


I used the bundled 4-cable splitter and the Corsair 12VHPWR special cable for my power supply. Same crap.

Is the GPU broken?


----------



## EastCoast

sakete said:


> It would be very interesting to see these graphs with power consumption on top. I wonder if the 4090 would have lower power usage than the previous gen cards at the same performance at those resolutions. I bet that the answer is yes.


Pretty good interpretation of what might be going on. They too are using an 850W PSU. I wonder what the results would be on the EVGA 1600W T2?


----------



## dk_mic

inedenimadam said:


> brought home the 4090 gaming x trio
> 
> OC to 2940 core and +1850 mem,
> Port Royal 27155. Its only 2000 points away from my best 3090 SLI results. INSANE generational jump.
> 
> Need shunt mod or higher TDP bios to go higher. and a waterblock.


do you know the differences between the two BIOS versions on it?
just a 15 MHz boost difference?

from the website:
Extreme Performance: 2610 MHz (MSI Center)
Boost: 2595 MHz (GAMING & SILENT Mode) 

also, is the 450W power limit due to the 3x 8-pin adapter or due to the BIOS?


----------



## KedarWolf

RetroWave78 said:


> It may be limited to 520w for safety reasons, scrutinizing TPU review data the memory temps were 10c higher than FE. They may not want a repeat of 3090 micron catching on fire debacle.
> 
> VRM's connect to the cold plate. More power = hotter everything, especially memory.



With a water block, and maybe an active backplate, or just a really good block like Optimus Water Cooling makes, I doubt that would be a problem.

And with my 3090 with an Optimus Strix block and no active backplate, my memory temps don't break 60C while benching or running games. And this is on one pump and one 360 rad.


----------



## KedarWolf

Nvidia reportedly preparing RTX 4090 Ti cards with up to 20% increased performance over the RTX 4090


If you thought RTX 4090 was too overpowered, wait until Nvidia launches the 4090 Ti models with the best binned AD102 dies. These beasts would get even more CUDA cores, up to 2.95 GHz boost core clock out of the box, 24 Gbps GDDR6X VRAM and 475 W TGP. Could probably cost an arm and a leg, too.




www.notebookcheck.net
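
The rumored 24 Gbps memory, if paired with the 4090's 384-bit bus (an assumption on my part, the article doesn't state the bus width), would work out like this:

```python
# Rumored 4090 Ti memory bandwidth, assuming it keeps AD102's 384-bit bus.
# 24 Gbps per pin is the rumored GDDR6X speed; the bus width is my assumption.
gbps_per_pin = 24
bus_width_bits = 384

bandwidth_gb_s = gbps_per_pin * bus_width_bits / 8  # bits/s -> bytes/s
print(bandwidth_gb_s)  # compare ~1008 GB/s for the 21 Gbps 4090
```

That would be roughly a 14% bandwidth bump over the 4090, in line with the claimed up-to-20% performance gain.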


----------



## sakete

EastCoast said:


> Pretty good interpretation of what might be going on. They too are using 850wat psu. I wonder what the results would be on the evga 1600watt T2?


I hope my Seasonic PX-1000 is good enough for my setup  Don't really want to shell out another $300 for a 1300W PSU.


----------



## sakete

KedarWolf said:


> Nvidia reportedly preparing RTX 4090 Ti cards with up to 20% increased performance over the RTX 4090
> 
> 
> If you thought RTX 4090 was too overpowered, wait until Nvidia launches the 4090 Ti models with the best binned AD102 dies. These beasts would get even more CUDA cores, up to 2.95 GHz boost core clock out of the box, 24 Gbps GDDR6X VRAM and 475 W TGP. Could probably cost an arm and a leg, too.
> 
> 
> 
> 
> www.notebookcheck.net


Starting at the low price of $2,499.99!


----------



## KedarWolf

sblantipodi said:


> I used the bundled 4 cables splitter and the Corsair 12vhpwr special cable for my power supply. Same crap.
> 
> Is the GPU broken?


Maybe a Windows clean install time? 

Or use DDU to remove the drivers in safe mode, then install them not in safe mode?


----------



## sblantipodi

sblantipodi said:


> Hi all, I received my Suprim X, I have a Corsair HX1200i and the new cable from Corsair.
> 
> I installed the card and all is well until I install the Nvidia driver, when I install the Nvidia driver it reboots abruptly, than it loads windows, it shows the desktop and it reboot, and again and again.
> 
> If I reboot in safe mode or if I uninstall Nvidia driver the pc works well.
> 
> Is my GPU faulty?


Any help here?

I tried both the bundled 4-cable splitter and the Corsair dedicated cable for my PSU.


----------



## sakete

KedarWolf said:


> Maybe a Windows clean install time?
> 
> Or use DDU to remove the drivers in safe mode, then install them not in safe mode?


Man, I really need to do that again, re-install Win10 from scratch (or maybe wait till I upgrade to Win11 in a year or so). When I went from a 4790K to a 3900X, and then to a 5950X (including swapping out mobo, etc.), I didn't even re-install Windows. Just deleted drivers, and installed new drivers. I had never done that before, but I was too lazy to have to re-install all my software. Haven't had any issues though.


----------



## KedarWolf

I find it strange the Corsair 12-pin adaptors only use two 8-pin to the one 12-pin. 🐺


----------



## EastCoast

sakete said:


> I hope my Seasonic PX-1000 is good enough for my setup  Don't really want to shell out another $300 for a 1300W PSU.


It really looks like we have to think outside the box and review the 4090 using 2 different PSUs, a top-end 850W vs a top-end 1600W, and see if there are any differences in performance at:
1080p
1440p
4K

Just to see if what we are seeing makes sense or not. This card should shine through resolution scaling, and for whatever reason it's not. With rumors of the 4090 Ti being put into production, I wonder if that card fixes the 4090 issue? From what I am hearing, Nvidia is saving the best dies for the 4090 Ti. 

What's more interesting to note is why Nvidia is rushing out the Ti when they usually wait a year or so by tradition. Is it the 7900 XT???


----------



## sakete

sblantipodi said:


> Any help here?
> 
> I tried both the bundled 4 cables splitter and the Corsair dedicated cable for my psu.


If it only happened because you put in the new GPU and you didn't change anything else, there's a chance you have a DOA. Does the GPU itself turn on at all? Can you get into BIOS?


----------



## KedarWolf

sakete said:


> Man, I really need to do that again, re-install Win10 from scratch (or maybe wait till I upgrade to Win11 in a year or so). When I went from a 4790K to a 3900X, and then to a 5950X (including swapping out mobo, etc.), I didn't even re-install Windows. Just deleted drivers, and installed new drivers. I had never done that before, but I was too lazy to have to re-install all my software. Haven't had any issues though.


Try DDU in safe mode.


----------



## dk_mic

sakete said:


> I hope my Seasonic PX-1000 is good enough for my setup  Don't really want to shell out another $300 for a 1300W PSU.




Netzteil-Armageddon dank RTX 4090? Praxistests mit aktuellen Netzteilen von 650 bis 1.650 Watt ("PSU Armageddon thanks to the RTX 4090? Hands-on tests with current power supplies from 650 to 1,650 watts. Is the RTX 4090 the feared PSU killer, or can our practical test give the all-clear for everyday use?")

www.pcgameshardware.de

German article quickly testing a 650W Platinum, a 750W Gold, and higher-rated PSUs: a 200W 12700K running Cinebench plus FurMark, with the card on a 33% raised power limit, didn't trip the 750W PSU.


----------



## sakete

EastCoast said:


> It really looks like we have to think outside the box and review the 4090 using 2 different PSUs. One top end 850w vs a top end 1600w and see if there is any differences in performance at:
> 1080p
> 1440p
> 4K
> 
> Just to see if what we are seeing make sense or not. This card should shine throught the resolution scaling. And for whatever reason it's not. With rumors of the 4090ti being put into production I wonder if that card fixes the 4090 issue? From what I am hearing Nvidia is saving the best dies for the 4090ti.
> 
> What's more interesting to note is why is Nvidia rushing out to release the TI when they usually wait a year or so as tradition? Is it the 7900xt???


I really am wondering if AMD has an ace up their sleeves this time, and they'll release a killer GPU, making Nvidia panic. Also typically Nvidia releases the xx80 first, or at least at the same time as the xx90. Them doing the xx90 first could be for marketing purposes to make everyone think they have a killer GPU and to draw all the attention.

That said, I don't think I'll get a AMD GPU - Nvidia drivers are more reliable overall.


----------



## GraphicsWhore

Anyone saving $200 or $500 and going with TUF/OC vs Strix? I was all set on the Strix but it's going on the Optimus so I'm happy not to pay $2k.


----------



## sakete

GraphicsWhore said:


> Anyone saving $200 or $500 and going with TUF/OC vs Strix? I was all set on the Strix but it's going on the Optimus so I'm happy not to pay $2k.


I went with the TUF OC. Would've gone with the regular TUF even if it didn't sell out immediately. Strix is absolutely unnecessary if you're putting it on water.


----------



## Luda

menko2 said:


> It will be a shame because my 10900k is great and already overclocked and memory tweaked. The 11900k temps and power draw are not worth upgrading but for sure I don't want to change motherboard.
> 
> It's imposible to know until someone tries the 4090 with a 10900k I guess.



Result - www.3dmark.com

Out of the box with some stuff open and apparently a slightly worse CPU tune, but that gives you an idea. Not bottlenecked per se, but there's definitely room to improve on the CPU side. 
The right score is a Kingpin 3090 on gaming clocks.


----------



## EastCoast

sakete said:


> I really am wondering if AMD has an ace up their sleeves this time, and they'll release a killer GPU, making Nvidia panic. Also typically Nvidia releases the xx80 first, or at least at the same time as the xx90. Them doing the xx90 first could be for marketing purposes to make everyone think they have a killer GPU and to draw all the attention.
> 
> That said, I don't think I'll get a AMD GPU - Nvidia drivers are more reliable overall.


Hard to say really. However, I've read that AMD and Nvidia keep tabs on each other through TSMC, so they know the other's performance before release. All rumors aside, Nvidia usually releases the x80 and x90 first while holding the x70; this time things are a bit different. That alone doesn't really tell us much. However, releasing a 4090 Ti this early? All I can say is that if that's actually true, it's big news, and it will signal those in the know to wait for the rest of the GPUs to drop first. 

As far as drivers go: I've had both Nvidia and AMD and neither is any better than the other, so I never base my decision on drivers alone. Both have had their ups and downs. It's price, performance and the holistic platform for me.


----------



## chispy

3DMark Port Royal benchmark world record has been destroyed by an RTX 4090 on liquid nitrogen

Average clock frequency = 3,360 MHz

I scored 30 061 in Port Royal
Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
www.3dmark.com

Takukou`s 3DMark - Port Royal score: 30061 marks with a GeForce RTX 4090. The GeForce RTX 4090 @ 2820/1462 MHz scores 30061 in the 3DMark - Port Royal benchmark. Takukou ranks #6 worldwide and #6 in the hardware class. Find out more at HWBOT.
hwbot.org

😃😁🙃


----------



## dpoverlord

kx11 said:


> I'm in
> 
> Aorus Xtreme waterforce, the guy sold it me thought it was MASTER model  i got away with it


Any chance you can give us the solid dimensions of both parts? I am curious on this one as I think this could actually fit in my case. This is a really unique design compared to the others out there. 

How is it running?


----------



## sakete

EastCoast said:


> Hard to say really. However, I've read that AMD and Nvidia keep tabs on each other through TSMC, so they know the other's performance before release. All rumors aside, Nvidia usually releases the x80 and x90 first while holding the x70; this time things are a bit different. That alone doesn't really tell us much. However, releasing a 4090 Ti this early? All I can say is that if that's actually true, it's big news, and it will signal those in the know to wait for the rest of the GPUs to drop first.
> 
> As far as drivers go: I've had both Nvidia and AMD and neither is any better than the other, so I never base my decision on drivers alone. Both have had their ups and downs. It's price, performance and the holistic platform for me.


Ok, good to know RE: drivers. I've had Radeon cards in the past, but it's been many years. I haven't decided yet if I'll stick with the 4090, but at least Nvidia typically has a lot of options for watercooling blocks, and that's important to me with my custom loop. But I may still in the end decide to return it and wait out all the releases in November. I'm also betting that the 4080 16GB won't be a huge difference to the 4090, at a significantly lower cost.

I honestly wasn't even in the market for a new GPU until I saw the review yesterday. Talk about being impulsive, lol.


----------



## Mad Pistol

I hate waiting. Wish I had a Micro Center near me.


----------



## ZealotKi11er

I owned 4090 for a good 2 hours and dont know what to do with it. There is nothing out there worthy of this cards performance atm or near future.


----------



## stahlhart

ZealotKi11er said:


> I owned 4090 for a good 2 hours and dont know what to do with it. There is nothing out there worthy of this cards performance atm or near future.


You could try 3DMark Speed Way.


----------



## Mad Pistol

ZealotKi11er said:


> I owned 4090 for a good 2 hours and dont know what to do with it. There is nothing out there worthy of this cards performance atm or near future.


Do you have the 3DMark Ray Tracing Feature Test? (Not Port Royal). Run that and report back.


----------



## Alemancio

ZealotKi11er said:


> I owned 4090 for a good 2 hours and dont know what to do with it. There is nothing out there worthy of this cards performance atm or near future.


[email protected] maxed out with RT


----------



## sblantipodi

sakete said:


> If it only happened because you put in the new GPU and you didn't change anything else, there's a chance you have a DOA. Does the GPU itself turn on at all? Can you get into BIOS?


If I uninstall the Nvidia driver it works. 
The problem happens when I install the Nvidia driver.


----------



## sakete

sblantipodi said:


> If I uninstall Nvidia driver it works.
> The problem happen when I install Nvidia driver


Do you still have an old Nvidia card? Put in that card, install the latest Nvidia driver and see what happens. If it still bootloops then, then it's the driver somehow not meshing with your system (a clean Windows install might resolve that). If that old card works fine with the latest driver, then it's probably the GPU, though you could still try doing a clean windows install. If you don't want to redo your whole system on the off-chance it won't make a difference, which I'd totally understand, then see if you can do a fresh Windows install on an external drive and boot from that, and then install drivers with new GPU to see if that works.

Also make sure you do a clean driver install (do custom driver installation instead of express install, and select "clean installation" or something like that. That should reset any setting you made in the control panel).


----------



## Sir Beregond

sakete said:


> Starting at the low price of $2,499.99!


People will still line up for it.


----------



## NeeDforKill

ZealotKi11er said:


> I owned 4090 for a good 2 hours and dont know what to do with it. There is nothing out there worthy of this cards performance atm or near future.


Hi, did you research information about the Gaming OC? It must pull up to 600W; the BIOS has a +33% power limit. Reviewers wrote about 600W when overclocked.
Can you tell us more about the Gaming OC? Tomorrow is the launch in my country, and at the start there will be only 3 models: Gigabyte Gaming OC, Gigabyte Windforce and MSI GAMING X TRIO.
Why does the MSI GAMING X TRIO have 3x 8-pin connectors when all the others have 4x 8-pin?
So I don't know what to choose


----------



## sakete

Sir Beregond said:


> People will still line up for it.


Oh for sure! I 100% will not. I was at Micro Center this morning already, debating if I should really go for the 4090. And then I let the sales rep upsell me to a 2-year in-store warranty, which with a 3-year warranty on the card isn't really necessary, though they do make it pretty hassle-free (I had to do it once before, I believe). And I'm still on the fence about keeping it, though now I'm at least tempted to open the box. I can still return it either way.


----------



## ZealotKi11er

+150 looks stable.
+200 artifacts. 

Would be interesting if the FE is better binned. 

We need more voltage. 

@NeeDforKill, yes, the Gigabyte Gaming OC comes with a 4x PCIe connector adapter and can do 133% ~ 600W.


----------



## WayWayUp

chispy said:


> 3DMark Port Royal benchmark world record has been destroyed by an RTX 4090 on liquid nitrogen
> 
> Average clock frequency = 3,360 MHz
> 
> I scored 30 061 in Port Royal - www.3dmark.com
> Takukou`s 3DMark - Port Royal score: 30061 marks with a GeForce RTX 4090 - hwbot.org
> 
> 😃😁🙃


I don't know what was used, but LN2 would produce better than -35C


----------



## Alemancio

Are there any major AIB differences so far that would make you incline more towards one brand/model?

AFAIK only Zotac has a low watt limit. Cooling seems to be within margin of error for almost all AIBs


----------



## sakete

Indeed a big boy. I've decided I'm going to at least install it to see how it runs.

TUF 4090 vs FTW3 3080 + Optimus block. 

And I LOVE quick disconnects!


----------



## NeeDforKill

ZealotKi11er said:


> +150 looks stable.
> +200 artifacts.
> 
> Would be interest if FE is better binned.
> 
> We need more voltage.
> 
> @NeeDforKill, yes Gigabyte Gaming OC comes with 4x PCIe connector and ca do 133% ~ 600W.


Thanks for the reply. You said above you can't get more than 520W. How much wattage do you get with +150 MHz, and how high does it boost?


----------



## KedarWolf

sblantipodi said:


> If I uninstall Nvidia driver it works.
> The problem happen when I install Nvidia driver


You're not listening to me. Google DDU (Display Driver Uninstaller), and use it in safe mode to fully and cleanly uninstall your drivers.

Then boot into Windows normally and install your drivers.

DDU can fix issues left over from older driver installs.

I hate holding a PC user's hand, it's not rocket surgery.

But use this, in safe mode:

Wagnardsoft : Computer software (DDU, ISLC), support and news.
www.wagnardsoft.com


----------



## mirkendargen

KedarWolf said:


> I find it strange the Corsair 12-pin adaptors only use two 8-pin to the one 12-pin. 🐺


Why? A 12VHPWR connector has 6 +12V wires and 6 ground wires. A Corsair PSU 8pin connector has 4 +12V wires and 4 ground wires. Why isn't 8 +12V and 8 ground wires enough?


----------



## kx11

WayWayUp said:


> can you share some temps and OCs with us when you get a chance? I want to see how how much difference the aio makes


----------



## Rei86

mirkendargen said:


> Why? A 12VHPWR connector has 6 +12V wires and 6 ground wires. A PSU 8pin connector has 4 +12V wires and 4 ground wires. Why isn't 8 +12V and 8 ground wires enough?


Because the FE models come with an ugly four-8-pin-PCIe-to-one-12VHPWR adapter.
Apparently you can run 3 of them with no/slight OC on the FE model, and go all four for the 600W BIOS.


----------



## 8472

EastCoast said:


> Can someone explain to me why I keep seeing weaker 1080p and 1440p benchmark results against last gen GPUs?
> Sure, sure it's not in every title. But it's in enough titles to actual take notice that this is a trend. Now, at 4K it more a winner in.
> But in lower resolutions it really struggles. This does not look like a cpu bottleneck issue. Something else is going on.


My guess is that in addition to CPU bottlenecks, it's either game engine limitations or Nvidia just optimized the drivers for 4k.


----------



## sblantipodi

sakete said:


> Do you still have an old Nvidia card? Put in that card, install the latest Nvidia driver and see what happens. If it still bootloops then, then it's the driver somehow not meshing with your system (a clean Windows install might resolve that). If that old card works fine with the latest driver, then it's probably the GPU, though you could still try doing a clean windows install. If you don't want to redo your whole system on the off-chance it won't make a difference, which I'd totally understand, then see if you can do a fresh Windows install on an external drive and boot from that, and then install drivers with new GPU to see if that works.
> 
> Also make sure you do a clean driver install (do custom driver installation instead of express install, and select "clean installation" or something like that. That should reset any setting you made in the control panel).


I have a 2080, with the 2080 it works no problem.

I even tried a fresh windows install. 
My 4090 works well until I install Nvidia driver, once I install Nvidia driver it starts bootlooping.
If I reboot in safe mode, it works well.

This is so weird


----------



## KedarWolf

mirkendargen said:


> Why? A 12VHPWR connector has 6 +12V wires and 6 ground wires. A Corsair PSU 8pin connector has 4 +12V wires and 4 ground wires. Why isn't 8 +12V and 8 ground wires enough?


I am pretty sure originally I saw like four 8-pin to one 12-pin adapters.


----------



## mirkendargen

Rei86 said:


> Because the FE models come with a ugly four 8pin PCIe connectors to one 12VHPWR connector.
> Apparently you can do 3 if you run with no OC/slight on the FE model and go all four for 600W BIOS.


Those don't plug into a PSU, they plug into PCIE 8pins. A Corsair PSU 8pin is not a PCIE 8pin, a PCIE 8pin has 3 12V/ground wires. It's not like there's black magic inside a PSU with a native 12VHPWR connector where it needs to combine 3 wires into one wire. A wire is a wire as long as the gauge is adequate for the current.
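His arithmetic checks out. Here's a rough sketch of the per-wire numbers (my assumptions, not anything from Corsair's spec sheet: 16 AWG conductors rated around 9.5A each, which is typical for these cables):

```python
# Rough per-wire current check for a 600 W load over a 12VHPWR cable.
# Assumptions (mine, not from an official spec sheet): 12 V rail,
# 6 live conductors, ~9.5 A rating per 16 AWG conductor.
WATTS = 600
VOLTS = 12.0
LIVE_WIRES = 6          # 12VHPWR carries 6 +12V and 6 ground conductors
WIRE_RATING_A = 9.5     # assumed per-conductor rating

total_current = WATTS / VOLTS           # 50 A total
per_wire = total_current / LIVE_WIRES   # ~8.33 A per conductor

print(f"total: {total_current:.1f} A, per wire: {per_wire:.2f} A")
print("within rating" if per_wire <= WIRE_RATING_A else "over rating")
```

So even at the full 600W, each conductor carries well under 10A, which is why two 8-pin bodies' worth of wires can be enough if the gauge is right.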


----------



## sakete

Whew, it barely fits, but it works. Had to ghetto rig a bunch of QDCs to keep the rest of my loop intact. I'll need to do some serious maintenance if I decide to keep this and eventually have a block for this 4090.

With a waterblock though it should take up a lot less space.


----------



## mirkendargen

-


----------



## superino091

sblantipodi said:


> I have a 2080, with the 2080 it works no problem.
> 
> I even tried a fresh windows install.
> My 4090 works well until I install Nvidia driver, once I install Nvidia driver it starts bootlooping.
> If I reboot in safe mode, it works well.
> 
> This is so weird


You're not the only one; another user is having the same problem.
You both have the same power supply in common, the AX1200i.
I sent you a private message.


----------



## KedarWolf

superino091 said:


> you see another user having the same problem.
> you have in common the power supply AX1200i
> I sent you a private message


Yeah, I Googled it, and someone changing their PSU fixed it. Or if you're not using the Corsair 2x8-pin to 12-pin adapter, buy one and try that.


----------



## Spiriva

sblantipodi said:


> I have a 2080, with the 2080 it works no problem.
> 
> I even tried a fresh windows install.
> My 4090 works well until I install Nvidia driver, once I install Nvidia driver it starts bootlooping.
> If I reboot in safe mode, it works well.
> 
> This is so weird


Can you try with another PSU?


----------



## Kampers

How can I order a 4090 FE in the EU? The Nvidia website? I know it's basically impossible to get one right now, but I'll try to buy one somewhere. Can I find the card at a local shop?


----------



## ZealotKi11er

So far for OC I'm getting:
+150 core pass, +200 fail
+1500 mem pass, +2000 fail

Perf uplift in 3DMark Speed Way: ~8.5%


----------



## Rei86

mirkendargen said:


> Those don't plug into a PSU, they plug into PCIE 8pins. A Corsair PSU 8pin is not a PCIE 8pin, a PCIE 8pin has 3 12V/ground wires. It's not like there's black magic inside a PSU with a native 12VHPWR connector where it needs to combine 3 wires into one wire. A wire is a wire as long as the gauge is adequate for the current.


Just pointing out the source of the confusion, as I would hazard a guess that anyone running the two-8-pin-to-one-12-pin cable will have issues.
Since in spec that's 75W + 150W(x2) vs 75W + 150W(x4).
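For reference, the nominal budgets being cited work out like this (standard spec numbers: 75W from the slot, 150W per 8-pin):

```python
# Nominal PCIe power budgets: 75 W from the slot plus 150 W per 8-pin.
SLOT_W = 75
EIGHT_PIN_W = 150

def nominal_budget(n_eight_pin: int) -> int:
    """In-spec board power with n 8-pin connectors feeding the adapter."""
    return SLOT_W + n_eight_pin * EIGHT_PIN_W

print(nominal_budget(2))  # 375 W -- two 8-pins, on paper short of a 450 W card
print(nominal_budget(4))  # 675 W -- four 8-pins, comfortably over 600 W
```

Worth noting the 150W figure is a spec floor, not the physical limit of the conductors, which is the point others are making about the Corsair 2x8-pin modular cable being rated for the full 600W.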



EastCoast said:


> Can someone explain to me why I keep seeing weaker 1080p and 1440p benchmark results against last-gen GPUs?
> Sure, it's not in every title, but it's in enough titles to actually take notice that this is a trend. At 4K it's more of a clear winner.
> But at lower resolutions it really struggles. This doesn't look like a CPU bottleneck issue. Something else is going on.


CPU bottlenecks, plus hitting game engine limitations in certain games.


----------



## sakete

Ran a benchmark in Cyberpunk at 1440P, everything maxed out. Total board power draw as reported by GPU-Z was 370W.


----------



## Spiriva

Kampers said:


> How can I order a 4090 FE in the EU? The Nvidia website? I know it's basically impossible to get one right now, but I'll try to buy one somewhere. Can I find the card at a local shop?


From what I've read, most EU countries didn't get any FE cards at all. I think Germany got some, though.


----------



## mirkendargen

Rei86 said:


> Just saying for the confusion as I would hazard to guess anyone running the two 8pin to one 12pin will have issues.
> Since in spec 75+150(2) vs 75+150(4).


I think the confusion is what Corsair is selling. They're selling a modular PSU cable, not an adapter. The only spec on modular cables plugging into Corsair PSU's is whatever Corsair decides based on the connectors they source.


----------



## tps3443

sakete said:


> Ran a benchmark in Cyberpunk at 1440P, everything maxed out. Total board power draw as reported by GPU-Z was 370W.


Hey, can you share your exact video graphics settings? I want to run my 3090 and get an idea of the uplift. I also run at 2560x1440.


----------



## Kampers

Spiriva said:


> From what Ive read most EU countries didnt get any FE cards and all. I think Germany got some tho.


OK, but do we have any info on who's selling the card? The Nvidia website, or a shop?


----------



## Luda

kx11 said:


> I'm in
> 
> Aorus Xtreme Waterforce, the guy who sold it to me thought it was the MASTER model  I got away with it


That combo looks amazing! the screen on the monoblock is very slick. How are you liking the waterforce?


----------



## sakete

With DLSS on Quality and everything else on High or Ultra (whatever the max value was, except for Psycho, where I left it on Ultra), max power draw was 316W or 70%.


----------



## sakete

tps3443 said:


> Hey, can you share your exact video graphics settings? I want to run my 3090 and get an idea of the up lift. I also run 2560x1440P.


I had DLSS at Off, and all the other sliders as far to the right as possible (so some max out at High, or Ultra or Psycho), though I think I had things like Motion Blur and Lens Flare (and two other settings in that section) set to off.


----------



## Spiriva

Kampers said:


> Ok but have we info, who's sell the card? Nvidia Website, shop ?


The Nvidia webshop is the only place to get FE cards in the EU as far as I know, and most EU countries didn't get any FE cards to sell on the Nvidia website. If you want an FE card, your best bet is probably to look on Nvidia's German page, or to try buying one from Best Buy and hope they ship to the EU. But that will probably cost more than just buying the Strix in the EU, due to import tax and shipping.


----------



## slayer6288

I see someone kindly uploaded some Suprim BIOSes. Does anyone know if the latest nvflash works with the 4090 yet?


----------



## sakete

Day 1 and everyone is flashing bioses already. Love it!


----------



## Glottis

sakete said:


> Whew, it barely fits, but it works. Had to ghetto rig a bunch of QDCs to keep the rest of my loop intact. I'll need to do some serious maintenance if I decide to keep this and eventually have a block for this 4090.
> 
> With a waterblock though it should take up a lot less space.


I see you have the TUF. Is it the OC model? Can you confirm how high the power slider goes and whether it's a 600W BIOS?


----------



## kx11

Luda said:


> That combo looks amazing! the screen on the monoblock is very slick. How are you liking the waterforce?


A lot better than I thought. The mobo was in a rough state in early 2022, but after a month of BIOS updates it got stable and steady. The GPU is my first ever Waterforce and I like what it can do.


----------



## Rei86

mirkendargen said:


> I think the confusion is what Corsair is selling. They're selling a modular PSU cable, not an adapter. The only spec on modular cables plugging into Corsair PSU's is whatever Corsair decides based on the connectors they source.


Eh, I didn't follow the start of this, but I guess...

I mean, it seems cut and dried to me comparing the EVGA and Corsair cables (images snipped).

I just don't understand how it can supply 600W.


----------



## slayer6288

sakete said:


> Day 1 and everyone is flashing bioses already. Love it!


Does it work though? I figured, why pay extra for a Suprim if I can flash its BIOS onto a Gaming Trio? My Trio runs a 3000MHz core clock fine, no issues right now.


----------



## sakete

Glottis said:


> I see you have TUF. Is it OC model? Can you confirm how high power slider goes and is it 600W bios?


It's the OC model. I can drag the power limit all the way to 133% in Afterburner. I don't do a lot of OC'ing, especially not with GPUs. Is that what you're looking for? And I have no idea how to check if I have the 600W BIOS. My goal will be to undervolt to limit temps and power draw.
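That 133% slider cap is itself a decent tell. Assuming Afterburner's percentage is relative to the 450W reference default (a back-of-envelope check, not anything from Asus):

```python
# If Afterburner's power-limit percentage is relative to the 450 W
# reference default (an assumption), a 133% cap implies ~600 W.
DEFAULT_W = 450          # assumed stock power limit
SLIDER_MAX = 1.33        # 133% cap reported in Afterburner

max_power = DEFAULT_W * SLIDER_MAX
print(f"{max_power} W")  # 598.5 W, i.e. effectively a 600 W BIOS
```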


----------



## sakete

Man, this thing is dumping a ridiculous amount of heat into my office. I thought the 3080 was bad, wow.


----------



## sblantipodi

KedarWolf said:


> Yeah, I Googled it, and someone changing their PSU fixed it. Or if you're not using the Corsair 2x8-pin to 12-pin adapter, buy one and try that.


Can you link me the thread with the user on the HX1200i that has the same problem, please?

I don't have another PSU to try.


----------



## RaMsiTo

TechPowerUp


Extensive repository of graphics card BIOS image files. Our database covers submissions categorized by GPU vendor, type, and board partner variant.




www.techpowerup.com


----------



## bmagnien

...maybe I don't need to upgrade yet. Top spot in Speed Way for the base 3090 on an ambient closed loop:








I scored 6 411 in Speed Way


AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 3090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## GAN77

sblantipodi said:


> Can you link me the article with the user that uses hx1200i that have the same problem please?


You have to check which mode the PSU is set to: single or multiple rail mode.


----------



## slayer6288

RaMsiTo said:


> TechPowerUp
> 
> 
> Extensive repository of graphics card BIOS image files. Our database covers submissions categorized by GPU vendor, type, and board partner variant.
> 
> 
> 
> 
> www.techpowerup.com


does the current nvflash work though?


----------



## sblantipodi

GAN77 said:


> You have to check what mode the block has, Single/multiple line mode


Tried both modes; same problem.


----------



## tubs2x4

J7SC said:


> ...this one could be interesting for those who do not want to do a full custom block / loop setup - at least it's a 360 rad AIO (price in Can$).
> View attachment 2575611


About $1000 too high


----------



## RaMsiTo

slayer6288 said:


> does the current nvflash work though?


My card arrives Friday or Monday, I don't know. Someone try it hehehe


----------



## Glottis

sakete said:


> It's the OC model. I can drag the power limit all the way to 133% in Afterburner. I don't do a lot of OC'ing, especially not with GPUs. Is that what you're looking for? And I have no idea how to check if I have the 600W BIOS. My goal will be to undervolt to limit temps and power draw.


Thanks, this sounds like the 600W BIOS. You can check all the info with GPU-Z, and you can also use GPU-Z to upload the BIOS to the TechPowerUp database, since there isn't a TUF BIOS uploaded yet.


----------



## J7SC

Sir Beregond said:


> People will still line up for it.


...it's like going to a reasonably good steakhouse in inflationary times and either paying $41 for a 6 oz steak - or $57 for a 10 oz


----------



## Templar2k

Got my Palit 4090 GameRock OC today. I can say right away: if you have a 3080/3090/Ti, this upgrade is still worth it. The card runs 10-15C cooler than even a repasted and repadded Asus 3090 Ti TUF using Thermal Grizzly pads and paste; the vapor chamber over the GPU and RAM is one serious upgrade.
The speed, fracking hell. I haven't done any measurements, but everything that used to run smooth now runs smoother, which suggests latency and the 0.1% lows are improved.
It's a pretty thing too, here's a pic of the rig.
Worth it? Depends, but for hardware tinkerers and PC power users it's a must.








Photos from 12 Oct 2022






www.jottacloud.com


----------



## motivman

MSI Gaming Trio... hitting the 450W limit in Port Royal... man, I wish I got one of those 600W cards. Currently #22 in the world for single cards. The card hits about 3030MHz on the core clock...









I scored 26 900 in Port Royal


Intel Core i9-12900KS Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





Quick disconnects for the win. Swapping my Watercooled 3090 strix to this was EZ Peezy....


----------



## sakete

Glottis said:


> Thanks, this sounds like 600W BIOS. You can check all info with GPU-Z and also using GPU-Z you can upload BIOS to TechPowerUp database, since there isn't TUF bios uploaded yet












I tried uploading the BIOS, but it says BIOS reading is not supported on this device.


----------



## cletus-cassidy

sakete said:


> I went with the TUF OC. Would've gone with the regular TUF even if it didn't sell out immediately. Strix is absolutely unnecessary if you're putting it on water.


I did the exact same thing with same reasoning (I targeted TUF OC because I assumed regular TUF would sell out faster). Planning on the Optimus Block when it comes out. Only question for me is whether better VRM on Strix will impact OC, but based on my experience with recent Nvidia cards, I suspect the answer is no.


----------



## dr/owned

Is bmgjet still around here? They were the one who put out the Ampere BIOS editor (ABE) that showed power limits on the minor rails. I can dig through my server to see if I still have a copy and whether it works. (EDIT: No, "Unsupported Device")

And yeah, someone try running nvflash. Start by trying to flash your own BIOS backup.


----------



## sakete

Hmm, when I run NVFlash64, I get ERROR: No NVIDIA display device adapters found. And when I try exporting my vBIOS in GPU-Z I get "BIOS reading not supported on this device".


----------



## 8472

Looks like my card from Newegg won't ship out until tomorrow. Doesn't make sense for there to be a delay in shipping.


----------



## motivman

Anyone know if cards using three PCIE cables can eventually run the 600W bios?


----------



## dr/owned

sakete said:


> Hmm, when I run NVFlash64, I get ERROR: No NVIDIA display device adapters found. And when I try exporting my vBIOS in GPU-Z I get "BIOS reading not supported on this device".


Running the latest version of GPU-Z? v2.50.0 lists 4090 support. It must be working for other people, because there's a bunch of 4090 BIOSes in the TPU database now.


----------



## sakete

dr/owned said:


> Running the latest version of GPU-Z? v.2.5.0 says 4090 support. It must be working for other people because there's a bunch of 4090 BIOS's in the TPU database now.


Yep, running 2.50.0. Even tried disabling the GPU in Device Manager and then exporting, but no luck.


----------



## mirkendargen

motivman said:


> Anyone know if cards using three PCIE cables can eventually run the 600W bios?













Ground the correct sense pins on your adapter and the card won't know. Obviously, do this at your own risk: there isn't much current going through the sense pins, but there's a lot going through the main 12 pins, and you don't want to mess those up.
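For anyone wondering what "correct sense pins" means: the 12VHPWR sideband pins SENSE0/SENSE1 advertise the cable/PSU capacity to the card. This is the ATX 3.0 / PCIe CEM 5.0 table as I understand it; treat it as a sketch and check the actual spec before soldering anything:

```python
# 12VHPWR sideband signalling: (SENSE0, SENSE1) -> advertised max power.
# Values per the ATX 3.0 / PCIe CEM 5.0 table as commonly published;
# verify against the real spec before modding hardware.
SENSE_TABLE = {
    ("gnd",  "gnd"):  600,  # both pins grounded: full 600 W
    ("open", "gnd"):  450,
    ("gnd",  "open"): 300,
    ("open", "open"): 150,  # both floating: minimum 150 W
}

def advertised_limit(sense0: str, sense1: str) -> int:
    """Power limit the card reads from the sideband pin states."""
    return SENSE_TABLE[(sense0, sense1)]

print(advertised_limit("gnd", "gnd"))   # 600
```

So grounding both sense pins tells the card the full 600W is available, regardless of what's actually feeding the adapter, which is exactly why it's at-your-own-risk.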


----------



## yzonker

dr/owned said:


> Running the latest version of GPU-Z? v.2.5.0 says 4090 support. It must be working for other people because there's a bunch of 4090 BIOS's in the TPU database now.


Well, the weird thing is that it had been showing 15 BIOSes available for the last couple of days or more, but they were not visible. So I'm not sure those were uploaded by people who got cards today.

The device ID appears to be the same on all the ones I looked at, including the FE, so that's possibly good news for flashing.


----------



## motivman

mirkendargen said:


> View attachment 2575665
> 
> 
> 
> Ground the correct sense pins on your adapter and the card won't know. Obviously, do this at your own risk, there isn't much current going through the sense pins but there's a lot going through the main 12 pins that you don't want to mess with.


What if the 600W BIOS is flashed to the card, and I then use a cable with all four PCIe power connectors? Will that work instead? Or maybe there are additional restrictions on the PCB preventing it? I'll have to compare the PCBs of the Suprim X and Gaming Trio to see if there's a difference in the power sections.


----------



## mirkendargen

motivman said:


> what if the 600W bios is flashed to the card, and then use a cable with the 4 PCIE power cables? will that work instead?


If the card can do 600W with the right sense pins grounded, it already has a 600W BIOS. Maybe some BIOSes ignore the sense pins, but I doubt it (other than XOC BIOSes, which aren't public).


----------



## kx11




----------



## kx11

sakete said:


> View attachment 2575663
> 
> 
> I tried uploading BIOS, but it say BIOS reading not supported on device.


----------



## dr/owned

kx11 said:


>


Hol up...that giant ass card that is the size of a SFF case...has a 500W bios?!










(EDIT: oh probably not grounding both the sense pins? that would make sense)


----------



## kx11

dr/owned said:


> Hol up...that giant ass card that is the size of a SFF case...has a 500W bios?!
> 
> View attachment 2575675


Nope, it's Xtreme WF the one with AiO


----------



## sakete

kx11 said:


> Nope, it's Xtreme WF the one with AiO


That's even more ridiculous.


----------



## Shaded War

8472 said:


> Looks like my card from Newegg won't ship out until tomorrow. Doesn't make sense for there to be a delay in shipping.


Same here, hasn’t shipped yet.

It makes sense if you know what goes on in product distribution. Sales may look over the order before a picker gets notified to pick it. Then the item needs to make it to the packager. Then there's shipping. There are at least 3-4 people spending time on your order before it ever touches a UPS truck. Pile 10,000+ extra orders on top of what people normally do in a day, and there will be delays.


----------



## dr/owned

RetroWave78 said:


> What angers me is Nvidia dicking all of is around with 6 am PST order time. What a colossal waste of REM deep sleep.
> 
> Like seriously, don't ****ing waste our ****ing time.


You get used to it. For both the 3090 and now the 4090, Amazon has been the MVP of actually letting me buy it.


----------



## tconroy135

Asus 4090 not posting with a 1600W Seasonic PSU; there's a red light on the 4090 near where the adapter plugs in. Also tried a 1600W EVGA G+ and a 1300W Silverstone Titanium. Any suggestions?


----------



## motivman

tconroy135 said:


> Asus 4090 not posting with 1600w seasonic psu red light on 4090 near where adapter plugs into 4090. Tried 1600w EVGA g+ AND 1300 w silverstone titanium. Any suggestions?


sheeesh.... that's TUF bruv


----------



## gerardfraser

sblantipodi said:


> I have a 2080, with the 2080 it works no problem.
> 
> I even tried a fresh windows install.
> My 4090 works well until I install Nvidia driver, once I install Nvidia driver it starts bootlooping.
> If I reboot in safe mode, it works well.
> 
> This is so weird


Took me 8 hrs on the RTX 4090 to get rid of this crap. I only got HDMI post working so far; DisplayPort isn't working with 3 different HDR monitors on any port.

How I finally got HDMI to work on the RTX 4090:

iGPU plugged into HDMI.
Had the RTX 3090 plugged in and installed; the RTX 4090 would not do crap, although I did get it to post twice in 8 hrs.
Reinstalled Windows 11 through HDMI on the iGPU, tossed in the RTX 4090, and it's finally working fine, but only over HDMI.

Going to start drinking now and watch some hockey. Will tackle DisplayPort tomorrow.

RTX 4090 Gigabyte OC
♦ CPU - AMD Ryzen™ 7 7700X with MSI MAG Core Liquid 360R V2
♦ GPU - GIGABYTE GeForce RTX 4090 GAMING OC
♦ RAM - CORSAIR Vengeance 32GB CL36 6000MT/s
♦ Mobo - ASRock X670E Steel Legend (BIOS 1.08.AS02 Beta)
♦ SSD - ADATA XPG GAMMIX S70 Blade Read: 7400MB/s; Write: 5500MB/s
♦ DSP - LG B9 65" 4K UHD HDR OLED G-Sync over HDMI
♦ PSU - EVGA GQ 1000W
♦ CASE - Phanteks Eclipse 500A


----------



## Mad Pistol

8472 said:


> Looks like my card from Newegg won't ship out until tomorrow. Doesn't make sense for there to be a delay in shipping.


Same here. Originally said it was shipping out today... oh well.


----------



## Alelau18

Some reviewers had issues with Zen 4 and the RTX 4090, with the motherboard refusing to post. Make sure your motherboard BIOS is fully updated.

__ https://twitter.com/i/web/status/1579014774901260288


----------



## sniperpowa

I ended up getting an MSI Suprim Liquid X, ordered from Best Buy. It was pretty hard to get a card lol, I tried for like 5 hours before I got to checkout. Damn bots are hard to compete with.


----------



## heptilion

sniperpowa said:


> I ended up getting a msi suprim liquid x ordered from Best Buy. It was pretty hard to get a card lol I tried for like 5 hours before I got to checkout damn bots are hard to compete with.


Hi can you post a GPU-Z screenshot? Would like to see power target


----------



## RaMsiTo

..


----------



## RaMsiTo

heptilion said:


> Hi can you post a GPU-Z screenshot? Would like to see power target


 530w


----------



## heptilion

RaMsiTo said:


> 530w


Thanks. Is that the default? Max is 530W?
Why do these AIO GPUs have a lower power target? Are they more efficient?


----------



## RaMsiTo

heptilion said:


> Thanks. Is that the Default? Max is 530W?
> Why are these AIO GPUs have lower power target? Are they more efficient?


Suprim X Liquid: default 450W, max 530W
Suprim X: default 450W, max 520W


----------



## LunaP

Talked to a few different Best Buy reps, both virtual and on the phone; everyone has the same story. Check back next month, they might have a few more units in stock, and they mention shortages due to industry stuff like it's 2021 again or something. Just hung up and went to bed.


----------



## dr/owned

Has anyone else tried nvflash? I'm itchy to see if it even works.

4090 BIOS's: TechPowerUp
Latest NVFlash that hasn't been updated in a few months: NVIDIA NVFlash (5.735.0) Download

nvflash64.exe --protectoff
nvflash64.exe -6 [bios].rom
yes, yes
good luck everyone
profit.


----------



## slayer6288

dr/owned said:


> Has anyone else tried nvflash? I'm itchy to see if it even works.
> 
> 4090 BIOS's: TechPowerUp
> Latest NVFlash that hasn't been updated in a few months: NVIDIA NVFlash (5.735.0) Download
> 
> nvflash64.exe --protectoff
> nvflash64.exe -6 [bios].rom
> yes, yes
> good luck everyone
> profit.


Does not seem to work. Needs to be updated for the 4090, methinks.


----------



## th3illusiveman

LunaP said:


> Talked to a few different best buy reps, both virtual and on the phone, everyone seems to have the same story to share, check back next month might have a few more units in stock and mention shortage due to industry stuff like its 2021 again or something. Just hung up and went to bed.


Yea, figured as much. I'm sure supply will be fine by mid-November. There are only so many people willing to throw $1600+ USD at a graphics card, after all, and scalping these seems like a fool's errand.

Right now, just sit back and enjoy other people beta testing and finding the limits of the cards, and maybe the upcoming RDNA and "4080" reviews. Like our friend over here:



gerardfraser said:


> Took me 8 Hrs on RTX4090 to get rid of this crap,I only got HDMI post working so far,Display ports with 3 different HDR monitor not working on any port.
> 
> How I finally got HDMI to work on the RTX 4090.
> 
> IGPU plugged into HDMI.
> Had RTX 3090 plugged in and insalled, RTX 4090 would not do crap although I did get a post 2 times in 8 hrs.
> Reinstalled Windows 11 through HDMI on IGPUI,tossed in RTX 4090 and finally working fine only on HDMI.
> 
> Going to start drinking now and watch some hockey.Will tackle display ports tomorrow.
> 
> RTX 4090 Gigabyte OC
> ♦ CPU - AMD Ryzen™ 7 7700X With MSI MAG Core Liquid 360R V2
> ♦ GPU -GIGABYTE GeForce RTX 4090 GAMING OC
> ♦ RAM - CORSAIR Vengeance 32GB CL36 6000MT's
> ♦ Mobo - ASRock X670E Steel Legend (BIOS 1.08.AS02 Beta)
> ♦ SSD - ADATA XPG GAMMIX S70 Blade Read:7400MB/s; Write: 5500MB/s
> ♦ DSP - LG B9 65" 4K UHD HDR OLED G-Sync Over HDMI
> ♦ PSU - EVGA GQ 1000W
> ♦ CASE -Phanteks Eclispe 500A


----------



## dr/owned

slayer6288 said:


> Does not seem to work. Needs to be updated for the 4090 me thinks.


Probably, but if you want to go a bit sketchier, I guess this rando redditor made a version that was supposed to be looser for the 3000 series:

https://www.reddit.com/r/ZephyrusG15/comments/np5xoc/_/hi5iowz

The -6 flag apparently isn't needed with that version.


----------



## ttnuagmada

pretty disappointed. Had pre-ordered an FE block and was going to try to jump on one today. Decided to cancel the pre-order and just not upgrade for now since it doesn't seem like they're going to be easy to get anytime soon.


----------



## th3illusiveman

RetroWave78 said:


> "Quantity: 3 sold"
> 
> Not one of us who got up a 5:30 am to F5 for 45 min could get one but this scalper has already sold 3 at $2900.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA GeForce RTX 4090 24GB GDDR6X Graphics Video Card 900-1G136-2530-000 for sale online | eBay
> 
> 
> Find many great new & used options and get the best deals for NVIDIA GeForce RTX 4090 24GB GDDR6X Graphics Video Card 900-1G136-2530-000 at the best online prices at eBay! Free shipping for many products!
> 
> 
> 
> www.ebay.com
> 
> 
> 
> 
> 
> Inside job?
> 
> What angers me is Nvidia dicking all of is around with 6 am PST order time. What a colossal waste of REM deep sleep.
> 
> Like seriously, don't ****ing waste our ****ing time.
> 
> Notice also how Nvidia owned sub-reddit locked down the "Tustin CA Microcenter Campout" thread because of all the negative publicity and they aren't letting anyone vent their anger there.
> 
> 
> __
> https://www.reddit.com/r/nvidia/comments/y1trff
> 
> *** Nvidia, *** Best Buy, *** scalpers and *** this dying hobby.
> 
> I'm putting the $1750 I would've dumped on this towards prepping for the Greatest Depression. This was and is a waste of time, mental energy and money.
> 
> I bet Jensen Wang is among the Satanic Cabal behind the Great Reset, probably rode Lolita Express and visited Bohemian Grove to participate in the "Cremation of Care" and literally derives pleasure in the suffering of others.


It's a graphics card, dude... take a chill pill lol.


----------



## inedenimadam

Shaded War said:


> How is the colling performance and fan noise levels? Ordered the non x model of this and dont see reviews yet.


Surprisingly good considering the TDP. The fans are not bad on the stock curve, but are loud at 100%.


----------



## inedenimadam

sakete said:


> Day 1 and everyone is flashing bioses already. Love it!


Not yet! NVFLash needs an update.


----------



## J7SC

FYI - some more quick reviews of a few custom AIB models...


----------



## dr/owned

Let's see who's first to shunt mod it.
C'mon ya crazy bastards, entertain me while I wait 2 weeks for Amazon to get mine out the door! (Usually they deliver in a week even when they quote some far-out dates, but my TUF 3090 did take 6 weeks to ship.)


----------



## dante`afk

I don't see how shunt modding would do anything here


----------



## J7SC

...this gen seems to be more about volt mods, at least for the upper-tier cards. DerBauer is checking (per his YT comment) whether Elmor's latest EVC2 might be able to do something.


----------



## ZealotKi11er

Very impressed with the Gaming OC. Temps are better than my 3080 TUF, which was drawing about 100W less. Also very quiet. I've confirmed this one has the 600W BIOS.


----------



## ZealotKi11er

I am not sure how much higher we can safely go over 1.1v. That already seems high for 5nm.


----------



## mattskiiau

Can anyone confirm if the Gaming X Trio has 600w bios?


----------



## ponycar50l

I got a 4090 Gigabyte OC. Don't have a spare cable. Anyone know if it's fine to use only 3x8-pin like the FE model?


----------






## dr/owned

mattskiiau said:


> Can anyone confirm if the Gaming X Trio has 600w bios?


I believe all the MSI cards we've seen are 520/530W, at least the Suprim, which is supposed to be a higher tier.


----------



## nyk20z3

th3illusiveman said:


> its a graphics card dude.... take a chill pill lol.


It's the era we live in now: people need instant gratification instead of being patient. No one needs this card on launch day; it's complete overkill for 99% of people, but this will happen every single time. I've been in the game since the early 2000s, so it's not even about being salty. It's actually interesting to see so many people so quick to let go of so much money and get stressed out over a graphics card. Eventually, everyone who wants one will get one.


----------



## mattskiiau

dr/owned said:


> Believe all the Msi cards we’ve see are 520/530W. At least the Suprim which is supposed to be a higher tier.


Damn, I think I got scammed jumping on the Trio X.
The Gigabyte OC looks to be 600W, is cheaper, and includes the 4x adapter, whereas the Trio X I have coming is only ~520W with a 3x adapter.


----------



## dr/owned

mattskiiau said:


> Damn think I got scammed jumping on the Trio X.
> The gigabyte OC looks to be 600w, is cheaper and inclusion of 4x adaptor where as the Trio X I have coming is only 520w~ with 3x adaptor.


It may not matter most of the time, judging by the reviews. If GPU-Z isn't reporting PerfCap = Pwr, it's going full speed.

Plus, it's only a matter of time before nvflash gets updated and you can drop any 600W BIOS on there. At least this gen we don't have to worry about 2- vs 3-connector cards and the BIOS not being fully compatible.


----------



## mirkendargen

Anyone who has the cheapest Gigabyte card (Windforce) able to speak to the power limit? I know the Gaming OC is 600W, but I couldn't immediately find info on the Windforce and I don't want to dig through stupid YouTube videos looking for it.


----------



## sakete

For those still needing to pull the trigger for the next round, Asus TUF OC definitely has a 600W bios.

I'll be testing some undervolting.


----------



## mattskiiau

Welp, too late to swap now!


----------



## Blameless

Would have liked to order an FE today, but that was virtually the only model I never saw in stock. Unfortunately, it's also the only one that will fit in my intended build.


----------



## sblantipodi

gerardfraser said:


> Took me 8 Hrs on RTX4090 to get rid of this crap,I only got HDMI post working so far,Display ports with 3 different HDR monitor not working on any port.
> 
> How I finally got HDMI to work on the RTX 4090.
> 
> IGPU plugged into HDMI.
> Had RTX 3090 plugged in and insalled, RTX 4090 would not do crap although I did get a post 2 times in 8 hrs.
> Reinstalled Windows 11 through HDMI on IGPUI,tossed in RTX 4090 and finally working fine only on HDMI.
> 
> Going to start drinking now and watch some hockey.Will tackle display ports tomorrow.
> 
> RTX 4090 Gigabyte OC
> ♦ CPU - AMD Ryzen™ 7 7700X with MSI MAG Core Liquid 360R V2
> ♦ GPU - GIGABYTE GeForce RTX 4090 GAMING OC
> ♦ RAM - CORSAIR Vengeance 32GB CL36 6000MT/s
> ♦ Mobo - ASRock X670E Steel Legend (BIOS 1.08.AS02 Beta)
> ♦ SSD - ADATA XPG GAMMIX S70 Blade Read: 7400MB/s; Write: 5500MB/s
> ♦ DSP - LG B9 65" 4K UHD HDR OLED G-Sync over HDMI
> ♦ PSU - EVGA GQ 1000W
> ♦ CASE - Phanteks Eclipse 500A


I tried HDMI and another monitor, same problem.


----------



## sblantipodi

Alelau18 said:


> Some reviewers had issues with Zen 4 and RTX 4090 with the motherboard refusing to post, be sure you have the motherboard bios fully updated
> 
> __ https://twitter.com/i/web/status/1579014774901260288


Using z690 Extreme from Asus with latest BIOS


----------



## WayWayUp

So it looks like the 4090 was the first graphics card to hit 20k in port royal.
Surprised the 3090 Ti never hit it with an unlocked bios, volt mods, LN2, tweaks and hundreds/thousands of runs

but the crazy thing is… 30k has also been posted with the 4090 and it’s literally the first day

30k port royal…..

Do you guys even remember when the 3080 launched and ppl were doing 11.5k with their 3080s?

most insane gen over gen jump in a longggg time


----------



## DokoBG

Hey boiz, what's the consensus on the MSI Gaming X Trio bios (3x8pin)? 520W? I have one of these coming my way in a few days. Can we use a 4x8pin + some high power bios on these? We need answers!!!!!


----------



## gerardfraser

sblantipodi said:


> I tried HDMI and another monitor, same problem.


Like I said, over 8 hrs until I got HDMI to work. I feel bad for novice computer people, this is ridiculous.


----------



## sblantipodi

gerardfraser said:


> Like I said, over 8 hrs until I got HDMI to work. I feel bad for novice computer people, this is ridiculous.


But how did you get it to work?
Did you have a problem similar to mine, with the PC rebooting itself as soon as Windows loads to the desktop?

If I uninstall the Nvidia driver the problem vanishes, but then I'm stuck with a 2D card. Damn.


----------



## WayWayUp

These prices… whether they sell or not, will unfortunately attract more scalpers
I imagined them getting scalped for 1900-2200
This gap tho is insane.
scumbags don’t even include shipping LOL


----------



## sirneb

mirkendargen said:


> Anyone have the cheapest Gigabyte card (Windforce) that can speak to the power limit? I know the Gaming OC is 600w but I couldn't immediately find info on the Windforce and I don't want to dig through stupid youtube videos looking there


I'd love to know as well since that's what I was able to get my hands on. This review shows it drawing up to 563W: Gigabyte RTX 4090 Windforce 24G Review | Power and Temperatures | GPU & Displays


----------



## mirkendargen

DokoBG said:


> Hey boiz , wats the consensus on the MSI Gaming X trio bios (3x8pin) ? 520W ? I have one of these coming in my way in a few days. Can we use a 4x8pin + some high power bios on these ? We need answers !!!!!


There's no such thing as a 3x8pin card or a 4x8pin card. They're all 1x16pin cards. You're just talking about the adapter they come with which doesn't necessarily need to be used.


----------



## Antsu

Well damn, I slept through the release hoping there would be some left at these absurd prices of 2400€ for a TUF, but of course there aren't. Oh well, I hope they at least get blocks out faster this time so I can slap one on the card when I get it.


----------



## Mad Pistol

Welp... this is what I get for jumping the gun on a 4090. My case would barely fit it with ZERO room to breathe, and I can't find a 3rd PCIe power cable anywhere, which means the card could not be powered. So...











I thought about building a new system from scratch, but that will have to wait for another day.


----------



## Thebc2

I feel your pain. My motherboard tray saved the day 🤣




















Sent from my iPhone using Tapatalk Pro


----------



## sakete

Well I've been playing Cyberpunk for the past half hour, with most settings maxed out and DLSS on Quality. On 1440p. 

Undervolted at 910mv, getting ~50% power draw, 230W peak power draw, 60C temp, 67C hotspot. This has been stable. I had tried lower voltages but Cyberpunk would then instantly crash. I'm gonna go with this setting for now.

Getting 75-110 fps, depending on the complexity of the scene being rendered.
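Those readings are internally consistent if the ~50% figure is measured against the 450W stock limit (an assumption about what the monitoring tool treats as 100%):

```python
# Sanity check on the reported undervolt numbers: 230 W peak against
# the 450 W stock power limit (assumed to be the tool's 100% reference).
peak_w, reference_w = 230, 450
print(round(100 * peak_w / reference_w))  # 51 -> matches the ~50% reading
```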


----------



## slayer6288

Anyone try flashing a bios yet or does the nvflash version also posted here fail?


----------



## gerardfraser

sblantipodi said:


> But how did you get it to work?
> Did you have a problem similar to mine, with the PC rebooting itself as soon as Windows loads to the desktop?
> 
> If I uninstall the Nvidia driver the problem vanishes, but then I'm stuck with a 2D card. Damn.


I explained how I made it work after 8 hrs, on HDMI only. It sucks that DisplayPort is not working yet, but I started drinking and will try tomorrow. I think it is a BIOS problem on my side; may not be on yours.


----------



## sakete

Man, sorry some of you guys are having issues. Nothing worse than buying new hardware and it's non-stop problems and troubleshooting to resolve it. Ugh.

I've been there, but thankfully not this time.


----------



## Thebc2

Very impressed so far. Trying to find some limits but coming up short. Been playing New World in 4k for the last hour with the clocks fully turned up and no stability issues. Temps have stayed below 63 despite a solid clock of 2970 that isn’t wavering. Interested to see what this does under water but the factory air cooling really surprised me.











Sent from my iPhone using Tapatalk Pro


----------



## bmagnien

Anyone have the Asus TUF (non OC) and could confirm the max power limit?


----------



## Arizor

bmagnien said:


> Anyone have the Asus TUF (non OC) and could confirm the max power limit?


Just installed and tested my TUF quickly. Easy clock to 2900mhz, +1500 on the memory. I'll check out power limits and let you know soon, but MSI AB lets you push power up to 133%.


----------



## 8472

Got the shipping notification from Newegg. I don't think UPS actually has the package yet, so I probably won't get it until Friday.


----------



## bmagnien

Arizor said:


> Just installed and tested my TUF quickly. Easy clock to 2900mhz, +1500 on the memory. I'll check out power limits and let you know soon, but MSI AB lets you push power up to 133%.
> 
> View attachment 2575707


Thank you! This is the non-OC TUF, right? If you go into GPU-Z you should be able to see the max power limit under the Advanced tab, then the NVIDIA BIOS dropdown. But 133% of 450 = 600, so it sounds like that's confirmed, which is awesome.
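The slider math is straightforward if you know the board's reference TGP; a quick sanity check (450W is the Nvidia stock limit discussed above, any other reference value is an assumption):

```python
# Convert a power-limit slider percentage into watts,
# given the card's reference TGP (its 100% value).
def slider_to_watts(reference_tgp_w, slider_pct):
    return reference_tgp_w * slider_pct / 100.0

print(slider_to_watts(450, 133))  # 598.5 -> ~600 W at 133%
print(slider_to_watts(450, 106))  # 477.0 -> ~480 W at 106%
```

The same formula explains the Strix reports elsewhere in the thread: a 500W default at a 90% slider is exactly the 450W stock limit.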


----------



## Arizor

bmagnien said:


> Thank you! This is the non oc tuf right? If you go into gpuz you should be able to see the max power limit if you go to advanced tab, then click the nvidia bios drop down. But 133% of 450 = 600 so sounds like that’s confirmed which is awesome


Yes mate, non-OC. Just got GPU-Z going and can confirm it at least breaches 500W


----------



## pewpewlazer

So much for the claims of cards stockpiled in warehouses long ahead of launch, claims of excessive stock because Nvidia overestimated demand for this generation based on mining (or whatever people claim), claims of demand being back to normal because "mining is dead", claims of no one wanting these things because they're priced too high, etc...

Here's to hoping us normal folk can purchase one of these things at MSRP anytime soon without bots, scripts, secret insider BestBuy links, camping out at Microcenter overnight, pure luck, etc.

I'd have no problem paying 2x MSRP up front to 'back-order' a card, with half my money being refunded upon shipment of the card. Make the entire purchase non-refundable if you cancel the order. That would be SO much better than playing this "wait for a re-stock" game.


----------



## sakete

Let's see how it plays out. It could very well be that this time they'll have stock more frequently. It's also their top of the line card, so they'll have more production capacity allocated to the 4080 series coming out next month instead of the 4090 of which they'll just move a lot less. 

If the 4080 launch is also a mess, then by all means Nvidia is full of it.


----------



## th3illusiveman

pewpewlazer said:


> So much for the claims of cards stockpiled in warehouses long ahead of launch, claims of excessive stock because Nvidia overestimated demand for this generation based on mining (or whatever people claim), claims of demand being back to normal because "mining is dead", claims of no one wanting these things because they're priced too high, etc...
> 
> Here's to hoping us normal folk can purchase one of these things at MSRP anytime soon without bots, scripts, secret insider BestBuy links, camping out at Microcenter overnight, pure luck, etc.
> 
> I'd have no problem paying 2x MSRP up front to 'back-order' a card, with half my money being refunded upon shipment of the card. Make the entire purchase non-refundable if you cancel the order. That would be SO much better than playing this "wait for a re-stock" game.


lol, so they just keep your $3K cause you needed to cancel...? Consumers devaluing themselves so much is what leads to these crappy launches and marketing antics from Nvidia. Why yield to them? THEY want YOUR money. Demand better, not less.


----------



## J7SC

sakete said:


> Let's see how it plays out. It could very well be that this time they'll have stock more frequently. It's also their top of the line card, so they'll have more production capacity allocated to the 4080 series coming out next month instead of the 4090 of which they'll just move a lot less.
> 
> If the 4080 launch is also a mess, then by all means Nvidia is full of it.


...yeah, we're barely through day one of sales, so let's see. Also, retailers got stuck with a lot of RTX3K stock they're still trying to move through the channel (3090 Ti anyone?). It is understandable that they want to see what early-adopter demand is before ordering higher numbers into inventory for something that is this pricey. Inventory financing is becoming more expensive as well.


----------



## mattskiiau

Just received my Msi Gaming X Trio.
Bit of a 1st world problem but disappointed I can only increase power limit by 6% where lower end models from other AIBs get the full 33%.

*Stock Boosting results:
TimeSpy*: 2805Mhz
*Port Royal*: 2805Mhz
*Board Power Draw*: 420W
*PerfCap*: VRel


----------



## Alemancio

My god so many crybabies ITT not getting their 4090 on launch day... Cant you wait another week(s) for stock to stabilize?


----------



## RaMsiTo

mattskiiau said:


> Just received my Msi Gaming X Trio.
> Bit of a 1st world problem but disappointed I can only increase power limit by 6% where lower end models from other AIBs get the full 33%.
> 
> *Stock Boosting results:
> TimeSpy*: 2805Mhz
> *Port Royal*: 2805Mhz
> *Board Power Draw*: 420W
> *PerfCap*: VRel


The Trio includes a 3x8-pin adapter, so the 450W limit was to be expected.


----------



## DokoBG

450W is stock mode. If he can do a 6% power limit increase, then that would make the MSI Gaming Trio about a 480W power limit card with that bios.


----------



## mattskiiau

mattskiiau said:


> Just received my Msi Gaming X Trio.
> Bit of a 1st world problem but disappointed I can only increase power limit by 6% where lower end models from other AIBs get the full 33%.
> 
> *Stock Boosting results:
> TimeSpy*: 2805Mhz
> *Port Royal*: 2805Mhz
> *Board Power Draw*: 420W
> *PerfCap*: VRel


*Quick OC result:

Voltage:* +100
*Core: *+150
*Power Limit:* 106%
*Boost:* 3015Mhz

Temps are decent, 65c HotSpot, 58c GPU.
Hitting PerfCap PWR and sometimes VREL. 
Topping out at 450W. I'll need to find a 4x8pin adapter and wait until a 600W BIOS is working, but considering I'm hitting VREL, it's probably not going to improve much with a 600W BIOS?


----------



## Carillo

Hey. I got the Gigabyte Windforce model yesterday here in Norway, the only card I was able to purchase. All gone in seconds. I know the Gaming OC is 600 watt, but not sure about the Windforce. I will probably receive the card next week, since the nearest retailer is 7 hours away  Anyone else got the Windforce?


----------



## 8472

Just got an email from cablemod regarding their 12VHPWR adapter.

"One thing of note here is that due to the cable bend, you’re going to need roughly 4 to 4.5cm of space from your graphics card to your side panel in order to properly install this cable. Please check this measurement with your PC’s case." 

I can see why the PCIE 4.0 vertical brackets for the lian li 011 dynamic xl are sold out.


----------



## Nizzen

dr/owned said:


> let’s see who the first to shunt mod it.
> C’mon ya crazy bastards, entertain me while I wait 2 weeks for Amazon to get mine out the door! (Usually they deliver in a week though even when it says some far out days, but my TUF 3090 did take 6 weeks to ship)


Looks like powerlimit isn't the bottleneck. Low "voltage" is.


----------



## Madness11

Guys got the palit game rock OC 4090 , and this is normal ???




Loud and sound like "uuuuuuuu" or it's ok for palit ?


----------



## Arizor

Madness11 said:


> Guys got the palit game rock OC 4090 , and this is normal ???
> 
> 
> 
> 
> Loud and sound like "uuuuuuuu" or it's ok for palit ?


sounds a bit like really loud coil whine or similar, doesn’t sound right to me, or at least my TUF doesn’t sound like that.


----------



## PLATOON TEKK

J7SC said:


> @PLATOON TEKK - DerBauer's vid on the 4090 Aorus Master uncovered a potential setup for SLI/NVLink (see timestamped vid) - for the 4090 Ti which likely might share PCBs ?


Thanks for the stamp. Really interesting, seems the rumors I heard are far more accurate than even I believed.

Also, judging by how the galax team went about breaking the ln2 WR, as I mentioned on the FIRST page of this post, VOLTAGE is crucial this time around. The importance of using an EVC (if it works) or similar (HOF voltage control) will potentially make or break significant OCs imo.


----------



## Hulk1988

Could you share your MAX memory OC, please? I would like to understand what are the average good and bad values.

My OC STRIX is making +1600mhz on MEM. Did not test further yet. Seems to be a good chip.


----------



## Carillo

From what I've seen, it seems like the cards clock pretty close to each other because they're all voltage limited. If Elmor finds a solution to that problem, we will probably find the silicon limitations.


----------



## LunaP

Looks like BestBuy still has stuff going up and down, so I'm probably gonna keep an eye out tonight/early tomorrow to see if they do a second run, though it might be Friday for the next set.

I've got a Case Labs, but I might have to move my tube res to fit this, looking at the pics of people's 3080s next to the 4090. Since I have a 2080 Ti, idk if it's longer or shorter than the 3080, and I've got like 4 inches of clearance (6 at best).


----------



## RaMsiTo

*NVIDIA NVFlash 5.763.0* - someone try it?

NVIDIA NVFlash (5.792.0) Download - NVIDIA NVFlash is used to flash the graphics card BIOS on Ampere, Turing, Pascal and all older NVIDIA cards.

www.techpowerup.com


----------



## braincracking

Joining the club with a 4090 zotac amp. Waterblock will arrive in the weekend, then it's time for some fun


----------



## heptilion

Got myself the STRIX. Can anyone confirm if their power limit is at 90%? Also boosting to 2760MHz


----------



## domdtxdissar

Nvidia are smart 








GeForce RTX 4090 Game Ready Driver: Beyond Fast GPU Unleashed, First DLSS 3 Games & DirectX 12 Performance Improvements


Unlock the full potential of the new, beyond fast GeForce RTX 4090, increase DirectX 12 performance on all GeForce RTX GPUs, get support for the latest DLSS games, including the first DLSS 3 titles, and get Game Ready for the upcoming launch of A Plague Tale: Requiem, Gotham Knights, Scorn...



www.nvidia.com






> These improvements include shader compilation optimization, reduced CPU overhead, and Resizable BAR profiles for _Forza Horizon 5_ and _F1Ⓡ 22_. Many of these optimizations are most beneficial in CPU-bound scenarios, enabling your graphics card to be utilized to a greater degree. In _Assassin’s Creed Valhalla_, _Cyberpunk 2077_, and _Forza Horizon 5_, performance improved by up to 24%


- NV sandbagged themselves. Basically launch new card, ensure it gets the performance improvement drivers but the old cards don't, then when reviews are rolled out launch updated drivers for the old cards.

Wonder how much these new drivers narrow the gap between the 4090 and the 3090Ti.


----------



## Madness11

Arizor said:


> sounds a bit like really loud coil whine or similar, doesn’t sound right to me, or at least my TUF doesn’t sound like that.


Share some video pls , how ur card work . 30% , 60, 80, 100  please


----------



## Glottis

heptilion said:


> Got myself the STRIX. Can anyone confirm if there powelimit is at 90%? Also boosting to 2760mhz
> 
> View attachment 2575740
> 
> 
> 
> View attachment 2575738
> View attachment 2575739


Strix has 500W as default power limit, so 90% represents Nvidia's stock 450W power limit.


----------



## Arizor

If anyone has their voltage greyed out in MSI AB I can confirm, at least for ASUS, their new GPU tweak 3 allows voltage to go up to 1.1, which enables over 3ghz on my TUF. Though I’d say diminishing returns past a 2.9ghz OC for any games.


----------



## LunaP

damn, even amazon's full of it now.


----------



## Arizor

Madness11 said:


> Share some video pls , how ur card work . 30% , 60, 80, 100  please


Here you go mate, squelching noises courtesy of water pump fyi 😂


----------



## Madness11

Arizor said:


> Here you go mate, squelching noises courtesy of water pump fyi 😂


Yep, yeah yours is quiet :) The Palit is just noisy, but I guess it's all Palit models


----------



## Blameless

domdtxdissar said:


> Wonder how much these new drivers narrow the gap between the 4090 and the 3090Ti.


I benched my 3080 with them. There is an uplift, but it's quite small...low single digit percentages.


----------



## Rbk_3

I am having a really strange issue and it is driving me nuts. 

The RTX features are not working on my Gigabyte Gaming OC 4090. Obviously have the new driver installed. 

So I did a fresh windows install on a Samsung 980 Pro, which is PCIe Gen 4, and the RTX features do not work, but they work on my older 970 Evo that is PCIe Gen 3. I wonder if it is related to that?

No games have DLSS or Raytracing settings. 3DMark Port Royal says my PC is unable to run this test. You need a graphics card with drivers that support Direct X Raytracing to run this test.

I tried reinstalling Windows again with no luck. I posted it on the Nvidia Subreddit and someone replied they had the same issue but it got locked right away so I don't know if this is a widespread issue.










The RTX features are not working on my Gigabyte Gaming...






www.overclock.net


----------



## AvengedRobix

sblantipodi said:


> But how did you make it to work?
> Did you have similar problem of mine of pc rebooting it self when loading windows as soon as you get into the desktop?
> 
> If I uninstall the Nvidia driver the problem vanish but I have a 2d card in that case.damn


Have you tried switching the PCIe gen in the BIOS? 3, 4 and 5 rather than auto? Enabled/disabled ReBAR?


----------



## Rei86

Finally about the stupid adapters


----------



## domdtxdissar

Blameless said:


> I benched my 3080 with them. There is an uplift, but it's quite small...low single digit percentages.


Depend on what games that are tested i guess


----------



## Blameless

domdtxdissar said:


> Depend on what games that are tested i guess


I did test with Cyberpunk, which was one of the games on their list, but didn't see quite the same sort of gains. I am running at GPU-bound settings with a 5800X3D and the SMT patch though... the overhead improvements might not mean as much on my setup as on their test system with its extremely vague "32GB RAM".



Rei86 said:


> Finally about the stupid adapters


Also: Myth Busted: This is how NVIDIA’s 4-way adapter for the 12VHPWR port of the GeForce RTX 4090 really works! | igor'sLAB

Anyway, I was always going to just run two 8-pins from one of my existing PSUs directly into a 12VHPWR connector with the sense pins grounded. Any non-defective/undamaged PSU and six 16AWG wire pairs can handle 600w.
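The current math backs that up. A back-of-the-envelope check, assuming a nominal 12 V rail and the load split evenly across six conductors:

```python
# Rough current figures for 600 W delivered at 12 V over six wire pairs,
# as in the two-8-pin-to-12VHPWR setup described above (assumed nominal
# 12 V and an even split; real rails and splits vary a bit).
watts, volts, pairs = 600, 12.0, 6
total_amps = watts / volts            # 50.0 A total on the 12 V rail
amps_per_wire = total_amps / pairs    # ~8.33 A per 12 V conductor
print(total_amps, round(amps_per_wire, 2))
```

16AWG copper is commonly rated for roughly 10A or more per conductor (exact ampacity depends on the standard and bundling), so ~8.3A per wire leaves some margin.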


----------



## mattskiiau

Rei86 said:


> Finally about the stupid adapters


Wait so... if I buy a 4x8pin adapter for my Gaming X Trio, which only came with a 3x8pin one, it'll magically be 600W now? Regardless of BIOS?
Or am I mistaken here?


----------



## Nico67

heptilion said:


> Got myself the STRIX. Can anyone confirm if there powelimit is at 90%? Also boosting to 2760mhz


Yep, I'm seeing 90% in AB. Might even be AB that's dropping it to that; not that I have seen it hit 450W yet.



Blameless said:


> Anyway, I was always going to just run two 8-pins from one of my existing PSUs directly into a 12VHPWR connector with the sense pins grounded. Any non-defective/undamaged PSU and six 16AWG wire pairs can handle 600w.


I believe the 12VHPWR cable uses 16AWG wire, so a 2x8pin cable also using 16AWG plugged directly into the PSU should be fine. I think ASUS and Seasonic have cables like that for newer PSUs.
The three and four connector adapters are there because most PCIe 6 and 8pin cables are 18AWG or worse.


----------



## bmagnien

domdtxdissar said:


> Nvidia are smart
> 
> 
> 
> 
> 
> 
> 
> 
> GeForce RTX 4090 Game Ready Driver: Beyond Fast GPU Unleashed, First DLSS 3 Games & DirectX 12 Performance Improvements
> 
> 
> Unlock the full potential of the new, beyond fast GeForce RTX 4090, increase DirectX 12 performance on all GeForce RTX GPUs, get support for the latest DLSS games, including the first DLSS 3 titles, and get Game Ready for the upcoming launch of A Plague Tale: Requiem, Gotham Knights, Scorn...
> 
> 
> 
> www.nvidia.com
> 
> 
> 
> 
> 
> - NV sandbagged themselves. Basically launch new card, ensure it gets the performance improvement drivers but the old cards don't, then when reviews are rolled out launch updated drivers for the old cards.
> 
> Wonder how much these new drivers narrow the gap between the 4090 and the 3090Ti.


New drivers took the Cyberpunk benchmark at maxed out settings, 4K, DLSS Performance from 60 to 72fps avg on my 3090. That's a lot.


----------



## LunaP

Just saw this show up on the 4090 on best buy


----------



## Thebc2

bmagnien said:


> Thank you! This is the non oc tuf right? If you go into gpuz you should be able to see the max power limit if you go to advanced tab, then click the nvidia bios drop down. But 133% of 450 = 600 so sounds like that’s confirmed which is awesome


Mine is an OC TUF, never saw power actually go above 475 yesterday, despite the slider being at 133%. 


Sent from my iPhone using Tapatalk Pro


----------



## bmagnien

Thebc2 said:


> Mine is an OC TUF, never saw power actually go above 475 yesterday, despite the slider being at 133%.
> 
> 
> Sent from my iPhone using Tapatalk Pro


TY! Yeah, seems like most gaming loads won't. Run Furmark and I bet it tops 600W.


----------



## alitayyab

sblantipodi said:


> I used the bundled 4 cables splitter and the Corsair 12vhpwr special cable for my power supply. Same crap.
> 
> Is the GPU broken?


1. Try disabling or setting rebar to auto in the bios, if not already done.
2. I would honestly consider the PSU as the first culprit. Unless its easy for you to RMA and get a quick replacement on the GPU, try borrowing a PSU to test first.


----------



## bmagnien

LunaP said:


> Just saw this show up on the 4090 on best buy
> 
> View attachment 2575792


and now that badge has been removed. They don't have their act together at all.


----------



## LunaP

bmagnien said:


> and now that badge has been removed. They don't have their act together at all.


Yeah that was at most 10 minutes of it on *** is going on lol


----------



## lordkahless

LunaP said:


> Yeah that was at most 10 minutes of it on *** is going on lol


Yeah I have exclusive access. Tried to make an order. Said unavailable nearby. Did the chat. I said why can't you just ship it??? They said you can pick it up at your store on the 18th. I said great! But I can't make an order. They said we will make one for you. I verified my details. They responded saying we are sorry, we are experiencing system issues and are unable to make an order for you. Please try back later. Do they even know what they are doing??


----------



## LunaP

lordkahless said:


> Yeah I have exclusive access. Tried to make an order. Said unavailable nearby. Did the chat. I said why can't you just ship it??? They said you can pick it up at your store on the 18th. I said great! But I can't make an order. They said we will make one for you. I verified my details. They responded saying we are sorry, we are experiencing system issues and are unable to make an order for you. Please try back later. Do they even know what they are doing??


I sincerely believe the virtual agents are just overseas people with no power that companies hire as scapegoats for this stuff, same as calling a bank anytime after noon, u'll get someone w/ no authority or anything despite being a manager, it sucks.


----------



## BigMack70

So are FE cards not possible to buy anymore now that they are locked behind Best Buy's crappy store and policies?


----------



## LunaP

The Q&A is piling up with complaints on that matter and no one's getting answers other than "scalpers took advantage of a feature we're leaving on." I'm sure they'll have more in a week or 2, but clearly no one (during day or evening) knows or has any info, and there are people defending it in the comments with huge walls of text instead of just giving a simple yes/no.


----------



## Rei86

If what the other poster said was true and they got, got. They might be doing a stop ship order on all the ones that has yet to leave their warehouses and sending out cancel notifications, asking customers if they want one or none if they have been bought with the same billing address and email.
So they might be seeing in the background ones come up as available but going in and out of inventory.

And yes they would care if they have a deal with nVidia about street release price and they could be fined a penalty for early release depending on the contract. But if the margins on these things are slim and since nVidia likes to use them as a dumping ground so they don't have to deal with the logistics of running an online store... well dunno what's in their contract, probably just a stern wording.


----------



## LunaP

Rei86 said:


> If what the other poster said was true and they got, got. They might be doing a stop ship order on all the ones that has yet to leave their warehouses and sending out cancel notifications, asking customers if they want one or none if they have been bought with the same billing address and email.
> So they might be seeing in the background ones come up as available but going in and out of inventory.
> 
> And yes they would care if they have a deal with nVidia about street release price and they could be fined a penalty for early release depending on the contract. But if the margins on these things are slim and since nVidia likes to use them as a dumping ground so they don't have to deal with the logistics of running an online store... well dunno what's in their contract, probably just a stern wording.


I pray for the former as people put up pre orders on ebay, and would be hilarious to see them have to refund people cuz they got reprimanded for buying 5 vs 1. Ah to live in a functional society lmao, one can dream.

The stores in the valley here basically said they don't ever carry the high end cards so would have to check places like CA or NY only sadly, or hope they come back in stock online, seems rather silly but they're obviously aware of what they're doing so moot point.


----------



## lordkahless

Wow these people. I did the chat again and I said please try to add it to cart for me. Gave my details again. They said you can now pick it up on the 21st at your store. I said great, so it's placed? They sent me a link authorizing the charge to my card. The link said $28.67. I said what is 28.67?? This is SKU 6521430 and it's $1599. They responded we are sorry we are experiencing system issues. I said keep trying!


----------



## yzonker

domdtxdissar said:


> Nvidia are smart
> 
> 
> 
> 
> 
> 
> 
> 
> GeForce RTX 4090 Game Ready Driver: Beyond Fast GPU Unleashed, First DLSS 3 Games & DirectX 12 Performance Improvements
> 
> 
> Unlock the full potential of the new, beyond fast GeForce RTX 4090, increase DirectX 12 performance on all GeForce RTX GPUs, get support for the latest DLSS games, including the first DLSS 3 titles, and get Game Ready for the upcoming launch of A Plague Tale: Requiem, Gotham Knights, Scorn...
> 
> 
> 
> www.nvidia.com
> 
> 
> 
> 
> 
> - NV sandbagged themselves. Basically launch new card, ensure it gets the performance improvement drivers but the old cards don't, then when reviews are rolled out launch updated drivers for the old cards.
> 
> Wonder how much these new drivers narrow the gap between the 4090 and the 3090Ti.


I got about 10% in CP2077 on my 3090. Max settings, 4k, RTX on, DLSS Quality.


----------



## LunaP

lordkahless said:


> Wow these people, I did the chat again and I said please try to add it to cart for me. Gave my details again. They said you can now pick it up on the 21st at your store. I said great so its placed? They sent me a link authorizing the charge to my card. The link said $28.67. I said what is 28.67?? This is Sku 6521430 and its $1599. They responded we are sorry we are experiencing system issues. I said keep trying!


Its not 100% out of the question of it being an error in your favor though ....lol some risks are worth it


----------



## Rei86

LunaP said:


> I pray for the former as people put up pre orders on ebay, and would be hilarious to see them have to refund people cuz they got reprimanded for buying 5 vs 1. Ah to live in a functional society lmao, one can dream.
> 
> The stores in the valley here basically said they don't ever carry the high end cards so would have to check places like CA or NY only sadly, or hope they come back in stock online, seems rather silly but they're obviously aware of what they're doing so moot point.


Yeah for the RTX 30 series they didn't ship them during the height of the pandemic. My friend who wanted one at 699.99 for a FE model actually took the day off and drove out to city that BB actually shipped them to and stood outside for one.
These 4090s are also pick up in store too from what I can tell.

Either way, I'm sure another person with a bot will get them once they do go live again.

Hell, I think one time BB did cancel all the orders and redid the sale a week later with a crap ton of stupid captchas.


----------



## lordkahless

LunaP said:


> Its not 100% out of the question of it being an error in your favor though ....lol some risks are worth it


Oh the link was invalid for sure when I attempted to make that purchase lol.
The person on chat is attempting to place the order again for me. I told them to keep trying


----------



## wooden

I managed to order both a Gainward 4090 Phantom and an MSI 4090 Trio X, but one of them is going to be returned.

I noticed the Gainward one seems to come with a 4x 8-_pin_ adapter, while the MSI only has 3. The Gainward also has a vapor chamber, but I'm probably going to replace the cooler with a waterblock in a few months. Couldn't find much information about either card. Any ideas which card to keep? The Gainward was also 200 USD cheaper.


----------



## BigMack70

Seeing as how I bought Titan X cards, a 1080 Ti, 2080 Ti, and 3090 all launch day from Nvidia's store directly, I expected I'd get one of these GFE invites to buy a FE card from Best Buy, but nope. 

I really think it was a mistake for them to make Best Buy their sole supplier of FE cards. Newegg is so much easier to work with (usually).


----------



## Blameless

bmagnien said:


> New drivers took Cyperpunk 4k benchmark at maxed out settings, 4k, DLSS performance from 60 to 72fps avg on my 3090. That's a lot.


What CPU/memory?


----------



## bmagnien

Blameless said:


> What CPU/memory?


5800x3d, 32gb c14 3800 bdie. similar specs to this run: I scored 6 411 in Speed Way


----------



## PharmingInStyle

I could get a 4090, but it would come pre-installed in a prebuilt from Origin. I know prebuilts aren't popular here, but can anyone give advice on Origin? I'm hoping they're an exception here and are dependable for a high-end prebuilt.

I could build my own, but Origin has some good reviews. Anyway, after all this, odds are that if I select the custom features I want and hit the buy button, Origin will flash the dreaded message that the order is pending awaiting availability of the 4090 and they will notify me when it arrives, ugh.


----------



## Blameless

bmagnien said:


> 5800x3d, 32gb c14 3800 bdie.


This is very similar to my setup. Looks like I need to rerun the tests.


----------



## bmagnien

NVIDIA is testing "Verified Priority Access" for GeForce RTX 4090 Founders Edition purchases - VideoCardz.com


NVIDIA is copying EVGA Elite Priority Access? If you live in a country that takes part in NVIDIA’s trial program, then you might be getting an invitation to purchase RTX 4090 graphics cards soon. NVIDIA is testing a new program called “Verified Priority Access”. It gives gamers and content...




videocardz.com





heck of a way to force us all to reinstall geforce experience lol


----------



## parky.fp

Anyone else with an MSI X570-A pro experiencing an issue getting a display signal? VGA light is red with an ASUS TUF 4090 - system boots fine with a 3090.

Bios updated to latest,1000w PSU


----------



## sakete

I like how this 4090 isn't even bundled with a game, just with a lame free 1-month trial of Adobe Creative Cloud (which you can also get when you don't buy a GPU).


----------



## sblantipodi

hi all, is there anyone with an Asus Z690 and a 4090?

we are having problems here... 








RTX4090 and boot loop


Hi all, I have a Corsair HX1200i and an MSI Suprim X RTX 4090. PC works well until I have no nvidia driver, as soon as I install nvidia driver the PC reboots abruptly, then windows starts, shows the desktop and reboot, this happen in a loop. If I uninstall nvidia drivers PC works well. Is my...




www.overclock.net





is there any info on the problem?

thanks!


----------



## sblantipodi

parky.fp said:


> Anyone else with an MSI X570-A pro experiencing an issue getting a display signal? VGA light is red with an ASUS TUF 4090 - system boots fine with a 3090.
> 
> Bios updated to latest,1000w PSU


we have a similar issue here:








RTX4090 and boot loop


Hi all, I have a Corsair HX1200i and an MSI Suprim X RTX 4090. PC works well until I have no nvidia driver, as soon as I install nvidia driver the PC reboots abruptly, then windows starts, shows the desktop and reboot, this happen in a loop. If I uninstall nvidia drivers PC works well. Is my...




www.overclock.net


----------



## sblantipodi

alitayyab said:


> 1. Try disabling or setting rebar to auto in the bios, if not already done.
> 2. I would honestly consider the PSU as the first culprit. Unless its easy for you to RMA and get a quick replacement on the GPU, try borrowing a PSU to test first.


I disabled rebar but it didn't help. I will try a new 1500W PSU tomorrow, but other users in the other thread have tried different PSUs and it didn't work.


----------



## Benni231990

Holy **** look at this 






If this is True UV is DEAD on 4090


----------



## BigMack70

Benni231990 said:


> Holy **** look at this
> 
> 
> 
> 
> 
> 
> If this is True UV is DEAD on 4090


The issue is that the 4090 is voltage limited, not power limited, so undervolting is a dumb idea. UV was good on 30 series because it was power limited.


----------



## slayer6288

Anyone try the updated nvflash to see if you can get a 600W BIOS to work on a 3-pin adapter? Seems like the sense pins are grounded properly, so it should theoretically work.


----------



## Blameless

sakete said:


> I like how this 4090 isn't even bundled with a game, just with a lame free 1-month trial of Adobe Creative Cloud (which you can also get when you don't buy a GPU).


I stopped caring about software bundles when they stopped coming on physical media. Even if I were using the distribution clients the keys are usually for, there tends to be arbitrary barriers to actually using them that mean I wouldn't bother anyway.



Benni231990 said:


> If this is True UV is DEAD on 4090


This is not unlike what happens on AMD CPUs and GPUs. Can still undervolt them, just have to keep things like clock stretching in mind.

Even in that video, there were significant efficiency gains from undervolting, even if the behavior isn't as simplistic as it was with prior NVIDIA generations.



BigMack70 said:


> The issue is that the 4090 is voltage limited, not power limited, so undervolting is a dumb idea.


Unless one is still trying to reduce power/heat/noise.


----------



## BigMack70

Blameless said:


> Unless one is still trying to reduce power/heat/noise.


Just reduce the power target. Der8auer already showed you don't lose much performance knocking 100-150W off the card with a simple power-target adjustment.
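A back-of-the-envelope sketch of the efficiency argument, using the rough numbers from this discussion (~450W stock, ~5% performance lost for a ~150W cut). These are the thread's figures, not measurements of any particular card:

```python
# Rough perf-per-watt math for a power-target cut on a 450 W card.
# Figures are the approximate ones discussed in this thread, not benchmarks.

def perf_per_watt(perf: float, watts: float) -> float:
    """Relative performance divided by board power."""
    return perf / watts

stock  = perf_per_watt(1.00, 450)   # baseline: 100% perf at 450 W
capped = perf_per_watt(0.95, 300)   # ~95% perf at a 300 W power target

gain = capped / stock - 1           # fractional efficiency improvement
print(f"perf/W gain from the cap: {gain:.1%}")  # ~42.5%
```

On actual hardware the cap is set with `nvidia-smi -pl <watts>` or the vendor tool's power-limit slider, and the real performance loss varies by workload.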


----------



## Sir Beregond

Blameless said:


> Unless one is still trying to reduce power/heat/noise.


Exactly. Sounded like you could really drop it down some and not lose _that_ much performance.


----------



## cletus-cassidy

bmagnien said:


> NVIDIA is testing "Verified Priority Access" for GeForce RTX 4090 Founders Edition purchases - VideoCardz.com
> 
> 
> NVIDIA is copying EVGA Elite Priority Access? If you live in a country that takes part in NVIDIA’s trial program, then you might be getting an invitation to purchase RTX 4090 graphics cards soon. NVIDIA is testing a new program called “Verified Priority Access”. It gives gamers and content...
> 
> 
> 
> 
> videocardz.com
> 
> 
> 
> 
> 
> heck of a way to force us all to reinstall geforce experience lol
> 
> View attachment 2575815


Did it work?


----------



## sakete

Hmm, still getting "NVflash: no NVIDIA adapters found" with the latest 5.763 on my Asus TUF OC.


----------



## bmagnien

Sir Beregond said:


> Exactly. Sounded like you could really drop it down some and not lose _that_ much performance.


Dropping the power limit won't affect pure raster loads as much as fully loaded RT and Tensor core loads. I'd take the '-150W for -5% performance loss' with a big grain of salt, because that won't be the case in heavier loads.


----------



## sakete

bmagnien said:


> Dropping the power limit won't affect pure raster loads as much as fully loaded RT and Tensor core loads. I'd take the '-150W for -5% performance loss' with a big grain of salt, because that won't be the case in heavier loads.


Yeah, it'll need to be studied on a case by case basis.


----------



## bmagnien

cletus-cassidy said:


> Did it work?


i don't think any actual invites have gone out. this is just a PR announcement of their plan, but i would make sure you have an active nvidia account and are logged in to an installed geforce experience if you're hoping to get picked.

i feel like NV is working with BB before rolling this out because clearly something ****ed up in their supply stream. despite rumors that bots breached the system early, i'm not seeing much evidence that BB really had any US stock for FE at launch. there are hardly any FE listings on ebay, and the few there are mostly use the same photo with the same information, so likely duplicate accounts. if bots really did buy up a huge chunk of supply, you'd see WAY more listings.

hopefully NV and BB work their **** out and have a proper launch of US FE inventory, but who knows when that will be.


----------



## mirkendargen

If Nvidia are smart they'll send an invite to everyone with a 3090/3090ti, because they know they have no chance of selling those people a 3090/3090ti.


----------



## cletus-cassidy

bmagnien said:


> i dont think any actual invites have gone out. this is just a PR announcement of their plan, but i would make sure you have an active nvidia account and are logged in to an installed geforce experience if you're hoping to get picked.
> 
> i feel like NV is working with BB before rolling this out because clearly something ****ed up in their supply stream. Despite rumors that bots breached the system early, im not seeing much evidence that BB really had any US stock for FE at launch. There's hardly any listings for FE on ebay and the few there are mostly with the same photo with the same information so like duplicate accounts. If bots really did buy up a huge chunk of supply, you'd see WAY more listings.
> 
> hopefully NV and BB work their **** out and have a proper launch of US FE inventory, but who knows when that will be.


Ah I see. On my phone it looked like you took a screen shot -- agree with your take here based on the evidence we have.


----------



## fEJK

alitayyab said:


> 1. Try disabling or setting rebar to auto in the bios, if not already done.
> 2. I would honestly consider the PSU as the first culprit. Unless its easy for you to RMA and get a quick replacement on the GPU, try borrowing a PSU to test first.


I have the same issue and I've tried two PSU's - Corsair RM1000X and Thermaltake GF3 (new one with a dedicated 12VHPWR port) and no luck. Tried all options with rebar.


----------



## mirkendargen

fEJK said:


> I have the same issue and I've tried two PSU's - Corsair RM1000X and Thermaltake GF3 (new one with a dedicated 12VHPWR port) and no luck. Tried all options with rebar.


X670? BIOS updated to the latest? I saw like every X670E board got a BIOS update a few days ago with similar language to this:


----------



## sirneb

bmagnien said:


> i dont think any actual invites have gone out. this is just a PR announcement of their plan, but i would make sure you have an active nvidia account and are logged in to an installed geforce experience if you're hoping to get picked.
> 
> i feel like NV is working with BB before rolling this out because clearly something ****ed up in their supply stream. Despite rumors that bots breached the system early, im not seeing much evidence that BB really had any US stock for FE at launch. There's hardly any listings for FE on ebay and the few there are mostly with the same photo with the same information so like duplicate accounts. If bots really did buy up a huge chunk of supply, you'd see WAY more listings.
> 
> hopefully NV and BB work their **** out and have a proper launch of US FE inventory, but who knows when that will be.


Some people on reddit claimed that they got an email and a notification already. It doesn't sound like they are lying.


----------



## Baasha

I bought 4x RTX 3090 Ti's and 6x RTX 3090s and I don't have their "invite" on GFE.  

I despise this whole thing - I have always bought the cards on launch day from Nvidia's site directly and/or NewEgg/EVGA.

Ever since the 30 series launch, I have not been able to purchase them for at least a month after launch - the sole exception being the RTX 3090 Ti since most people didn't seem to want them(?). 

I want to get an Asus RoG Strix 4090 OC and the RTX 4090 FE for two of my rigs but I have not been able to get either one yet.

To add insult to injury, I've seen scalpers on eBay - one guy in particular - sell SEVEN RTX 4090 FE for > $2,800 each! How tf does one get SEVEN GPUs while we can't get even one?

This is so unbelievably frustrating.


----------



## ZealotKi11er

So the clock Nvidia reports is fake now, not the actual clock. Nice.


----------



## bmagnien

sirneb said:


> Some people on reddit claimed that they got an email and a notification already. It doesn't sound like they are lying.


Yeah just read the Reddit posts. They're saying that they started receiving notifications this morning, which is what today's PR said, and that's great news IMO. My comment was referring to the supposed BB FE launch yesterday, which either didn't happen in the US, or was comically small, so any of us feeling like the bots stole a ton of FEs from yesterday can feel....slightly better I suppose.


----------



## J7SC

...scalping is normal if despicable; just try to get some baseball, hockey or concert tickets during the playoffs. The trick is not falling for that crap with emotional decision-making. Below are scalper ads for the 3090 Strix back in the 'price-bulge' days...


----------



## Mad Pistol

I am curious on those "invites" to purchase a 4090 FE. I purchased a 3080 FE from Best Buy on literally the day they started selling them, and yet, I have not gotten an invite from Nvidia.

Is it possible that these invites are only going out to 3090/Ti customers?


----------



## yzonker

Has anyone tried duplicating the 666W max power draw GN saw with the FE? Just wondering if the FE might actually be underrated on the PL, and whether the AIBs can also go that high. 

It's just odd that even Kombustor would be able to exceed the PL by that much.


----------



## sirneb

bmagnien said:


> Yeah just read the Reddit posts. They're saying that they started receiving notifications this morning, which is what today's PR said, and that's great news IMO. My comment was referring to the supposed BB FE launch yesterday, which either didn't happen in the US, or was comically small, so any of us feeling like the bots stole a ton of FEs from yesterday can feel....slightly better I suppose.


I agree this is better, BUT if we really think about the reality of things, these are really expensive cards. Only a small percentage of users are actually looking to make this purchase. By just choosing randomly out of all GeForce Experience users, I'd imagine the majority of the invites are just going to be wasted (or the purchase links are likely just going to be sold).


----------



## bmagnien

yzonker said:


> Has anyone tried duplicating the 666w max power draw GN saw with the FE? Just wondering if the FE might actually be underrated on the PL, but can the AIBs also go that high.
> 
> It's just odd that even Kombustor would be able to exceed the PL by that much.


I haven't seen anyone in this thread with an FE in hand, or even one purchased and waiting.


----------



## bmagnien

J7SC said:


> ...scalping is normal if despicable; just try to get some baseball, hockey or concert tickets during the playoffs. The trick is not falling for that crap with emotional decision-making. Below are scalper ads for the 3090 Strix back in the 'price-bulge' days...
> View attachment 2575831


Well comparatively this seems like a steal, over $300 less. I might have to pull the trigger


----------



## fEJK

mirkendargen said:


> X670? BIOS updated to the latest? I saw like every X670E board got a BIOS update a few days ago with similar language to this:
> 
> View attachment 2575828


Z690 Hero - Updated to latest bios and no luck unfortunately


----------



## Baasha

bmagnien said:


> I havent seen anyone in this thread with an FE in hand. Or even purchased and waiting.


Exactly - has anyone here been able to legit purchase a 4090 FE through Best Buy (not scalping)?

I saw one guy on Reddit who used the 'special link' yesterday say he was able to get it after like 10 tries.


----------



## PLATOON TEKK

Now that I think about it, I feel this harsh “voltage cap” by Nvidia is their way of nerfing cross-flashing with a higher power limit.

When I first mentioned voltage in this post a few people noted it might not even matter (makes sense, due to the diminishing utility of upped voltage on last few series).

However, the people who have voltage control (KPs ⚰, HOFs or an EVC) are faaaar less than those who simply download nvflash. The card also has to have an i2c header or “unlocked” software control. No 4090 announced has this (waiting on more pcb pics for header info & elmor). Wonder what HOF will be this time.

Nvidia playing chess.


----------



## J7SC

PLATOON TEKK said:


> Now that I think about it, I feel this harsh “voltage cap” by Nvidia is their way of nerfing cross-flashing with a higher power limit.
> 
> When I first mentioned voltage in this post a few people noted it might not even matter (makes sense, due to the diminishing utility of upped voltage on last few series).
> 
> However, the people who have voltage control (KPs ⚰, HOFs or an EVC) are faaaar less than those who simply download nvflash. The card also has to have an i2c header or “unlocked” software control. No 4090 announced has this. Wonder what HOF will be this time.
> 
> *Nvidia playing chess.*


...does Elmor (ElmorLabs) play chess, EVC2x to Queen...?


----------



## coelacanth

lordkahless said:


> Yeah I have exclusive access. Tried to make an order. Said unavailable nearby. Did the chat. I said why can't you just ship it??? They said you can pick it up at your store on the 18th. I said great! But I cant make an order. They said we will make one for you. I verfied my details. They responded saying we are sorry we are experiencing system issues and are unable to make an order for you. Please try back later. Do they even know what they are doing??


Hard to tell what the actual issue is but Gamers Nexus stated that "The FE cards are still relatively low volume worldwide." Given the positive press the FE cards got a day before the partner cards it seems like low(?) volume of the FE cards plus tons of demand means that they will be hard to get.


----------



## WayWayUp

So the 2nd-fastest 4x SLI 1080 Ti in TSE graphics was 18,980, and that was with LN2.

The 4090 absolutely demolished this already with a score of 22,702 on a single card the first day of release.
LOL
Actually, the 4090 has beaten every single 4x SLI score already.
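For scale, the margin those two scores imply (numbers taken from the post above):

```python
# TSE graphics score comparison from the post above.
ln2_sli_1080ti = 18_980   # 2nd-fastest 4x SLI 1080 Ti run, on LN2
single_4090    = 22_702   # day-one single-card 4090 run

lead = single_4090 / ln2_sli_1080ti - 1
print(f"single 4090 vs LN2 quad-SLI record: +{lead:.1%}")  # ~+19.6%
```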


----------



## PLATOON TEKK

J7SC said:


> ...does Elmor (ElmorsLabs) play chess, EVC2x to Queen...?


haha I had just finished editing the post to add elmor progress potential. Let’s hope! Apparently it’s EVC2SE now, improvements over EVC2X:


Headers for I2C1/I2C2/UART changed from 2.54mm pitch pin headers to JST XH headers
Improved I2C1 performance at 400 KHz
Added fine-grain adjustments for SRC1/SRC2 DAC channels (0.01µA steps at Vfb = 0.6V)


----------



## WayWayUp

Wouldn't it be easier to get a HOF card than to bother messing with attaching an EVC?

Hopefully Kingpin signs with another company and they can release a card with voltage control.


----------



## mirkendargen

WayWayUp said:


> wouldnt it be easier to get a HOF card than to bother messing and attaching an EVC ?
> 
> hopefully kingpin signs with another company and they can release a card with voltage control


You underestimate how few HOF cards are made and how hard they are to get lol.


----------



## Thebc2

I can’t speak to everyone’s experience, but at our local Microcenter (Boston) they got roughly 100 4090s on release day. They had a random drawing for spots beginning at 8:30 am by QR code registration; at that time there were only 44 people in line, 58 by 9 am (when they opened), and they didn’t actually fully sell out until the afternoon (the last remaining cards were all TUF as well). I got there at 8:40, got on the list, and was out in about 30 mins.

To contrast with the 3090 release: there were well over 100 people camped overnight and they got maybe a dozen crappy initial-release-wave cards. Checking a few other Microcenter Discords, it looks like most stores got 100-150 cards each for the 4090s.

So yes, I would say there was both higher stock and lower demand (at least at Microcenter here) compared to the 3090 launch, but it looks like there were either supply-chain issues with other outlets or just much, much higher online demand?


Sent from my iPhone using Tapatalk Pro


----------



## sakete

Thebc2 said:


> I can’t speak to everyone’s experience. But at our local Microcenter (Boston) they got roughly 100 4090s on release day. There had a random drawing for spots beginning at 8:30 am by QR code registration, at that time there were only 44 people in line. 58 by 9 am (when they opened) and they didn’t actually fully sellout until the afternoon (last remaining cards were all TUF as well). I got there at 8:40, got on the list and was out in about 30 mins.
> 
> To contrast the 3090 release. There were well over 100 people camped overnight and they got maybe a dozen crappy initial release wave cards. Checking a few other Microcenter discords, it looks like most stores got 100-150 cards each for the 4090s.
> 
> So yes, I would say there was both higher stock and lower demand (at least at Microcenter here) compared to the 3090 launch, but it looks like there was either supply chain issues with other outlets or just much much higher online demand?
> 
> 
> Sent from my iPhone using Tapatalk Pro


With the online demand being driven in big part by bots, I'm sure.


----------



## PLATOON TEKK

WayWayUp said:


> wouldnt it be easier to get a HOF card than to bother messing and attaching an EVC ?
> 
> hopefully kingpin signs with another company and they can release a card with voltage control


If the card has the right header (normally i2c) and the EVC can “communicate” with the controllers sufficiently, soldering isn’t even necessary, as long as the contacts are solid.

If you check one of my earlier 3090s thread replies, I did this to a Strix that I wanted to run the EVC on temporarily. Didn’t leave a single trace I had used it either.

edit:


Spoiler: Non solder EVC2X 



Platoon Tekk EVC2


----------



## carlhil2

Thebc2 said:


> I can’t speak to everyone’s experience. But at our local Microcenter (Boston) they got roughly 100 4090s on release day. There had a random drawing for spots beginning at 8:30 am by QR code registration, at that time there were only 44 people in line. 58 by 9 am (when they opened) and they didn’t actually fully sellout until the afternoon (last remaining cards were all TUF as well). I got there at 8:40, got on the list and was out in about 30 mins.
> 
> To contrast the 3090 release. There were well over 100 people camped overnight and they got maybe a dozen crappy initial release wave cards. Checking a few other Microcenter discords, it looks like most stores got 100-150 cards each for the 4090s.
> 
> So yes, I would say there was both higher stock and lower demand (at least at Microcenter here) compared to the 3090 launch, but it looks like there was either supply chain issues with other outlets or just much much higher online demand?
> 
> 
> Sent from my iPhone using Tapatalk Pro


I got there at about 9:15 and got a Gaming Trio...


----------



## coelacanth

sakete said:


> With the online demand being driven in big part by bots, I'm sure.


Bots and scalpers for sure. I managed to get a Zotac Trinity 2 hours after launch from Newegg, but then cancelled the order. I was thinking of keeping the order and selling it to someone on here at cost though just to make sure it didn't end up getting scalped.


----------



## coelacanth

Thebc2 said:


> I can’t speak to everyone’s experience. But at our local Microcenter (Boston) they got roughly 100 4090s on release day. There had a random drawing for spots beginning at 8:30 am by QR code registration, at that time there were only 44 people in line. 58 by 9 am (when they opened) and they didn’t actually fully sellout until the afternoon (last remaining cards were all TUF as well). I got there at 8:40, got on the list and was out in about 30 mins.
> 
> To contrast the 3090 release. There were well over 100 people camped overnight and they got maybe a dozen crappy initial release wave cards. Checking a few other Microcenter discords, it looks like most stores got 100-150 cards each for the 4090s.
> 
> So yes, I would say there was both higher stock and lower demand (at least at Microcenter here) compared to the 3090 launch, but it looks like there was either supply chain issues with other outlets or just much much higher online demand?
> 
> 
> Sent from my iPhone using Tapatalk Pro


Brett from UFD Tech said his local Micro Center in Ohio had 100 4090s as well.


----------



## Mad Pistol

4090 finally shipped from Newegg. Scheduled to arrive on Tuesday.

I will have my new case and PSU tomorrow, so I'll have plenty of time to get it all transferred over to the new case.


----------



## sakete

Ok lol, playing my favorite game Rocket League (fps capped at 140, because my monitor goes to 144hz, so no point in going higher), I got 90W power draw max, or 20% of TDP, with temps hovering at 45C. It's not breaking a sweat at all.


----------



## WayWayUp

ouch

so much for AMD savior

delayed and weak


----------



## dr/owned

Feel like this thread needs someone else put in charge of OP cause that dude is a ghost.


PLATOON TEKK said:


> If the card has the right header (normally i2c) and the EVC can “communicate” with the controllers sufficiently, soldering isn’t even necessary, as long as the contacts are solid.
> 
> If you check one of my earlier 3090s thread replies, I did this to a Strix that I wanted to run the EVC on temporarily. Didn’t leave a single trace I had used it either.
> 
> edit:
> 
> 
> Spoiler: Non solder EVC2X
> 
> 
> 
> Platoon Tekk EVC2


Elmor doesn't have the register mapping for the MP2891 that is used on a few cards like the MSI ones and Galax: https://elmorlabs.com/forum/topic/evc2-beta-software/?part=11#postid-707
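For anyone poking at this with an EVC2 in the meantime, here is a minimal sketch of the standard PMBus "Linear11" decoding that telemetry registers on VRM controllers in this class typically use. Whether the MP2891 exposes the usual registers is exactly the unmapped part Elmor mentions, so treat this as format reference only, not a statement about that chip:

```python
# Decode a 16-bit PMBus Linear11 word: value = Y * 2**N, where N is the
# top 5 bits and Y the low 11 bits, both two's-complement.  The format is
# defined by the PMBus spec; which registers a given VRM controller
# actually serves it on is device-specific.

def decode_linear11(raw: int) -> float:
    exponent = (raw >> 11) & 0x1F
    mantissa = raw & 0x7FF
    if exponent > 0x0F:      # sign-extend the 5-bit exponent
        exponent -= 0x20
    if mantissa > 0x3FF:     # sign-extend the 11-bit mantissa
        mantissa -= 0x800
    return mantissa * (2.0 ** exponent)

# Example: N = -2 (0b11110), Y = 100  ->  100 * 2**-2 = 25.0
print(decode_linear11((0b11110 << 11) | 100))  # 25.0
```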


----------



## PLATOON TEKK

dr/owned said:


> Feel like this thread needs someone else put in charge of OP cause that dude is a ghost.
> 
> Elmor doesn't have the register mapping for the MP2891 that is used on a few cards like the MSI ones and Galax: https://elmorlabs.com/forum/topic/evc2-beta-software/?part=11#postid-707


Damn. That’s not good news for now; thanks for the heads up.


----------



## Thebc2

coelacanth said:


> Bots and scalpers for sure. I managed to get a Zotac Trinity 2 hours after launch from Newegg, but then cancelled the order. I was thinking of keeping the order and selling it to someone on here at cost though just to make sure it didn't end up getting scalped.


Stock at Amazon, Best Buy, Newegg and other online retailers definitely seemed low to completely non-existent depending on the outlet, especially with how everyone was framing availability.


Sent from my iPhone using Tapatalk Pro


----------



## slayer6288

Anyone else try the new NV flash? I assume it still doesnt work?


----------



## sakete

Thebc2 said:


> Stock at Amazon, Best Buy, Newegg and other online retailers definitely seemed low to completely non-existent depending on the outlet, especially with how everyone was framing availability.
> 
> 
> Sent from my iPhone using Tapatalk Pro


My local micro center did seem to have a lot in stock. Though that stock should all be gone now.


----------



## sakete

slayer6288 said:


> Anyone else try the new NV flash? I assume it still doesnt work?


It didn't detect any Nvidia adapters on my side when I tried the latest I downloaded earlier this morning.


----------



## BPS Customs

slayer6288 said:


> Anyone else try the new NV flash? I assume it still doesnt work?


It doesn’t work, tried it earlier


----------



## Rbk_3

sakete said:


> My local micro center did seem to have a lot in stock. Though that stock should all be gone now.


Canada Computers had quite a lot of stock in store in Canada. They had some sitting on the shelf most of the day at my store.


----------



## ZealotKi11er

Rbk_3 said:


> Canada Computers had quite a lot of stock in store in Canada. They had some sitting on the shelf most of the day at my store.


Yeah, it was the same at my local CC. I purchased mine at 11 am.


----------



## Sir Beregond

WayWayUp said:


> View attachment 2575841
> 
> 
> ouch
> 
> so much for AMD savior
> 
> delayed and weak


Literally no one expected RDNA3 to have better RT performance than 40-series. Same goes for Raster.

What delay? We knew that Nov 3rd was just the announcement.

I think the hope most people had was that the pricing would be much better than 40-series is.


----------



## WayWayUp

AMD engineers were leaking that they would double their raster performance, and early leaks suggested that they might even beat Nvidia in raster.
To be slower is a big fail.

We already know about DLSS, productivity, drivers, ray tracing, streaming, etc.
AMD needs to be the best at something.

Any price over $1k is DOA if they can't match the raster.

But there is still a lot of hope for lower products in the stack, though that's also where DLSS 3 becomes useful for, say, the 4060.


----------



## slayer6288

BPS Customs said:


> It doesn’t work, tried it earlier


Damn even with the display driver disabled still a no go?


----------



## Sir Beregond

WayWayUp said:


> amd engineers were leaking that they would double their raster performance and early leaks suggested that they might even beat nvidia in raster
> to be slower is a big fail
> 
> we already know about dlss, productivity, drivers, raytracing, streaming, ext.
> amd needs to be the best at something.
> 
> any price over $1k is DOA if they cant match the raster
> 
> but there is a lot of hope still for lower products in the stack but thats also where dlss 3 becomes useful for say the 4060 for instance


Nvidia rumors were also claiming double raster up until recently, and that's not true either. So I don't see what the doom-and-gloom news here is.


----------



## sakete

slayer6288 said:


> Damn even with the display driver disabled still a no go?


Nope, no go. I oddly also can't export my bios from GPU-Z, keep getting an error.


----------



## Sir Beregond

RetroWave78 said:


> Artificial scarcity.


More or less what I was trying to say earlier. Nvidia will work this however they need to in order to clear 30-series stock. The artificial scarcity isn't aimed at the 4090 buyer who already has a 3090/3090 Ti. It is definitely aimed at anyone with older 10 or 20-series cards, or lower-rung 30-series cards.



> note the recent announcement that RDNA3 will be delayed.


OK, you're the second person to mention a "delay". Maybe I missed something.


----------



## Mad Pistol

WayWayUp said:


> View attachment 2575841
> 
> 
> ouch
> 
> so much for AMD savior
> 
> delayed and weak


AMD probably wasn't expecting Nvidia to drop a bomb with the RTX 4090. Hell, none of us were expecting it. Rumors are always "Next gen GPUs are gonna be hellafast!!!", but they rarely live up to the hype.

In this case, the product lived up to the hype, and everyone is shocked.


----------



## Rei86

RetroWave78 said:


> Artificial scarcity.


MLiD has reported that Nvidia did a buyback on Ampere and gave AIBs cuts. 
Then we got word about EVGA leaving Nvidia over all the issues they had with them, another one being having so many 30-series cards still in stock.
So yeah, it's believable that Nvidia will pump the market with 4090 cards and hardly any 4080s until all the 30 series clears (all as in the ones that matter: 3080, 3080 12GB, 80 Ti, 90, 90 Ti), and only then will we see the full lineup.



RetroWave78 said:


> I suppose we all get to circle jerk each other until this dies down. AMD will not save us. Lisa Su and Jensen Juang are related, note the recent announcement that RDNA3 will be delayed. This is like how the mainstream Democrat and Republican parties in the U.S. are owned by Big Pharma, the Military Industrial Complex and behind them BlackRock and Vanguard. They maintain this facade of competition of ideas and interests but in reality they meet in cigar smoke filled dark rooms and laugh and pat each other on the back as to how they are screwing over the masses.
> 
> 
> 
> 
> 
> 
> Jen-Hsun is Lisa Su's uncle
> 
> 
> Did you know, Jen-Hsun from Nvidia is actually Amd's Lisa Su's uncle. They're competitors and family at the same time. Maybe one told the other at a family meeting about their plans and so the other was forced to make such good GPUs. Source: https://babeltechreviews.com/nvidias-ceo-is-the-uncle-o...
> 
> 
> 
> 
> linustechtips.com


Not a big gotcha; it came out that she was his niece when she took the job.
Guess a lot of people didn't pay attention ¯\_(ツ)_/¯
They didn't keep it a secret, but they don't shout it out at every meeting that they are related.


----------



## EastCoast

Mad Pistol said:


> AMD probably wasn't expecting Nvidia to drop a bomb with the RTX 4090. Hell, none of us were expecting it. Rumors are always "Next gen GPUs are gonna be hellafast!!!", but they rarely live up to the hype.
> 
> In this case, the product lived up to the hype, and everyone is shocked.


I am *not* "shocked"; I expect this level of performance from one gen to the next. But when you think that way, you also don't expect to pay more than usual for it.


----------



## Antsu

ZealotKi11er said:


> So the clk Nvidia report is fake now not actual clk. Nice.


It's been this way since the 1000 series at least; just look at the effective clock when overclocking/undervolting and you will know what the actual performance will be.


----------



## Luda

kx11 said:


> A lot better than i thought, the mobo was in a rough state early 2022 but after a month of bios updates it got stable and steady running, the gpu is my 1st ever WF and i like what it can do


I don't think I've ever seen one powered up with the screen on; I did a double take. Very cool, glad it runs as well as it looks now :-D


----------



## sblantipodi

This evening I tried the card on my brother's PC (he has an Asus Z690 Hero);
it works like a charm on his PC, but boot loops on mine.

At this point I really don't know where the problem is.

Tomorrow I'll try a brand new 1500W power supply, but I doubt it's the power supply, because the PSU in my PC is an HX1200i that works well with a 2080 Ti.

It could be some incompatibility with my Z690 Extreme.


----------



## sakete

sblantipodi said:


> This evening I tried the card on my brother's PC, he has an Asus Z690 Hero,
> it works like a charm on his PC, boot loop on my pc.
> 
> At this point I really don't know where is the problem.
> 
> Tomorrow I'll try a brand new 1500W power supply but I doubt that it's the power supply because the power supply I'm using on my pc is am HX1200i that works well with a 2080ti.
> 
> It could be some incompatibility with my Z690 Extreme


Good to hear it's not the card itself. Maybe some weird Bios issue with your Z690. Are there any BIOS updates available for your mobo? Maybe try resetting your mobo to defaults and see how that works? (backup your settings first though so you can do an easy restore).


----------



## katates

What do you think about the MSI Gaming X Trio?
I am between this and the Inno3D X3 OC, and the MSI is a little bit pricey.


----------



## J7SC

Sir Beregond said:


> Nvidia rumors were also claiming double raster too up until recently and that's not true. So I don't see what the doom and gloom news here is.


...agree - rumours about all sorts of things abound...below published within a day of each other... 🥴


----------



## yt93900

RTX 4090 Aorus Xtreme w. 360mm AIO, OC'ed, fans at 100%:








I scored 33 983 in Time Spy
(Intel Core i9-12900K, NVIDIA GeForce RTX 4090, 32768 MB, 64-bit Windows 10)
www.3dmark.com




Compared to previous 3090 Strix, OC'ed:








I scored 20 667 in Time Spy
(Intel Core i9-12900K, NVIDIA GeForce RTX 3090, 32768 MB, 64-bit Windows 10)
www.3dmark.com





Shame Gigabyte did a big stupid with the radiator fans: they have a custom connector!! Both the fans and the 3-way fan splitter cable. I was already removing them and mounting 12x25 Chromaxes until I noticed the connector is completely different. Now waiting for someone to tear down his 4090 Xtreme to see if the splitter can be replaced with a normal PWM one.
Regarding the temps, the stock curve won't ramp the fans above the default 30% until somewhere around 56-58°C; ~58-60°C seems to be the default temp target.


----------



## Blameless

BigMack70 said:


> Just reduce the power target. Derbauer already showed don't lose much performance knocking 100-150W off the card by simple power target adjustment.


Most of my undervolts involve maxing out the power target and significantly reducing voltage to keep heat/noise down while still allowing for peak performance in outlier loads. For example, my remaining 3080 has a curve that maxes out at 900mV, with most games using ~200W, but Path of Exile with global illumination enabled will still pull upwards of 400W at 900mV. It's a similar deal with my 6800 XT...a 100mV undervolt, but I doubled the stock power limit and can still occasionally max it out in outlier applications.

Even if the basic F/V curve results in some degree of clock stretching, or it manipulates clock domains that cannot be directly edited yet, it's still going to be very useful for dialing in something close to the behavior I'm looking for. It's good to know its limitations, and maybe we'll see better tools (3rd-party tools for this on the NVIDIA side have lagged seriously behind what's been available for AMD via MPT), but neither a mild change to curve functionality nor an extremely generous power limit is going to kill undervolting.



bmagnien said:


> Dropping power limit won't effect pure raster load performance as much as fully loaded RT and Tensor core loads. I'd take the '-150w for -5% performance loss' with a big grain of salt because that won't be the case in heavier loads.


Yes.



Baasha said:


> I bought 4x RTX 3090 Ti's and 6x RTX 3090s and I don't have their "invite" on GFE.


I've had a dozen high-end NVIDIA GPUs in the last five years, but haven't dared install GFE bloatware since 2013. So, I guess I'm not likely to see one of these invites either.



Mad Pistol said:


> 4090 finally shipped from Newegg. Scheduled to arrive on Tuesday.
> 
> I will have my new case and PSU tomorrow, so I'll have plenty of time to get it all transferred over to the new case.


Which one did you get?



Sir Beregond said:


> Nvidia rumors were also claiming double raster too up until recently and that's not true. So I don't see what the doom and gloom news here is.


NVIDIA claimed up to double the RTX 3090 (not Ti). There are outliers where this is achieved. Hell, if you throw enough pixels or crazy shader effects at things, it's probably broadly true...even though the overwhelming majority of consumers are never going to see it in real-world scenarios.



WayWayUp said:


> amd engineers were leaking that they would double their raster performance and early leaks suggested that they might even beat nvidia in raster
> to be slower is a big fail
> 
> we already know about dlss, productivity, drivers, raytracing, streaming, ext.
> amd needs to be the best at something.
> 
> any price over $1k is DOA if they cant match the raster


Leaked RDNA3 specs have been all over the place until relatively recently when more credible figures (that have reportedly been set in stone for years) were detailed. AMD probably will double their theoretical rasterization performance, which still may very well result in a card that is generally slower at rasterization than the 4090 in practice.

That said, I don't think AMD needs to match or beat the 4090 to justify over 1000 USD asking price for a card. It just has to beat whatever NVIDIA has in the same price segment. My bet is on the fastest initial Navi31 part competing mostly with the 4080 16GB.



WayWayUp said:


> but there is a lot of hope still for lower products in the stack but thats also where dlss 3 becomes useful for say the 4060 for instance


I'm not sure enough titles will support DLSS3 well enough to be a big selling point, but I'm sure NVIDIA will hype up the feature anyway. I also don't think DLSS 3.0 is most useful on lower-end parts...you want a high enough frame rate to mask latency.


----------



## smushroomed

Does the gigabyte 4090 OC use a reference board?









GeForce RTX™ 4090 GAMING OC 24G Key Features | Graphics Card - GIGABYTE Global
www.gigabyte.com


----------



## DarkRadeon7000

Hey everyone. As I understand, the 12VHPWR connector is not supposed to be bent otherwise it would cause a dangerous situation. I have a Lian Li Lancool 2 Mesh and the connector is somewhat bent after attaching the panel. Does anyone think this is safe to use or should I wait for the 90 degree adapters?


----------



## dr/owned

smushroomed said:


> Does the gigabyte 4090 OC use a reference board?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GeForce RTX™ 4090 GAMING OC 24G Key Features | Graphics Card - GIGABYTE Global
> 
> 
> Discover AORUS premium graphics cards, ft. WINDFORCE cooling, RGB lighting, PCB protection, and VR friendly features for the best gaming and VR experience!
> 
> 
> 
> 
> www.gigabyte.com


No.

The only ones that are known to use a reference board are INNO3D boards.

Reference designs have 2 columns of button-capacitors around the VRMs (I think):


----------



## Blameless

J7SC said:


> ...agree - rumours about all sorts of things abound...below published within a day of each other... 🥴
> View attachment 2575867


Both can be true.

There is surely room on the die, the FE and reference PCB, and in Micron's GDDR6X product stack to conjure an AD102-based card that is 20% faster than the RTX 4090.

There have likely been several potential configurations of such a part that have been tested, and some probably aren't viable...some may well have immolated themselves. If the required power/performance target is not being met by much early silicon, it could take a long time to stockpile enough dies for a product launch; which likely suits NVIDIA just fine.

So, I could completely believe that NVIDIA has been binning 4090 Ti dies for months already, that three proposed configs have been cancelled, and that the product is still on schedule.



katates said:


> What do you think about Msi Gaming X trio?
> I am between this and Inno3D X3 OC, and msi is a little bit pricey


The Inno3D looked pretty impressive for its size in the few reviews I've seen of it. Might try to import one if I can't secure an FE sometime soon.



DarkRadeon7000 said:


> Hey everyone. As I understand, the 12VHPWR connector is not supposed to be bent otherwise it would cause a dangerous situation. I have a Lian Li Lancool 2 Mesh and the connector is somewhat bent after attaching the panel. Does anyone think this is safe to use or should I wait for the 90 degree adapters?
> 
> View attachment 2575872
> View attachment 2575873


Probably fine, but I'd check its temperature after a long stress-testing run.



dr/owned said:


> No.
> 
> The only ones that are known to use a reference board are INNO3D boards.
> 
> Reference designs have 2 columns of button-capacitors around the VRMs (I think):
> 
> View attachment 2575874


The Ampere reference board could take cans or SMDs, but that doesn't seem to be the case here.


----------



## sakete

Asus TUF OC, all stock, 100% power limit:









I scored 27 126 in Time Spy
(AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090, 32768 MB, 64-bit Windows 10)
www.3dmark.com


----------



## DarkRadeon7000

Blameless said:


> Both can be true.
> 
> There is surely room on the die, the FE and reference PCB, and in Micron's GDDR6X product stack to conjure an A102 based card that is 20% faster than the RTX 4090.
> 
> There have likely been several potential configurations of such a part that have been tested and some probably aren't viable...some may well have immolated themselves. If the power/performance target required is not being met by much early silicon, it could take a long time to stock pile enough dies to have a product launch; which likely suits NVIDIA just fine.
> 
> So, I could completely believe that NVIDIA has been binning 4090 Ti dies for months already, that three proposed configs have been cancled, and that the product is still on schedule.
> 
> 
> 
> The Inno3D looked pretty impressive for it's size in the few reviews I've seen of it. Might try to import one if I can't secure an FE sometime soon.
> 
> 
> 
> Probably fine, but I'd check it's temperatures after a long stress testing run.
> 
> 
> 
> The Ampere reference board could take cans or SMDs, but that doesn't seem to be the case here.


You mean the temperature of the GPU right? I can't measure the temp of that connector


----------



## smushroomed

dr/owned said:


> No.
> 
> The only ones that are known to use a reference board are INNO3D boards.
> 
> Reference designs have 2 columns of button-capacitors around the VRMs (I think):
> 
> View attachment 2575874


Thanks, I'm going to go ahead and order EKWB's block because it does list direct compatibility with that card.


----------



## LunaP

Just a heads up make sure you have this enabled in GE, else you might miss it, I don't really use GE so noticed this was off.


----------



## nycgtr

The FE is honestly the only card folks should be after from a price/potential perspective. Nvidia made all the other AIBs irrelevant with their price points this time around. As for the whole uncle thing: I watched this on a Taiwanese newscast, and that relationship is far from what you would call uncle. For example, my cousin's daughter is considered my niece in Chinese. Their relationship is even more distant than that; there is no direct translation of it into English. As for the shortages, I'm sure it's controlled-rollout related. We will know in 2-3 weeks when the folks that bought just to resell have their return period come up.


----------



## lordkahless


LunaP said:


> Just a heads up make sure you have this enabled in GE, else you might miss it, I don't really use GE so noticed this was off.
> View attachment 2575878


I went and installed this junk on 3 computers today lol. Triple the odds right? The crazy things we do to get a 4090...


----------



## MIST3RST33Z3

I’m hoping nvflash is updated for the 40XX series soon! I picked up a Zotac extreme card and am hoping to flash the gigabyte OC vbios for the +33% power limit.
Anyone take a look at this PCB yet? Is it a reference design, or custom? If custom, what is everyone’s opinion of the board design and components?
Also hoping someone releases a water block for it so I can really let it shine.


----------



## yt93900

I'm wondering if any power limit mods are actually needed? The Waterforce Xtreme can go +11% on the power limit in Afterburner and I haven't seen it hit more than 500W in TS / Firestrike Ultra stress test. Haven't seen any "perfcap reason 'PWR'" either.


----------



## DarkRadeon7000

Can anyone please help me figure out why my GPU usage is not maxed out at 100% with RT Psycho settings and DLSS Quality?
Cyberpunk 2077 C 2020 by CD Projekt RED 2022 10 14 02 47 45 - YouTube

My system specs are

I7 12700K
32GB DDR4 3600MHZ CL14 RAM
Zotac RTX 4090 Amp Extreme Airo
Corsair HX1000 PSU
I am playing at 3440x1440 with DLSS Quality. GPU usage is never locked at 99% and fluctuates frequently even when my CPU is not maxed out. Increasing the resolution using DLDSR solves this. Why is my GPU usage not maxed out when my CPU is also not maxed out? The game is running RT Psycho settings with DLSS Quality.


----------



## DarkRadeon7000

MIST3RST33Z3 said:


> I’m hoping nvflash is updated for the 40XX series soon! I picked up a Zotac extreme card and am hoping to flash the gigabyte OC vbios for the +33% power limit.
> Anyone take a look at this PCB yet? Is it a reference design, or custom? If custom, what is everyone’s opinion of the board design and components?
> Also hoping someone releases a water block for it so I can really let it shine.


Can you take the picture at an angle? How much is the 12VHPWR adapter bent?


----------



## LunaP

lordkahless said:


> I
> 
> I went and installed this junk on 3 computers today lol. Triple the odds right? The crazy things we do to get a 4090...


Good idea, I should do this too lol


----------



## dr/owned

MIST3RST33Z3 said:


> I’m hoping nvflash is updated for the 40XX series soon! I picked up a Zotac extreme card and am hoping to flash the gigabyte OC vbios for the +33% power limit.
> Anyone take a look at this PCB yet? Is it a reference design, or custom? If custom, what is everyone’s opinion of the board design and components?
> Also hoping someone releases a water block for it so I can really let it shine.


The Amp Extreme isn't a reference card. It looks vaguely like they started with the reference design, but it's different.


----------



## MIST3RST33Z3

yt93900 said:


> I'm wondering if any power limit mods are actually needed? The Waterforce Xtreme can go +11% on the power limit in Afterburner and I haven't seen it hit more than 500W in TS / Firestrike Ultra stress test. Haven't seen any "perfcap reason 'PWR'" either.


I’m getting the PWR limit, but I only have +10%. Would be funny if all that’s needed is the extra +1% to remove that limit; then I’d imagine I’d be voltage limited. I’m getting a 20.5k graphics score in TSE, with +175 on the core and +1425 on memory. Hitting a maximum of 62C, so there is definitely some room left in it. What surprised me is that I’m averaging 2975MHz and peaking at 3050MHz. I know the Zotac cards weren’t great for the 3090, but this card actually seems like a great buy compared to some of the others. Looks to have some of the beefier VRMs compared to the other AIBs, besides the Strix of course.


----------



## slayer6288

DarkRadeon7000 said:


> Hey everyone. As I understand, the 12VHPWR connector is not supposed to be bent otherwise it would cause a dangerous situation. I have a Lian Li Lancool 2 Mesh and the connector is somewhat bent after attaching the panel. Does anyone think this is safe to use or should I wait for the 90 degree adapters?
> 
> View attachment 2575872
> View attachment 2575873


Just like any power cable, bending in a curve that does not kink the wires is perfectly fine. If you kinked the cables, that would be an issue. A bend with no kinks is not a problem, and whoever said it was doesn't understand electricity and wiring.


----------



## Arizor

DarkRadeon7000 said:


> Can anyone pls help why my GPU usage is not maxed out at 100% with RT Psycho settings and DLSS Quality?
> Cyberpunk 2077 C 2020 by CD Projekt RED 2022 10 14 02 47 45 - YouTube
> 
> My system specs are
> 
> I7 12700K
> 32GB DDR4 3600MHZ CL14 RAM
> Zotac RTX 4090 Amp Extreme Airo
> Corsair HX1000 PSU
> I am playing at 3440X1440 with DLSS Quality. GPU usage is never locked at 99% and fluctuates frequently even when my CPU is not maxed out. Increasing the resolution using DLDSR solves this. Why is my GPU usage not maxed out when my CPU is also not maxed out? Game is running RT Psycho settings with DLSS Qualirty


1440p with DLSS Quality uses, I believe, an internal resolution of about 1080p, so it's likely you're hitting CPU limits. Try 4K just to see if it uses 99% GPU.

Also cyberpunk can be very weird with GPU usage, sometimes after changing a setting you’ll need to restart the game to get max GPU, other times certain parts of the city don’t use all the gpu.
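For reference, DLSS modes scale the render resolution per axis; here's a quick sketch using the commonly published scale factors (actual values can vary per title, so treat these as approximations):

```python
# Internal render resolution for common DLSS 2.x modes.
# Scale factors below are the widely cited per-axis values, not guaranteed
# to match every game's implementation.
DLSS_SCALES = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def render_resolution(width, height, mode):
    s = DLSS_SCALES[mode]
    return round(width * s), round(height * s)

# 3440x1440 ultrawide at DLSS Quality
w, h = render_resolution(3440, 1440, "Quality")
print(w, h, w * h / (1920 * 1080))  # ~2293x960, roughly 1.06x the pixels of 1080p
```

So at 3440x1440, DLSS Quality renders roughly a 1080p-class pixel count internally, which is why the CPU can become the limit on a 4090.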


----------



## MIST3RST33Z3

DarkRadeon7000 said:


> Can you take the picture at an angle. How much is the 12VHPWR adapter bent?


Sure, here you go. It doesn’t have too much of a bend.


----------



## Antsu

A lot of talk about power limits, but no real data! What is the highest peak you guys reach in TS / TSE GT2 with a locked 1.100V? I bet it's really close to 600W.
Hard mode: Quake II RTX with the resolution set as high as possible (the settings menu has a render-resolution slider that goes to 200% too). This was hitting 700W on my 3090; not even close to FurMark levels of 850W @ 1.0V, but still much higher than your average game. We all know already that 600W is enough for normal gaming, but what about the power-hungry loads?


----------



## Baasha

Glad to see many who have the 4090 already.

Just a quick tip: for those saying you don't see the power usage get anywhere near 600W, play Metro Exodus Enhanced Edition with ray tracing on and watch that beast of a game eat up the power! It consistently tops 480W on my 3090 Ti FTW3 Ultra.


----------



## ZealotKi11er

Antsu said:


> A lot of talk about powerlimits, but no real data! What is highest peak you guys reach on TS / TSE GT2 with locked 1.100V? I bet it's really close to 600W.
> Hard mode: Quake II RTX with the resolution set as high as possible (the settings menu has a render resolution slider that goes to 200% too). This was hitting 700W on my 3090, not even close to FurMark levels of 850W @ 1.0V, but still much higher than your average game. We all know already that 600W is enough for normal gaming, but what about the power hungry loads?


Even FurMark does not hit 600W at 1.1V


----------



## RaMsiTo

Baasha said:


> Glad to see many who have the 4090 already.
> 
> Just a quick tip - for those saying you don't see the power usage get to anywhere near 600W, play Metro Exodus Enhanced Edition with Ray Tracing "ON" and watch that best of a game eat up the power! It consistently tops 480W on my 3090 Ti FTW3 Ultra.


Yes, 600W.


----------



## yt93900

GTA V 3840x1600 capped at 144FPS with pretty much everything maxed, core +120MHz in Afterburner:


Only raises the fan speed by 4%; the 360mm rad is doing the job. Ambient is 19.5°C atm, open case.
This might be the first card that doesn't completely die when driving through grass set to Ultra in GTA V...~90% utilization, but it doesn't drop below 100 FPS.
Gonna try Quake II RTX now.


----------



## BigMack70

Blameless said:


> Most of my undervolts involve maxing out the power target and significantly reducing voltage to keep heat/noise down while still allowing for peak performance in outlier loads


Guess we'll just have to see if 40 series behaves the same way. I should have my card tomorrow to start testing. Some people are indicating UV just doesn't follow any of the behaviors it did for 30 series; we'll all be able to test it for ourselves soon.


----------



## sakete

I'm starting to think I might return my 4090. I really don't play games that often anymore, and the games that I do play regularly, like Rocket League, don't need this kind of power. It's ridiculous overkill for my needs. Might be better to spend that $2K elsewhere and look at the 50xx series when those get launched in 2-3 years, along with a broader system upgrade at that point.


----------



## yzonker

bmagnien said:


> I havent seen anyone in this thread with an FE in hand. Or even purchased and waiting.


No I meant the 600w AIBs?

No matter though. I just road tripped to Microcenter and got a TUF. 😁


----------



## MIST3RST33Z3

yzonker said:


> No I meant the 600w AIBs?
> 
> No matter though. I just road tripped to Microcenter and got a TUF. 😁


Lucky! I was headed to MC yesterday morning because they had a TUF in stock, got there and it was gone. Ended up “sucking it up” and getting the Zotac Extreme.


----------



## bmagnien

yzonker said:


> No I meant the 600w AIBs?
> 
> No matter though. I just road tripped to Microcenter and got a TUF. 😁


You still there and wanna pick me one up?


----------



## LunaP

sakete said:


> I'm starting to think I might return my 4090. I really don't play games that often anymore, and the games that I do play regularly, like Rocket League, don't need this kind of power. It's ridiculous overkill for my needs. Might be better to spend that $2K elsewhere and look at the 50xx series when those get launched in 2-3 years, along with a broader system upgrade at that point.


Since SLI is dead I'll still upgrade to the 5000 series once it releases, sadly. I really push my games (MMOs mainly), especially with GShade; those shaders really get up there but make things look so damn amazing.
VR too; that can't stop evolving.

If you're at 4K+ and pushing it, this is definitely (hopefully) going to be a good card for it. I didn't upgrade to the 3000 series since all the games I played were SLI compatible and utilized 100%, which was nice; even in games like FFXIV, people were getting nearly 40-50% less performance on their 3080/3090 than I was because of this, which would fall under 60 FPS (20-30 FPS for them) vs 70-90 for me with shaders on (else 120+ at 4K).

I'm holding out on upgrades till HEDT returns on the Intel side (unless it fails and AMD fully takes over; then I may have to finally swap), so still PCIe 3.0 for me sadly.

If you don't see yourself getting into anything new or different, then yeah, totally justified.


----------



## yt93900

Weird, I thought they all had a 600W upper power limit? Mine caps out at ~500-515W with the 111% slider; just tried Quake II RTX and it is indeed PWR capped.
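A quick sanity check on why the slider tops out around 500W. This assumes the Waterforce's default board power is the reference 450W, which I haven't verified:

```python
# Power-limit sliders are a percentage of the card's default board power.
# Assumption: the Waterforce Xtreme defaults to the reference 450 W.
def effective_limit_w(default_w: float, slider_pct: float) -> float:
    return default_w * slider_pct / 100

print(effective_limit_w(450, 111))  # 499.5 W, right where the observed cap sits
print(effective_limit_w(450, 133))  # 598.5 W, what a +33% BIOS would allow
```

499.5W lines up with the observed ~500-515W cap, so the card is likely just enforcing 111% of a 450W default rather than exposing a hard 600W ceiling.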


----------



## kx11

DarkRadeon7000 said:


> Hey everyone. As I understand, the 12VHPWR connector is not supposed to be bent otherwise it would cause a dangerous situation. I have a Lian Li Lancool 2 Mesh and the connector is somewhat bent after attaching the panel. Does anyone think this is safe to use or should I wait for the 90 degree adapters?
> 
> View attachment 2575872
> View attachment 2575873


Same problem here


----------



## dr/owned

MIST3RST33Z3 said:


> Sure, here you go. It doesn’t have too much of a bend.


Looks like the strain relief part of the cable doing its job. The pins aren't zero-tolerance in the connector...there's going to be some amount of wiggle room and that's ok.


----------



## Antsu

yt93900 said:


> Weird, I thought they all had 600W upper power limit? Mine caps out at ~500-515W, just tried Quake II RTX and it is indeed PWR capped.


How low does the voltage drop? This will tell you how much more you would need to not throttle.


ZealotKi11er said:


> Even Furmark does not hit 600W with 1.1v


Got a source for that? Amazing if true, but I have a really hard time believing this since my 2080Ti sipped 700W easy in FurMark, my 3090 would probably do 1000W+, but I stopped at 1.0V because the consumption started to skyrocket and didn't feel like pushing over the 850W it was already drawing.


----------



## mirkendargen

ZealotKi11er said:


> Even Furmark does not hit 600W with 1.1v


GN got 666W draw with a 600W power limit in Furmark. Definitely power limited.


----------



## sirneb

MIST3RST33Z3 said:


> Lucky! I was headed to MC yesterday morning because they had a TUF in stock, got there and it was gone. Ended up “sucking it up” and getting the Zotac Extreme.


Yeah, the TUF listing was bugged on the site; it showed out of stock last, but it was actually out of stock way before then.


----------



## sakete

LunaP said:


> Since SLI is dead I'll still upgrade to the 5000 series once it releases sadly, I really push my games (MMO's mainly), especially w/ Gshade , those shaders really get up there but make things look so damn amazing.
> VR too can't stop evolving there.
> 
> If you're 4k+ and pushing its def (hopefully gonna be a good card for it ) I didn't upgrade to the 3000 series since all games I played were SLI compatible and utilized 100% which was nice, and even in games like FFXIV people were getitng nearly 40%-50% less performance ( on their 3080/3090) than I was due to this which would fall under 60 ffps ( 20-30fps for them ) vs 70-90 for me w/ shaders on ( else 120+ 4k )
> 
> I'm holding out on upgrades till HEDT returns intel side ( unless it fails and AMD fully takes over then may have to finally swap, so still pcie 3.0 for me sadly here.
> 
> If you don't see yourself getting into anything new or diff then yeah totally justified.


I'm on 1440p, don't do VR, and don't expect to anytime soon. Also, if a year from now my situation changes and I get a 4K monitor, or games come out that I really want to play that my GPU can't handle, then I can always just buy a new card at that point. Or if I buy a new case (I do want a new case; I don't like my current one at all) and with moving stuff over I also get the itch for some new hardware, then sure. But the earliest I'd replace my mobo is with the next series of AMD chips (e.g. 8000/9000 series, whatever their naming convention is for desktop chips), or if Intel comes out with something worthwhile (both powerful and relatively energy efficient).

I'll probably make up my mind about this tomorrow, though.


----------



## yt93900

It drops from 1.05V to ~1.01V.


----------



## Mad Pistol

Blameless said:


> Which one did you get?


Just the standard Gigabyte Windforce. Nothing special.


----------



## Baasha

LunaP said:


> Just a heads up make sure you have this enabled in GE, else you might miss it, I don't really use GE so noticed this was off.
> View attachment 2575878


Did you get the link? I tried on all 3 of my rigs - all of them have 3090 Ti's - 2 with EVGA FTW3 Ultra and 1 with 3090 Ti FE SLI. None of them in GFE show the link for me?!


----------



## Antsu

yt93900 said:


> It drops from 1.05V to ~1.01V.


Okay, that doesn't seem too bad then. If I had to guess, you would probably need right around 600W to keep 1.100V from throttling. What resolution did you run this at? I've found the load gets increasingly hungry as the resolution increases, so make sure you are using at least 1440p and 200% render resolution. For the absolute worst-case scenario you can do DSR + 200% (and by DSR I specifically mean the old-style DSR at the bottom of the drop-down list, not the fancy Tensor-core-aided "new DSR").
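To put numbers on how brutal DSR plus a 200% render-resolution slider is, here's a sketch. It treats DSR factors as total-pixel multipliers (so 4.00x DSR is 2x per axis, matching how NVIDIA labels them) and the in-game slider as a per-axis scale; the slider semantics are an assumption that may differ per game:

```python
# Worst-case pixel load when stacking DSR with an in-game render-resolution
# slider. Assumption: the game's slider scales each axis; DSR factors scale
# the total pixel count (4.00x DSR = 2x per axis).
def shaded_pixels(width: int, height: int, dsr_factor: float,
                  render_scale_pct: float) -> int:
    axis = dsr_factor ** 0.5 * render_scale_pct / 100
    return int(width * axis) * int(height * axis)

native = 2560 * 1440
worst = shaded_pixels(2560, 1440, 4.0, 200)  # 4x DSR, then the 200% slider
print(worst / native)  # 16.0, i.e. sixteen times the native pixel load
```

Sixteen times the native pixel load is about as close to a worst case as a real renderer gets, which is why this combination is good for finding the true power ceiling.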


----------



## bastian

Baasha said:


> Exactly - has anyone here been able to legit purchase a 4090 FE through Best Buy (not scalping)?


I managed to get a 4090 FE through Best Buy Canada and I am not a scalper. Apparently Best Buy Canada had around 200 available for the whole country.


----------



## sirneb

bastian said:


> I managed to get a 4090 FE through Best Buy Canada and I am not a scalper. Apparently Best Buy Canada had around 200 available for the whole country.


If I had to guess, that's about the number for the US as well.


----------



## heptilion

DarkRadeon7000 said:


> Hey everyone. As I understand, the 12VHPWR connector is not supposed to be bent otherwise it would cause a dangerous situation. I have a Lian Li Lancool 2 Mesh and the connector is somewhat bent after attaching the panel. Does anyone think this is safe to use or should I wait for the 90 degree adapters?
> 
> View attachment 2575872
> View attachment 2575873


I am on HAF700 Evo and this still happens to me. I think it's fine.


----------



## schoolofmonkey

So, anyone come across any real reviews of PCIe Gen 3 performance on a 10900K/11900K?

I don't know how much it would bottleneck a 4090. I was going to go with a 13900K, but it's one or the other for now; if I can get another 12 months out of my 10900K until the next Intel socket drops, that'll be fine.
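For what it's worth, the raw bandwidth gap is easy to compute (both generations use 128b/130b line encoding); whether a 4090 actually saturates a Gen 3 x16 link in games is exactly what the reviews would need to show:

```python
# Theoretical PCIe x16 throughput per direction, in GB/s.
# 128b/130b line encoding applies to both Gen 3 and Gen 4.
def pcie_x16_gbs(gt_per_s: float) -> float:
    # per-lane GT/s * 16 lanes * encoding efficiency, converted from Gb to GB
    return gt_per_s * 16 * (128 / 130) / 8

print(round(pcie_x16_gbs(8.0), 2))   # Gen 3: 15.75 GB/s
print(round(pcie_x16_gbs(16.0), 2))  # Gen 4: 31.51 GB/s
```

So Gen 3 x16 offers half the bus bandwidth of Gen 4; the open question is how often a game actually pushes enough data over the bus for that to matter.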


----------



## lordkahless

Baasha said:


> Did you get the link? I tried on all 3 of my rigs - all of them have 3090 Ti's - 2 with EVGA FTW3 Ultra and 1 with 3090 Ti FE SLI. None of them in GFE show the link for me?!


No idea what the actual requirements are. I don't use GFE and only installed it for this. If they are actually looking at how intensively somebody uses GFE, then I likely am the worst candidate for some rewards.


----------



## Mad Pistol

schoolofmonkey said:


> So anyone come across any real reviews for PCIe gen 3 performance on a 10900k/11900k?
> 
> Don't know how much it would bottleneck a 4090, I was going to go with a 13900k, but it's one or the other for now, but if I could get another 12 months out of my 10900k for when the next Intel socket drops that'll be fine.


I'll let you know when I get mine. Currently running a 5800x3D on a B450 Tomahawk, which only supports PCIe Gen 3.


----------



## HiLuckyB

I got the invite to buy a Founders 4090 this morning, and I was able to order one. They're only doing in-store pickup, and right now it's saying I'll be able to pick it up Tuesday.


----------



## RaMsiTo

schoolofmonkey said:


> So anyone come across any real reviews for PCIe gen 3 performance on a 10900k/11900k?
> 
> Don't know how much it would bottleneck a 4090, I was going to go with a 13900k, but it's one or the other for now, but if I could get another 12 months out of my 10900k for when the next Intel socket drops that'll be fine.


Tomorrow I will try with my old 9900k; it has been through a 2080 Ti, 3090, and 3090 Ti and is still holding up to Intel gen 14.


----------



## LunaP

HiLuckyB said:


> I got the invite to buy a Founders 4090 this morning, and I was able to order one. They're only doing in-store pickup, and right now it's saying I'll be able to pick it up Tuesday.


How does that work anyways, did u just see a desktop notification or did it just pop up on ur screen?


----------



## HiLuckyB

LunaP said:


> How does that work anyways, did u just see a desktop notification or did it just pop up on ur screen?


I didn't see anything on my Pc, They sent me a email and I went to check the notification tab and it was there.


----------



## lordkahless

LunaP said:


> How does that work anyways, did u just see a desktop notification or did it just pop up on ur screen?


I think the article on Videocardz said that some people got a popup notification from GFE and some got an email. Didn't seem to be any reason behind it. But the statement from Nvidia regarding this program said it was experimental and going to be very limited. Might have already ended.


----------



## schoolofmonkey

RaMsiTo said:


> Tomorrow I will try with my old 9900k; it has been through a 2080 Ti, 3090, and 3090 Ti and is still holding up to Intel gen 14.


This is what I thought too, honestly still got 12 months warranty on my current 10900k/Hero setup.

I really like the look of the MSI 4090 Suprim Liquid X, can pre-order one too..


----------



## HiLuckyB

lordkahless said:


> I think the article on Videocardz was saying that some people got a popup notification from GFE and some got an email. Didn't seem to be any reason behind it. But the statement from Nvidia regarding this program was saying it was experimental and going to be very limited. Might have already ended.


Seems like it was pretty random. I was already trying to get a Founders 4090, so I finally got some good luck this time around. I did sign up to be notified about the 4090 on Nvidia's website. Not sure if that helped or not.


----------



## mirkendargen

https://www.aliexpress.us/item/3256804656264743.html Looks like a Strix/TUF block is available.


----------



## yzonker

bmagnien said:


> You still there and wanna pick me one up?


I actually bought one for me and a friend here in town. I had to get a manager to approve buying 2; don't think they would have gone for 3. Lol! Still 5 left when I headed out (KC). A bunch of Gigabytes too.


----------



## inedenimadam

has anybody been able to find a 4x8 adapter out in the wild? I want to try it with the gaming x trio, which only came with a 3x8


----------



## lordkahless

inedenimadam said:


> has anybody been able to find a 4x8 adapter out in the wild? I want to try it with the gaming x trio, which only came with a 3x8


I just got a custom CableMod 4x8-to-16-pin cable delivered today from FedEx. Took 2 weeks to get it. Will get to try it someday when I can actually get a 4090 FE.


----------



## mirkendargen

inedenimadam said:


> has anybody been able to find a 4x8 adapter out in the wild? I want to try it with the gaming x trio, which only came with a 3x8











MSI RTX 4090 VBIOS – 24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory (www.techpowerup.com)





Judging by the Suprim X BIOS, it isn't gonna do a whole lot for you. It might even already be wired for 600W; you can check with a multimeter.


----------



## mattskiiau

lordkahless said:


> custom cablemod 4x8


Do you have a link to it? Going to see if I can source it in my country


----------



## Antsu

RaMsiTo said:


> yes , 600w.


Thanks for the report dude, this is the stuff I expect to find on OCN, not people circlejerking about how it only uses 400W (in a light load with half the card not utilized and stock vcore).

E: now if someone can run Quake II RTX at high resolution on a 600W card, we can actually find out if it's enough for every scenario. (Excluding voltmods; I think we will need to shunt mod if going to 1.2V+, but I don't think that will be a problem if you're already voltmodding.)


----------



## morph.

inedenimadam said:


> has anybody been able to find a 4x8 adapter out in the wild? I want to try it with the gaming x trio, which only came with a 3x8


cablemod


----------



## Arizor

Can easily overclock this TUF to 3GHz and +1500 on the mem with no stability issues in game. Somehow 1st in 3DMark Speed Way for 5900x owners; imagine that'll change quickly 😂


----------



## lordkahless

mattskiiau said:


> Do you have a link to it? Going to see if I can source it in my country


You're only going to be able to order it from Cablemod





Configurator – CableMod Global Store (store.cablemod.com)





Go to Custom PSU Cables -> Continue to next step -> Select your brand, then model -> Select the type of cable you want (ModFlex or ModMesh) -> Select Pro -> Continue to next step -> Go to add a cable and you can select 16-pin 3x8 or 4x8. Then you can continue to customize it with length, colors, combs, etc. It is very expensive; mine was around $170 USD to get it within 2 weeks.


----------



## lordkahless

lordkahless said:


> You're only going to be able to order it from Cablemod
> 
> 
> 
> 
> 
> Configurator – CableMod Global Store (store.cablemod.com)
> 
> 
> 
> 
> 
> Go to Custom PSU Cables-> Continue to next step-> Select your brand/then model-> Then select the type of cable you want Modflex or Modmesh->Select Pro->continue to next step->Then go to add a cable and you can select 16 pin 3x8 or 4x8. Then you can continue to customize it with a length, colors, combs, etc. It is very expensive. Mine was around $170 USD to get it within 2 weeks.


They also have pre-made kits too if you don't want custom. They do have a minimum order charge, so you would need to buy more stuff than just this. Like $100 I think.





12VHPWR PCI-e Cable – CableMod Global Store (store.cablemod.com)


----------



## schoolofmonkey

Seems the MSI RTX 4090 Suprim and Liquid come with a 4x8-pin cable.
I just preordered the Liquid version.








MSI GeForce RTX 4090 Suprim X Review – "The MSI GeForce RTX 4090 Suprim X is the quietest RTX 4090 custom design that we've tested. It runs at an unbelievable 28.6 dBA in our review, under full load, 4K, 60 FPS with ray tracing enabled, and temperatures are still really good." (www.techpowerup.com)


----------



## bmagnien

Antsu said:


> Thanks for the report dude, this is the stuff I expect to find on OCN, not people circlejerking how it only uses 400W (in a light load with half the card not utilized and stock vcore)
> 
> E: now if someone can run Quake II RTX with high resolution on a 600W card, we can actually find out if it's enough for every scenario. (Excluding voltmods, I think we will need to shuntmod if going to 1.2V+, but I don't think it will be a problem if you're already voltmodding.)


This isn’t even a 4090 though… didn’t even have to look at the title, just look at the frequency.


----------



## RaMsiTo

bmagnien said:


> This isn’t even a 4090 though…didn’t even have to look at the title just look at the frequency


In a few hours I'll get the Suprim X and I'll try Metro Exodus, but the fact that it only has 520W until it can be flashed has me worried.


----------



## inedenimadam

morph. said:


> cablemod


I must be dense, I have gone to the CableMod website twice today looking for one, and I can't find it. Would you help a guy out here?

edit: Nevermind. I have reading comprehension issues after 6 pm . 


12VHPWR PCI-e Cable – CableMod Global Store


----------



## Antsu

bmagnien said:


> This isn’t even a 4090 though…didn’t even have to look at the title just look at the frequency


My bad, on mobile at work and the title cuts off, also was just watching the power/voltage, didn't even take a look at the clocks. 

Still good data to compare against tho.


----------



## lordkahless

Somebody in the Videocardz forum said the GFE link you get is reusable. Can order 50: 1 for yourself and 49 for your closest eBay friends. Sounds about right


----------



## bmagnien

lordkahless said:


> Somebody in the Videocardz forum said the GFE link you get is reusable. Can order 50, 1 for yourself and 49 for eBay. Sounds about right


I saw a guy on Craigslist with two identical orders from the GFE link, $2600 each lol


----------



## lordkahless

bmagnien said:


> I saw a guy on Craigslist with two identical orders from the GFE link, $2600 each lol


People are disgusting. I would never do something like that. Making money off others' misery. I guess it's how the world operates.


----------



## sirneb

lordkahless said:


> Videocardz


That's a new controversy altogether... sucks for those of us who want an FE. Nvidia can chalk it up to their testing having failed.


----------



## sakete

lordkahless said:


> People are disgusting. I would never do something like that. Making money off others misery. I guess its how the world operates.


The world has always been that way, and it always will be. There always were and always will be humans willing to exploit others, or profit at the expense of others. You could say it's in our nature, though many of us manage to counter such savage instincts.


----------



## Blameless

DarkRadeon7000 said:


> You mean the temperature of the GPU right? I can't measure the temp of that connector


I mean the connector itself. Loop Time Spy Extreme graphics test #2, or something else demanding, for an hour, then point an IR thermometer at it, or put your finger on it.

Reported GPU temp isn't going to change even if the connector is about to burst into flames.



slayer6288 said:


> Just like any power cable bending in a curvature that does not kink the cables is perfectly fine. If you kinked the cables that would be an issue. Bending with no kinks is not an issue and whoever said it was doesn't understand electricity and wiring.


The potential issue wasn't about the wires themselves, but the mating of the connectors. It should be a non-issue, but bad samples exist and some people put fairly extreme tension on things they shouldn't.



ZealotKi11er said:


> Even Furmark does not hit 600W with 1.1v


What settings? FurMark is highly tunable.



sakete said:


> though many of us manage to counter such savage instincts.


Some people manage to fool themselves about it, but most put even trivial desires over the very existence of every stranger on the planet.

No one needs an RTX 4090 and complaining about people profiteering off launch-day demand for a purely optional luxury good, when the systems that are required for such a product to even exist enslave large portions of the world and poison everyone, is a rather bitter irony.


----------



## 8472

Anyone who ordered from Newegg have their order ship yet? I don't mean just receiving an email saying that it shipped; I mean UPS or FedEx actually picking it up and tracking its progress from city to city.


----------



## Dhalmel

Anyone lucky enough to get the Gigabyte Waterforce card? It's really the only AIB card I'm interested in over the founder's edition.


----------



## AdamK47

I have an MSI 4090 Gaming Trio. The power slider goes to a measly 106%. The card came with a 3-plug adapter. Should I flash the Suprim X BIOS to it? 520W is all I want.

A 520W BIOS is what I used on my 3090 Ti. It had a 3-plug adapter similar to the new 12V connector.

Edit: The Oct. 13th version of nvflash posted on TechPowerUp doesn't even see the card.


----------



## LunaP

Blameless said:


> No one needs an RTX 4090


Curious why some people keep convincing themselves of this. Sure, if you don't need it you don't need it, but some do, and people are mostly here because they WANT it, so it's kind of a moot point to tell people they don't need it just because their usage scenarios don't line up with yours. Stating it like your opinion is fact feels a bit weird. Rants/complaints aside, granted we're all here for the same thing, so of course people are going to vent to others with similar interests.

Not saying that's what u were meaning, just seeing that phrasing here and there, like people have to constantly reinforce their own beliefs lol.


----------



## cletus-cassidy

8472 said:


> Anyone who ordered from Newegg have their order ship yet? I don't mean just receiving an email saying that it shipped, I mean UPS or FedEx actually picking it up and tracking its progress from city to city.


Yes, mine shipped from Newegg Ontario California today at 9:41am ET.


----------



## inedenimadam

AdamK47 said:


> I have an MSI 4090 Gaming Trio. Power slider goes to a measly 106%. Card came with a 3 plug connector. Should I flash the Suprim X BIOS to it? 520W is all I want.
> 
> A 520W BIOS is what I used on my 3090 Ti. Had a similar 3 plug connector to the new 12V connector.
> 
> Edit: The Oct. 13th version of nvflash posted on TechPowerUp doesn't even see the card.
> View attachment 2575908


these were my exact results with the same card. Admittedly, the card is BARELY touching Pwr for perf cap, and like 95% VRel, but I definitely want to BIOS flash or shunt mod or whatever else I can do to remove that limitation.


----------



## RageOfFury

Gigabyte Gaming OC (aka Big Chungus) owner here. Hello! 🙂


----------



## Xavier233

Can you please name your 4090 model and whether you have any coil whine (I'm trying to pick a card with the least coil whine).


----------



## Arizor

Xavier233 said:


> Can you please name your 4090 model and if you have any coil whine (am trying to pick a card which has the least coil whine).


ASUS TUF, zero coil whine.


----------



## Baasha

Can someone please explain why having 4x 3090 Ti doesn't get me any link to buy a 4090 FE through GFE? 

I see people buying them with the link left and right but I'm sitting here trying to refresh GFE on all 3 of my rigs to no avail.

WTAF?


----------



## Equinox654

AdamK47 said:


> I have an MSI 4090 Gaming Trio. Power slider goes to a measly 106%. Card came with a 3 plug connector. Should I flash the Suprim X BIOS to it? 520W is all I want.
> 
> A 520W BIOS is what I used on my 3090 Ti. Had a similar 3 plug connector to the new 12V connector.
> 
> Edit: The Oct. 13th version of nvflash posted on TechPowerUp doesn't even see the card.
> View attachment 2575908


I hope so; the card is pretty solid otherwise. I grabbed it quick at Microcenter as the case was getting pretty thin and there were several dudes eyeballing the 4090s.
Had pretty good luck with MSI with a 970 & 1070 and didn't want to pay $1799.

Here's a 3DMark run with the power limit maxed and +200 Core / +1200 Memory:

















So not too shabby.
I have 16 ga wires on my 8-pin cables; hopefully I can just short a pin and flash a BIOS.
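On the wire-gauge point: a back-of-envelope way to see why gauge and pin count matter is to split the board power evenly across the +12V pins. This is an idealized estimate, not a spec measurement; the pin counts used (6 live pins for 12VHPWR, 3 per 8-pin PCIe) are the nominal connector layouts:

```python
# Idealized per-pin current: board power split evenly across the +12V pins.
# Pin counts are the nominal connector layouts, not measured values.

def amps_per_pin(watts, live_pins, volts=12.0):
    """Current per +12V pin assuming a perfectly even split."""
    return watts / volts / live_pins

# 12VHPWR at its 600 W rating: 6 +12V pins
print(round(amps_per_pin(600, 6), 2))  # 8.33 A per pin
# One 8-pin PCIe at its 150 W spec: 3 +12V pins
print(round(amps_per_pin(150, 3), 2))  # 4.17 A per pin
```

In reality the split is never perfectly even, which is exactly why a poorly seated connector or thin wire on one pin can run hot.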


----------



## LunaP

Baasha said:


> Can someone please explain why having 4x 3090 Ti doesn't get me any link to buy a 4090 FE through GFE?
> 
> I see people buying them with the link left and right but I'm sitting here trying to refresh GFE on all 3 of my rigs to no avail.
> 
> WTAF?


Did u enroll on the website, I just noticed the option there after linking my email account.


----------



## MIST3RST33Z3

Xavier233 said:


> Can you please name your 4090 model and if you have any coil whine (am trying to pick a card which has the least coil whine).


Zotac AMP Extreme, zero coil whine. Thank god, I hate that noise lol. My kingpin 3090 had awful coil whine at 450w.


----------



## Xavier233

MIST3RST33Z3 said:


> Zotac AMP Extreme, zero coil whine. Thank god, I hate that noise lol. My kingpin 3090 had awful coil whine at 450w.


How are Zotac cards compared to Gigabyte or MSI? How is their warranty?


----------



## Xavier233

Arizor said:


> ASUS TUF, zero coil whine.


Nice. Even at high FPS (400+)?


----------



## Hanks552

I wonder if anybody here has tried using a 4x8-pin adapter on a 3x8-pin card, like an MSI Trio. I wonder if it's going to unlock more power without a BIOS mod.


----------



## MIST3RST33Z3

Equinox654 said:


> I hope so, the card is pretty solid otherwise. I grabbed it quick at microcenter as the case was getting pretty thin and there were several dudes eyeballing the 4090s.
> Had pretty good luck with MSI with a 970 & 1070 and didn't want to pay $1799.
> 
> Here a 3DMark run at with power limit maxed and +200 Core / +1200 Memory:
> View attachment 2575911
> 
> View attachment 2575912
> 
> 
> So not too shabby.
> I have 16ga wires on my 8pin cables, hopefully I can just short a pin and flash a bios.


+200 is awesome!! You got a great card.


----------



## Arizor

Xavier233 said:


> Nice. Even at high FPS (400+)?


Haha I'm not trying to hit 400 in any game I actually play, but yeah in Heaven benchmark it goes into extremely high FPS and no whine.


----------



## Baasha

LunaP said:


> Did u enroll on the website, I just noticed the option there after linking my email account.


Enroll on which website? I've had GFE installed on all 3 of my rigs for years (I use ShadowPlay quite often).


----------



## Arizor

MIST3RST33Z3 said:


> +200 is awesome!! You got a great card.


If you haven't got one yet, I'd say 4090s seem to be very solid overclockers. Mine sits at +220 and +1500 on the memory without problems, and I think so do most of the folks I've seen so far. Pretty great.


----------



## sirneb

Baasha said:


> Can someone please explain why having 4x 3090 Ti doesn't get me any link to buy a 4090 FE through GFE?
> 
> I see people buying them with the link left and right but I'm sitting here trying to refresh GFE on all 3 of my rigs to no avail.
> 
> WTAF?


where are you seeing people buying them? seems pretty rare to me


----------



## MIST3RST33Z3

Xavier233 said:


> How is Zotac cards compared to Gigabyte or MSI? How is their warranty?


For the US, it’s only 2+1 years, which is unfortunate. I’ve not had any experiences with them before though, so no idea of the quality of support. Gigabyte has a 4 year warranty, and I never had issues with their support when I had to RMA my aorus 2080. I’ve never had an MSI card, so no idea how their warranty is.


----------



## MIST3RST33Z3

Arizor said:


> If you haven't got one yet, I'd say 4090s seems to be very solid overclockers. Mine sits at +220 and +1500 on the memory without problems, and I think so does most folks I've seen so far. Pretty great.


I do have one, haven’t messed with it too much. In my little testing, I couldn’t get above +185 lol


----------



## Xavier233

Arizor said:


> Haha I'm not trying to hit 400 in any game I actually play, but yeah in Heaven benchmark it goes into extremely high FPS and no whine.


Very good. So if I get a TUF that coil whines, I know I could get another that does not


----------



## Xavier233

MIST3RST33Z3 said:


> For the US, it’s only 2+1 years, which is unfortunate. I’ve not had any experiences with them before though, so no idea of the quality of support. Gigabyte has a 4 year warranty, and I never had issues with their support when I had to RMA my aorus 2080. I’ve never had an MSI card, so no idea how their warranty is.


3 years is plenty; TUF cards are 3 years. I'd rather have no coil whine than more warranty TBH.


----------



## LunaP

Baasha said:


> Enroll on which website? I've had GFE installed on all 3 of my rigs for years (I use ShadowPlay quite often).


On the website itself ( Nvidia's )


----------



## MIST3RST33Z3

Arizor said:


> If you haven't got one yet, I'd say 4090s seems to be very solid overclockers. Mine sits at +220 and +1500 on the memory without problems, and I think so does most folks I've seen so far. Pretty great.


Curious, what’re your clocks with +220?


----------



## Jordyn

Gigabyte Gaming OC here, no coil whine that I can tell.

Was just playing around with +200 & +1500 mem in Metro Exodus. Was pulling around 580W while sitting at just under 3000MHz, with a max temp of 75C in my closed case. Pretty impressive.


----------



## DokoBG

Can't wait to get my hands on the MSI Gaming Trio. Funny thing is these 600W BIOSes don't even clock higher than the rest or do better on benches... they just heat up more.


----------



## Shaded War

Xavier233 said:


> How is Zotac cards compared to Gigabyte or MSI? How is their warranty?


I got "stuck" with a Zotac Amp back when the 1080 series launched since it was the last in-stock option. I changed the thermal paste to Kryonaut and saw a decent temp drop, something like 4-5C. The build quality seemed decent and it was easy to disassemble. I never had to use their warranty service and the card is still being gamed on today.

Other than their styling not being my favorite, I don't see any reason not to buy their 4090. I'd take it over anything from Gigabyte for sure. I personally bought the 4090 Gaming Trio this round since I've never tried anything from MSI yet and the cooler looks decent.


----------



## Jordyn

DokoBG said:


> Cant wait to get my hands on the MSI Gaming Trio. Funny thing is these 600W bioses dont even clock higher than the rest or do better on benches... They just heat up more


Yeah, seems to be the case. Even the liquid-cooled cards don't seem to be clocking much higher.


----------



## Arizor

MIST3RST33Z3 said:


> Curious, what’re your clocks with +220?


3ghz (just over).


----------



## Shaded War

Has anyone tried 3x 8-pin on the adapter to see if it's a bottleneck? Wondering if you can still hit high clocks with the locked 450W power limit.

Ordered the MSI Gaming Trio and it only has a 3x 8-pin adapter. Sitting here wondering if I need to upgrade my 2-year-old PSU because of the cord. Seasonic wasn't willing to send me a replacement this time like they did with the 3090 12-pin cord.


----------



## MIST3RST33Z3

Arizor said:


> 3ghz (just over).


These cards behave so strangely lol. With only +125, I’m regularly boosting to 3050.


----------



## Antsu

Arizor said:


> 3ghz (just over).


Any chance you could do the community a huge favor and run Quake II RTX with max voltage and power limit to see how high the power usage peaks? The demo is free on Steam, and you only need to start a new game and test for a few seconds with 200% render resolution. The reason I keep asking about this is that I've never seen a game use more power, but I've seen some of the worst-case games get pretty close.


----------



## Hulk1988

zhrooms said:


> Work-In-Progress


Possible to get a new Thread Owner? 

Will something happen soon? No information on the front page 

I hope everything is fine with the Thread Owner.


----------



## Arizor

Antsu said:


> Any chance you could do the community a huge favor and run Quake II RTX with max voltage and powerlimit to see how high the power usage peaks? The demo is free on Steam, and you only need to start a new game and test for some seconds with 200% render resolution. The reason I keep asking about this is that I've never seen a game use more power, but I've seen some of the worst case games get pretty close.


No worries, will do now and report back.


----------



## Arizor

Antsu said:


> Any chance you could do the community a huge favor and run Quake II RTX with max voltage and powerlimit to see how high the power usage peaks? The demo is free on Steam, and you only need to start a new game and test for some seconds with 200% render resolution. The reason I keep asking about this is that I've never seen a game use more power, but I've seen some of the worst case games get pretty close.


Power usage peaked at 550W, voltage at 1.1


----------



## Jordyn

Antsu said:


> Any chance you could do the community a huge favor and run Quake II RTX with max voltage and powerlimit to see how high the power usage peaks? The demo is free on Steam, and you only need to start a new game and test for some seconds with 200% render resolution. The reason I keep asking about this is that I've never seen a game use more power, but I've seen some of the worst case games get pretty close.


Bouncing around between 550w and 570ishw here. Max 578w.


----------



## Antsu

Thanks a ton guys! Looks to be pretty much what I thought it would be based on consumption under other loads. Very impressive that it can hold 1.1V in that scenario. Good news for 99.9% of people; voltmodders might still want to shunt mod.
BONUS: I bet you could make it tip over if you turned on DSR (the old version, bottom of the list in NVCP) AND 200% render resolution, but that is just for fun; games will never peak that high.
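For a rough sense of how close those peaks come to the limit: against a 600W cap, the 578W worst case reported in this thread leaves only a few percent of headroom. A trivial check using those reported numbers:

```python
# Headroom left at a 600 W power limit given the 578 W peak reported above.
limit_w, peak_w = 600, 578
headroom_pct = (limit_w - peak_w) / limit_w * 100
print(round(headroom_pct, 1))  # 3.7
```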


----------



## yt93900

They all seem to hit a wall between 3000 and 3100MHz core on ambient. My WF Xtreme only does ~3030MHz reliably.


----------



## Jordyn

yt93900 said:


> They all seem to hit a wall between 3000 and 3100MHz core on ambient. My WF Xtreme only does ~3030MHz reliably.


Was just going to post the same. Something is holding them back, as they don't really seem to be temp limited even on ambient.


----------



## Blameless

Antsu said:


> Any chance you could do the community a huge favor and run Quake II RTX with max voltage and powerlimit to see how high the power usage peaks? The demo is free on Steam, and you only need to start a new game and test for some seconds with 200% render resolution. The reason I keep asking about this is that I've never seen a game use more power, but I've seen some of the worst case games get pretty close.


Quake II RTX is up there. So is _Path of Exile_, of all things, in DX11 with global illumination enabled.


----------



## tubs2x4

Jordyn said:


> Was just going to post the same. Sometihing is holding them back as they dont really seem to be temp limited even on ambient.


Voltage


----------



## Antsu

tubs2x4 said:


> Voltage


I don't know man, the scaling has been very meh after ~1V for years now.
It's quite easy to test if someone is bored: just see the max clock difference from 1.0V -> 1.068V and you probably get a rough idea how it would do with more voltage. 1.068V specifically because after that the effective clock takes a dive without also changing NVVDD/MSVDD, at least on the 3090. This is something to test on a 4090 too: how close is the effective clock to the reported clock at different voltages.

Just ran a quick test on my 3090: in a certain scenario I need 1006mV for 2115MHz, and going to 1068mV allows for 2175MHz. That's less than 3%, with 28.8C and 31.0C peak temps and 50W more power. Now personally I am all for that, but it's not like it actually makes a huge difference. I don't think the new node is any different, but I hope I'm wrong.


----------



## kx11

here's the 4090 struggling with RDR2 lol


----------



## Tideman

Managed to snag an ASUS 4090 TUF. Hope I'll be happy with it.


----------



## mattskiiau

Anyone try a 4 x 8-pin with a Gaming X Trio yet? Any changes in results?


----------



## ArcticZero

Since it's now dawning on me that the Strix won't fit my case with my res in there while I wait for a block, I'm considering getting an MSI Suprim X instead (the air-cooled one, since I want to add it to my loop), though I have no idea what the waterblock options for that are at the moment.


----------



## derthballs

The Zotac 485W BIOS is a bit of an annoyance (it's a 4x PCIe adapter, so I can't see any reason it can't do 600W).

Guess we wait to see whether it can be flashed to a 600W BIOS. My card is boosting to 2910 with a small overclock, but it would be nice to see the 3000 that others are getting with other cards.


----------



## mattskiiau

derthballs said:


> The zotak 485w bios is a bit of an annoyance (as its a 4 x pci-e extension so i cant see any reason it cant do 600w)
> 
> Guess we wait to see whether they can be flashed to a 600w bios. My card is boosting to 2910 with a small overclock but it would be nice to see the 3000 that others are getting with other cards.


What's the PerfCap reason?
I'm getting 3050mhz with a measly +135Mhz on the Gaming X trio and that's using a 3x pin. I'm voltage limited after that point.


----------



## derthballs

mattskiiau said:


> What's the PerfCap reason?
> I'm getting 3050mhz with a measly +135Mhz on the Gaming X trio and that's using a 3x pin. I'm voltage limited after that point.


It's only got a 485W BIOS on it.


----------



## MIST3RST33Z3

derthballs said:


> The zotak 485w bios is a bit of an annoyance (as its a 4 x pci-e extension so i cant see any reason it cant do 600w)
> 
> Guess we wait to see whether they can be flashed to a 600w bios. My card is boosting to 2910 with a small overclock but it would be nice to see the 3000 that others are getting with other cards.


I have a Zotac and even with the 495w power limit, I’m regularly hitting 3060. I think it has even more room once it can be flashed.


----------



## Glottis

Honestly, all cards should receive official 600W BIOS updates from their respective AIBs. It's ridiculous that people pay more than FE price and get less for their money. If an AIB card costs the same as the FE or more, it should have feature parity with the FE, or even more features.


----------



## inedenimadam

mattskiiau said:


> Anyone try a 4 x 8-pin with a Gaming X Trio yet? Any changes in results?


Ordered a 4x8 from CableMod last night, but it might be a minute before I get to test. I'm MOST interested in this too, AND we need a patch for nvflash.


----------



## Bilco

Why are people posting their memory and core offsets? Aren't the base clocks variable from card to card? Shouldn't people be posting what their maximum stable frequency is instead?


----------



## RaMsiTo

(3dmark.com result link — result not found)


----------



## 8472

My card is out for delivery. Not sure why UPS took so long to add the information to their system.


----------



## Equinox654

Bilco said:


> Why are people posting their memory and core offsets? Aren't the base clocks variable from card to card? Shouldn't people be posting what their maximum stable frequency is instead?


My post had an image of a timespy run with the average core speed and memory speed shown.


----------



## RaMsiTo

That position will not last long, but I'll enjoy it while it lasts 

I scored 29 509 in Port Royal

Intel Core i9-9900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11

www.3dmark.com


----------



## bmagnien

RaMsiTo said:


> That position will not last long but while I enjoy
> 
> I scored 29 509 in Port Royal
> 
> Intel Core i9-9900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> www.3dmark.com
> 
> View attachment 2575962


how the hell are you beating out TecLab who's on LN2 with a 12900ks vs your ambient + 9900k?!


----------



## EastCoast

Watch it and learn


----------



## morph.

Just a quick and dirty run after I drained and refilled the loop, whipping up a quick GPU bridge until the water block comes. She tucked in snugly, clearing the D5 pump, but the power cables will probably protrude past the side panel until a 90° adapter is made.


----------



## Xavier233

Those with the MSI 4090 cards, can you please let us know if they have any coil whine? Thank you


----------



## smushroomed

Do 9900k users need to worry about upgrading?


----------



## RaMsiTo

bmagnien said:


> how the hell are you beating out TecLab who's on LN2 with a 12900ks vs your ambient + 9900k?!


drivers? ambient 26º celsius.


----------



## HeadlessKnight

Looks like Asus and Gigabyte cards are the right choices this time around; from what I've seen, all their 4090s have the 600w BIOS. All other vendors, especially MSI and Zotac, have just released a bunch of low-PL cards.


----------



## QuatroKiller

Have an 8700K and an 850W Gold PSU, probably not a good idea to get a 4090 until I'm ready to drop serious coin on an entirely new build?


----------



## HeadlessKnight

QuatroKiller said:


> Have a 8700K and a GOLD 850PSU, probably not a good idea to get a 4090 until im ready to drop serious coin on an entirely new build?


Your CPU should be fine for the most part if it is overclocked and memory tweaked at 4k resolution, but at 1440p high refresh it will bottleneck unfortunately. Not sure about your PSU since you didn't mention which model it is.


----------



## MIST3RST33Z3

Tideman said:


> Managed to snag an ASUS 4090 TUF. Hope I'll be happy with it.


You could have been stuck with a Zotac lol, the TUF is an awesome card.


----------



## yzonker

bmagnien said:


> how the hell are you beating out TecLab who's on LN2 with a 12900ks vs your ambient + 9900k?!


Yea that's a smokin' run.


----------



## FarmerJo

HeadlessKnight said:


> Looks like Asus and Gigabyte cards are the right choices this time around, from what've seen all their 4090s have the 600w BIOS. All others vendors especially MSI and Zotac have just released a bunch of low PL cards.


I've seen from a few sources that the tuf is only a 505w bios. Can anyone confirm?


----------



## Alelau18

FarmerJo said:


> I've seen from a few sources that the tuf is only a 505w bios. Can anyone confirm?


TUF is 600W vBIOS, I just got mine


----------



## katates

Hey guys, currently I have a Corsair RM750. I've bought an MSI Gaming X Trio 4090, and I have a Ryzen 5800X and 4x8GB 3600MHz RAM. All of my USB ports are full, plus 9 fans.

I don't think the Corsair RM750 is enough, but I don't know which PSU to buy.

I was thinking of the FSP Hydro G PRO 850W, which is very affordable.

But I'm not sure if that will be enough either. I don't think I will upgrade anything in 2-3 years, except maybe going for the 5800X3D, so I'm a bit reluctant to future-proof with the PSU. Would this be enough for my current PC?

Or should I just go for the FSP Hydro G PRO 1000W? Or future-proof with the FSP Hydro PTM Pro 1200W?


----------



## lordkahless

Isn't there a TUF and a TUF OC? Not sure if there is a difference in vBIOS.


----------



## HeadlessKnight

FarmerJo said:


> I've seen from a few sources that the tuf is only a 505w bios. Can anyone confirm?











ASUS TUF GeForce RTX 4090 Gaming review


In this review the turn goes to ASUS; they submitted that factory-tweaked and impressively cooled TUF Gaming OC edition of the GeForce RTX 4090. Loaded with a faster clock frequency, dual BIOS, and p... Overclocking




www.guru3d.com





Power limiter is up to 133%, which means the card should hit 600 watts if the situation demanded that. Unless it has a stupid engineering design like the infamous rev0.1 EVGA FTW3 3080 10GB and 3090.


----------



## MIST3RST33Z3

FarmerJo said:


> I've seen from a few sources that the tuf is only a 505w bios. Can anyone confirm?


It’s a 600w, it just ends up being voltage limited around 505w.


----------



## Alelau18

3GHz sounds very feasible really


----------



## QuatroKiller

HeadlessKnight said:


> Your CPU should be fine for the most part if it is overclocked and memory tweaked at 4k resolution, but at 1440p high refresh it will bottleneck unfortunately. Not sure about your PSU since you didn't mention which model it is.


EVGA 850Gold


----------



## Tideman

MIST3RST33Z3 said:


> You could have been stuck with a Zotac lol, the TUF is an awesome card.


Good to hear. Actually, the only other two card models in the site's restock were Zotacs, and they were snapped up before the TUF.

I'm glad I had read about the Zotacs being more power limited, or I might have grabbed one based on its looks.


----------



## gamerMwM

MIST3RST33Z3 said:


> It’s a 600w, it just ends up being voltage limited around 505w.


The Asus TUF and TUF OC both come with the 4x 8-pin adapter, right?

Do we have a list of which cards only come with a 3x 8-pin adapter?

Sent from my SM-G975U using Tapatalk


----------



## Tadaschi

Shaded War said:


> Has anyone tried 3x 8-pin on the adapter to see if it's a bottleneck? Wondering if you are still able to hit high clocks with the locked 450w power limit.
> 
> Ordered the MSI Gaming Trio and it only has 3x 8-pin adapter. Sitting here wondering if I need to upgrade my 2 year old PSU because of the cord. Seasonic wasn't willing to send me a replacement this time like they did with the 3090 12-pin cord.


I am still waiting for my Trio, coming next week.
But after looking everywhere for a 4x 8-pin adapter from Nvidia, I discovered that the aftermarket 3x 8-pin cables don't have the IC chip to detect the number of cables connected, since they have both sense pins connected to ground (which makes the GPU release its full power). But nobody has uploaded the BIOS anywhere to check it, or tested with another cable yet. I will do it as soon as I get my 4090 Trio.


----------



## WayWayUp

bmagnien said:


> how the hell are you beating out TecLab who's on LN2 with a 12900ks vs your ambient + 9900k?!


the score is a little sus but he does have a high memory overclock
it still doesn't explain the disparity between 2nd and 3rd tho, but I would like to hear about this RAM latency 

if these cards are scaling with memory clocks I will wait it out for the 4090ti


----------



## Glottis

MIST3RST33Z3 said:


> It’s a 600w, it just ends up being voltage limited around 505w.


Pretty much. Only a handful of applications can force 4090 go above 500W (Quake RTX and Metro Exodus, afaik), and we don't even have any reliable data yet if something like Quake RTX running at 550W has any tangible performance benefits compared to stock 450W.


----------



## QuatroKiller

EVGA 850W Gold good enough? Really don't want to go for an end-to-end new build.

8700K at 4.9GHz, 32GB DDR4 (15/16/16/35), ASRock Z370 Taichi, O11 Dynamic (OG version). Thanks! (Running a 2080 Ti right now and it's still a good card, but god, the 4090 is what, 2-3x better?!?!)


----------



## Baasha

LunaP said:


> On the website itself ( Nvidia's )
> View attachment 2575914


Hmm... I hadn't done that before. I just did it and only got the "one month complimentary Adobe CC membership" as a reward. Nothing about the GPU...


----------



## QuatroKiller

Baasha said:


> Hmm.. I hadn't done that before. I just did it only got the "one month complimentary Adobe CC membershiop" as a reward. Nothing about the GPU...


Same, just 1 month CC


----------



## bmagnien

QuatroKiller said:


> Same, just 1 month CC


I put my 1 month of Adobe CC up on eBay for $3600


----------



## nycgtr

Alelau18 said:


> TUF is 600W vBIOS, I just got mine


Pulls over 600 in furmark from my testing


lordkahless said:


> Isn't there a TUF and a TUF OC? Not sure if there is a difference in vBIOS.


No difference. I have both. One is just priced higher to profit off those who don't know better, and the other to sit at FE pricing.


MIST3RST33Z3 said:


> It’s a 600w, it just ends up being voltage limited around 505w.


----------



## Glottis

QuatroKiller said:


> EVGA 850Gold good enough? Really dont want to go for an end to end new build.
> 
> 8700k at 4.9ghz, 32 GB DDR4 (15/16/16/35), ArRock Z370 Taichi, O11 Dynamic (OG version), THANKS! (running a 2080TI right now and its still a good card but god the 4090 is what 2-3X better?!?!)


PSU may be OK if you don't overclock, but 4090 will be completely and utterly bottlenecked by your CPU.


----------



## heptilion

morph. said:


> Just a quick and dirty run after I drained and refilled the loop while whipping up a quick GPU bridge until the water block comes. She just tucked in snuggly clearing the D5 pump but the power cables will probably extrude the side panel until a 90° adapter is made.
> 
> View attachment 2575969
> 
> 
> 
> View attachment 2575968


what programme are you using to change colours on the 4090?


----------



## WayWayUp

to the non launch day buyers,
where are you getting your cards?


----------



## Xavier233

katates said:


> Hey guys currently,i have Corsair RM750. I've bought MSI Gaming x trio 4090. And I have Ryzen 5800x and 4x8GB 3600Mhz ram. All of USB ports are full, 9 fans.
> 
> I think Corsair RM750 is not enough. But I don't know what PSU to buy.
> 
> I was thinking FSP Hydro G PRO 850W. Which is very affordable.
> 
> But I am not sure if this will be enough too. I don't think I will upgrade anything in 2-3 years just maybe go for the 5800X3D. So I'm a bit reluctant to future-proof with the psu. Would this be enough for my current PC?
> 
> Or should I just go for this? FSP Hydro G PRO 1000W. Or future proof with this FSP Hydro PTM Pro 1200W?


850W is not enough IMO, especially if you have an OCed CPU like the 12900K. I pulled close to 800 watts on a 3090 Ti + OCed 12900K. And since you are buying a new PSU anyhow, I would get a 1200-1500 watt PSU.
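As a rough sanity check of that recommendation, a back-of-envelope budget lands in the same range. Every component figure below is my own assumption, not a measurement:

```python
# Hypothetical worst-case power budget for a 4090 build; all numbers
# here are assumptions for illustration, not measured draws.
loads_w = {
    "RTX 4090 (transient spikes above its 450w TBP)": 600,
    "OCed CPU under load": 250,
    "Motherboard, RAM, SSDs": 60,
    "Fans, pumps, USB devices": 50,
}
total = sum(loads_w.values())  # 960 W worst case
headroom = 1.25                # ~25% margin keeps the PSU off its redline
print(f"Suggested PSU: {total * headroom:.0f} W or more")  # -> 1200 W or more
```

With those assumptions the math points at roughly 1200W, which is why the 1000W unit is borderline and the 1200W unit is the safer buy.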


----------



## motivman

Xavier233 said:


> Those with the MSI 4090 cards, can you please let us know if they have any coil whine? Thank you


I have gaming trio, zero coil whine.


----------



## Xavier233

QuatroKiller said:


> EVGA 850Gold good enough? Really dont want to go for an end to end new build.
> 
> 8700k at 4.9ghz, 32 GB DDR4 (15/16/16/35), ArRock Z370 Taichi, O11 Dynamic (OG version), THANKS! (running a 2080TI right now and its still a good card but god the 4090 is what 2-3X better?!?!)


That 8700K is gonna bottleneck the 4090 big time, even at 4K! 

I heard even the 12900K can bottleneck a 4090


----------



## QuatroKiller

Glottis said:


> PSU may be OK if you don't overclock, but 4090 will be completely and utterly bottlenecked by your CPU.


guess i wait a few more years then and just do an entire new build  because new CPU means new MB and new Memory and boom $3000.


----------



## Xavier233

motivman said:


> I have gaming trio, zero coil whine.


Very nice. Seems not many people are having coil whine this time around, which is very good news


----------



## jim2point0

Unlaunching The 12GB 4080


16GB GeForce RTX 4080 on track to delight gamers everywhere November 16th.



www.nvidia.com





Holy **** I can't believe this is real, lol


----------



## lordkahless

jim2point0 said:


> Unlaunching The 12GB 4080
> 
> 
> 16GB GeForce RTX 4080 on track to delight gamers everywhere November 16th.
> 
> 
> 
> www.nvidia.com
> 
> 
> 
> 
> 
> Holy **** I can't believe this is real, lol


I had to check if it was April 1st.


----------



## bmagnien

jim2point0 said:


> Unlaunching The 12GB 4080
> 
> 
> 16GB GeForce RTX 4080 on track to delight gamers everywhere November 16th.
> 
> 
> 
> www.nvidia.com
> 
> 
> 
> 
> 
> Holy **** I can't believe this is real, lol


So damn sloppy. Can they unlaunch and then relaunch the 4090 so there’s supply?


----------



## Equinox654

motivman said:


> I have gaming trio, zero coil whine.


No coil whine on mine either.


----------



## lordkahless

bmagnien said:


> So damn sloppy. Can they unlaunch and then relaunch the 4090 so there’s supply?


LOL 4090 "Reboot" launch - "We heard you, we listened, we released the rest of the inventory"


----------



## 8472

For those of you still on PCIE 3.0, techpowerup just published their article on performance with different PCIE generations. 










NVIDIA GeForce RTX 4090 PCI-Express Scaling


The new NVIDIA GeForce RTX 4090 is a graphics card powerhouse, but what happens when you run it on a PCI-Express 4.0 x8 bus? In our mini-review we've also tested various PCI-Express 3.0, 2.0 and 1.1 configs to get a feel for how FPS scales with bandwidth.




www.techpowerup.com


----------



## Xavier233

They will resell the exact same 4080 12GB as a 4070 Ti 12GB. Makes zero difference in the end.


----------



## WayWayUp

at 4k pcie 4 is making a difference over pcie 3
interesting


----------



## BPS Customs

mattskiiau said:


> Anyone try a 4 x 8-pin with a Gaming X Trio yet? Any changes in results?


I just confirmed with NVIDIA that doing this would indeed work just fine, but would not change anything about the power draw of the card. The direct quote:

"It is safe to plug the 4-plug adapter into the card, but the card itself sets its own limits on how much power it can draw. If the card is limited by MSI to only draw 450 watts, then the extra 4th connector being available will not result in more power being drawn."


----------



## WayWayUp

Xavier233 said:


> They will resell the exact same 4080 12GB as a 4070Ti 12 GB. Makes 0 difference at the end.


it makes a difference if they delay the launch enough that the 3xxx stock mostly clears, and instead of $900 this becomes $650-$700

nvidia never wanted to release that card now, nor call it a 4080. they just needed to announce something under 1000 bucks, but knew that a 4070 priced at $900 would be received horribly, as the 3080 of last gen was only $700

but they couldn't discount the price below $900 because it would kill 3000 series sales

but I think they are very happy with the demand so far for the 4090 and realize they don't need to have this option out there yet


----------



## MIST3RST33Z3

BPS Customs said:


> I just confirmed with NVIDIA that doing this would indeed work just fine, but would not change anything about the power draw of the card. The direct quote:
> 
> "It is safe to plug the 4-plug adapter into the card, but the card itself sets its own limits on how much power it can draw. If the card is limited by MSI to only draw 450 watts, then the extra 4th connector being available will not result in more power being drawn."


I am hoping a vBIOS flash will fix this. I have the Zotac Extreme card, which maxes out at 495w (+10%), but it came with the 4-plug adapter and has a better VRM than the Gigabyte, so I am hoping it works. Hopefully nvflash will be updated soon so we can test it.


----------



## Xavier233

WayWayUp said:


> it makes a difference if they delay launch enough to where the 3xxx mostly clears and instead of $900 this becomes $650-$700
> 
> nvidia never wanted to release that card now nor call it a 4080. they just needed to announce something under 1000 bucks but knew that a 4070 priced at $900 would be received horribly as the 3080 of last gen was only $700
> 
> but they couldnt discount price below 900 because it would kill 3000 series sales
> 
> But i think they are very happy with the demand so far for the 4090 and realize they dont need to have this option out there yet


I think they don't realize (or don't care) that the more reviews come out comparing the 3000 and 4000 series, the fewer people will be willing to buy a 3000 series card. Even if a 4080 16GB is very close in performance to a 3090 Ti, I would probably still go for the 4080 16GB since it also has DLSS3, and because it is a lot more efficient in temps and power than a 3090 Ti.

They will just drop the 4080 12GB in price, say $100 less, and brand it as a different card. These cards are already designed and produced; they won't just "scrap them".


----------



## mirkendargen

All this 3x8pin vs. 4x8pin talk is a bunch of FUD. The card sees a single rail; it can't tell how many 8-pin cables are plugged into an adapter, or even whether an adapter is in use vs. a native cable. *All *the card can tell is whether the sense pins are grounded. If Sense0 is open and Sense1 is grounded, the card will draw a max of 450w. If both are grounded, the card can draw a maximum of 600w. The Nvidia 4-cable adapter happens to have a super simple IC in it that controls whether Sense0 is grounded, based on whether 3 or 4 power cables are connected to it. Once again, the card doesn't know the number of power cables plugged into the adapter; it only knows whether those pins are grounded. Plenty of adapters and cables can just ground those pins.

For example, this is the Corsair cable. You can see that it has 2 small ground wires coming from the PSU to the sense pins, signaling to the card that it may draw 600w.

*INDEPENDENT OF ALL OF THIS *the cards still have max TBP settings in BIOS, which can be (and in MSI/Zotac's case are) less than 600w. The power cable plugged in isn't gonna change that. A version of nvflash that works with 4090s probably will fix that.
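That sense-pin logic can be sketched as a simple lookup. The helper names are hypothetical; the 450w and 600w rows match the post above, while the 300w and 150w rows come from the PCIe CEM 5.0 sideband table rather than this thread:

```python
# Sketch of the 12VHPWR sense-pin decode. The card reads only these two
# pins -- it cannot count how many 8-pin cables feed an adapter.
GROUNDED, OPEN = "grounded", "open"

# (SENSE0, SENSE1) -> power limit the cable signals, in watts
SENSE_TABLE = {
    (GROUNDED, GROUNDED): 600,
    (OPEN, GROUNDED): 450,
    (GROUNDED, OPEN): 300,   # from the CEM 5.0 spec, not this post
    (OPEN, OPEN): 150,       # from the CEM 5.0 spec, not this post
}

def cable_limit(sense0: str, sense1: str) -> int:
    """Power limit signaled by the cable/adapter, however many plugs feed it."""
    return SENSE_TABLE[(sense0, sense1)]

def effective_limit(sense0: str, sense1: str, vbios_tbp: int) -> int:
    """The card honors the lower of the cable signal and its vBIOS TBP."""
    return min(cable_limit(sense0, sense1), vbios_tbp)

# Grounding both pins (e.g. the Corsair cable) signals 600w, but a
# 450w vBIOS cap still wins:
print(effective_limit(GROUNDED, GROUNDED, 450))  # -> 450
```

The `min()` at the end is the whole point of the post: the cable can only lower the ceiling, never raise it past what the vBIOS allows.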


----------



## MIST3RST33Z3

mirkendargen said:


> All this 3x8pin vs. 4x8pin talk is a bunch of FUD. The card sees a single rail, it can't tell how many 8pin cables are plugged into an adapter, or even if an adapter is in use vs. a native cable. *All *the card can tell is if the sense pins are grounded or not. If Sense0 is open and Sense1 is grounded, the card will draw a max of 450w. If both are grounded the card can draw a maximum of 600w. The Nvidia 4-cable adapter happens to have a super simple IC in it that controls whether Sense0 is grounded or not based on whether 3 or 4 power cables are connected to it. Once again, the card doesn't know the number of power cables plugged into the adapter plugged into it, it only know whether those pins are grounded. Plenty of adapters and cables can just ground those pins.
> 
> *INDEPENDANT OF ALL OF THIS *the cards still have max TBP settings in BIOS, which can (and are in MSI/Zotac's case) less than 600w. The power cable plugged in isn't gonna change that. A version of nvflash that works with 4090's probably will fix that.


Let’s hope that update comes soon! Can’t wait to get rid of this gimped vbios.


----------



## ZealotKi11er

Xavier233 said:


> They will resell the exact same 4080 12GB as a 4070Ti 12 GB. Makes 0 difference at the end.


Yep. Even at $799 still terrible. Need to be $599 MAX.


----------



## Xavier233

ZealotKi11er said:


> Yep. Even at $799 still terrible. Need to be $599 MAX.


They will not drop it by more than $100 IMO, no way.


----------



## yt93900

I've noticed I can turn on ECC in Nvidia control panel, is this new?


----------



## bmagnien

yt93900 said:


> I've noticed I can turn on ECC in Nvidia control panel, is this new?


That's very interesting. If error correction can be toggled off, higher but potentially less stable memory clocks could become viable. Could you run Speed Way or Port Royal at the peak of, or just beyond, your memory OC stability threshold, with and without ECC on? My hypothesis is that ECC disabled will perform better:

ECC Enabled: Recommended for high-precision, GPU-accelerated computational applications. 
ECC Disabled: Recommended for GPU-accelerated, ray tracing applications.
EDIT: Looks like these guys already ran the numbers: NVIDIA GeForce RTX 4090: The New Rendering Champion

In quick tests, enabling ECC memory dropped the benchmarked bandwidth from 845 GB/s down to 742 GB/s. Comparatively, enabling ECC memory on the Quadro RTX 6000 dropped bandwidth from 513 GB/s to 433 GB/s.
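For reference, those quoted bandwidth figures work out to roughly a 12% penalty on the 4090 and 16% on the Quadro; a quick check:

```python
# Recompute the ECC bandwidth penalty from the figures quoted above.
def ecc_penalty(without_ecc: float, with_ecc: float) -> float:
    """Bandwidth lost by enabling ECC, as a percentage."""
    return (without_ecc - with_ecc) / without_ecc * 100

print(f"RTX 4090:        {ecc_penalty(845, 742):.1f}%")  # -> 12.2%
print(f"Quadro RTX 6000: {ecc_penalty(513, 433):.1f}%")  # -> 15.6%
```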


----------



## motivman

MIST3RST33Z3 said:


> I am hoping a vbios flash will fix this. I have the Zotac extreme card which max's at 495w (+10%), but it came with the 4 plug adapter, and has better VRM than the gigabyte, so I am hoping it works. Hopefully nvflash will be updated soon so we can test it.


this is great news. Just need to be able to flash 600W bios, and problem solved. I am no longer eyeing the strix or FE, will keep my trio.


----------



## Dragonsyph

So what’s the best card for watercooling that can overclock the best?
Are any cards binned?
Or is an FE plus a block better than a $2000 Strix?


----------



## yt93900

bmagnien said:


> Thats very interesting. If Error Correction can be toggled off, higher but potentially less stable memory clocks could become viable. Could you run speedway or port royal at the peak or just beyond your memory OC stability threshold, with and without ECC on? My hypothesis is that ECC disabled will perform better:
> 
> ECC Enabled: Recommended for high-precision, GPU-accelerated computational applications.
> ECC Disabled: Recommended for GPU-accelerated, ray tracing applications.
> EDIT: Looks like these guys already ran the number: NVIDIA GeForce RTX 4090: The New Rendering Champion
> 
> In quick tests, enabling ECC memory dropped the benchmarked bandwidth from 845 GB/s down to 742 GB/s. Comparatively, enabling ECC memory on the Quadro RTX 6000 dropped bandwidth from 513 GB/s to 433 GB/s.


I'd have to try, it's disabled by default.


----------



## DokoBG

They ALL clock about the same. I'm not even sure the 600W limit does anything more for scores/benches. It's like a 3% difference between a 480w power-limited card and a 600w card.


----------



## motivman

mirkendargen said:


> All this 3x8pin vs. 4x8pin talk is a bunch of FUD. The card sees a single rail, it can't tell how many 8pin cables are plugged into an adapter, or even if an adapter is in use vs. a native cable. *All *the card can tell is if the sense pins are grounded or not. If Sense0 is open and Sense1 is grounded, the card will draw a max of 450w. If both are grounded the card can draw a maximum of 600w. The Nvidia 4-cable adapter happens to have a super simple IC in it that controls whether Sense0 is grounded or not based on whether 3 or 4 power cables are connected to it. Once again, the card doesn't know the number of power cables plugged into the adapter plugged into it, it only know whether those pins are grounded. Plenty of adapters and cables can just ground those pins.
> 
> For example, this is the Corsair cable. You can see that it has 2 small ground wires coming from the PSU to the sense pins, signaling to the card to draw 600w
> View attachment 2575981
> 
> 
> 
> *INDEPENDANT OF ALL OF THIS *the cards still have max TBP settings in BIOS, which can (and are in MSI/Zotac's case) less than 600w. The power cable plugged in isn't gonna change that. A version of nvflash that works with 4090's probably will fix that.
> 



so basically, what you are saying is that those of us with the 3x8 adapters do not need to get the 4x8 adapter; we just need a BIOS flash to get to 600W. The difference is that the individual PCIe cables will be carrying 200W with the 3x8 adapter vs 150W with the 4x8 adapter...
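Those per-cable numbers check out, assuming idealized even current sharing (which real cables only approximate):

```python
# Per-cable load for 3x vs 4x 8-pin adapters feeding a 600 W card.
# Assumes perfectly even current sharing across cables, an idealization.
def per_cable(total_w: float, n_cables: int, volts: float = 12.0):
    watts = total_w / n_cables
    return watts, watts / volts  # (watts, amps) per 8-pin cable

for n in (3, 4):
    w, a = per_cable(600, n)
    print(f"{n}x 8-pin at 600 W: {w:.0f} W / {a:.1f} A per cable")
```

So the 3x adapter runs each 8-pin well past its nominal 150W rating, while the 4x adapter keeps each cable at spec.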


----------



## mirkendargen

motivman said:


> so basically, what you are saying is that those of us with the 3X8 adapters do not need to get the 4X8 adapter. We just need a bios flash to get to 600W. difference it that the individual PCIE cables will be carrying 200W with the 3X8 adapter vs 150W with the 4X8 adapter....


It depends how your specific 3x8pin adapter is wired. You might need a different adapter/cable as well if your adapter is wired to only ground sense1. But the determining factor isn't the number of 8pins.


----------



## yt93900

Has anyone seen the 12pin Corsair cable out in the wild already? It seems to never be in stock.


----------



## mirkendargen

yt93900 said:


> Has anyone seen the 12pin Corsair cable out in the wild already? Seem to never be in stock.


I posted a picture of it in my hand above.









[Official] NVIDIA RTX 4090 Owner's Club


Just a quick and dirty run after I drained and refilled the loop while whipping up a quick GPU bridge until the water block comes. She just tucked in snuggly clearing the D5 pump but the power cables will probably extrude the side panel until a 90° adapter is made. what programme are you...




www.overclock.net


----------



## MIST3RST33Z3

Probably going to return my Zotac. It performs great, boosting to 3050 pretty regularly, but I’m doubtful that one of the “good” waterblock makers will make a block for it, based on blocks for prior generation Zotac cards.


----------



## KedarWolf

yt93900 said:


> Has anyone seen the 12pin Corsair cable out in the wild already? Seem to never be in stock.


I just ordered it from the Corsair site with free shipping and no tax.



Oh wait, I think that's the wrong cable.


----------



## yzonker

yt93900 said:


> Has anyone seen the 12pin Corsair cable out in the wild already? Seem to never be in stock.


Yep I have one too.


----------



## yzonker

motivman said:


> so basically, what you are saying is that those of us with the 3X8 adapters do not need to get the 4X8 adapter. We just need a bios flash to get to 600W. difference it that the individual PCIE cables will be carrying 200W with the 3X8 adapter vs 150W with the 4X8 adapter....


You'll need both. The 3x8pin adapters that were included do not have both sense pins grounded, so you'll still see the same PL in AB until you have both pins grounded (like with a 4x8pin adapter or one like Corsair is selling).


----------



## GRABibus

dante`afk said:


> who's gonna get scalped
> 
> View attachment 2575328


This price is a « Strix No go » for me


----------



## GRABibus

yzonker said:


> Yep I have one too.


Where did you get it ?


----------



## alasdairvfr

mirkendargen said:


> GN got 666W draw with a 600W power limit in Furmark. Definitely power limited.


Is this 666w "rail power" by chance?

my HWINFO64 shows this after running some heavy duty stuff:


----------



## motivman

Dragonsyph said:


> So what’s the best card for watercooling that can overclock the best?
> Are any cards binned?
> Or is FE plus block better vs a 2000 strix?


Personally, if I could do it all over, I would pick the FE... the Strix is not worth $2000


----------



## mirkendargen

alasdairvfr said:


> Is this 666w "rail power" by chance?
> 
> my HWINFO64 shows this after running some heavy duty stuff:
> 
> View attachment 2575983


They showed it on their chart, not raw sensor data if I remember right. They also might have been measuring it externally with induction clamps, not relying on the sensor data. If you expand that rail power does it have like 615w on the 12VHPWR and 10w on the PCIE slot?


----------



## Xavier233

motivman said:


> Personally, if I could do it all over, I would Pick FE... Strix is not worth $2000


I agree, though the FE's fans wobble weirdly and make some funky noises at certain RPMs. IMO, the cheapest AIB cards would still be better than the FE, and you get slightly better cooling.


----------



## mirkendargen

I think the $1600 base TUF is the value champ. It has the same PCB as the Strix with a couple power stages left blank, and some headers missing on the side (possibly I2C voltage controller headers), and will therefore share Strix block compatibility.


----------



## ArcticZero

Biggest reason I'm hunting for a TUF/Strix is the extra HDMI port. Second is block availability. But yeah extremely hard to swallow an extra $400 for a Strix even if that's likely what I will have available soon.


----------



## WayWayUp

i love how everyone suddenly has a card
where are you guys getting these?

I'm not used to this. I usually just grab mine from evga. cant find a card anywhere


----------



## mirkendargen

WayWayUp said:


> i love how everyone suddenly has a card
> where are you guys getting these?
> 
> I'm not used to this. I usually just grab mine from evga. cant find a card anywhere


I think the majority of people in the US here with cards in hand already got them in person at Microcenter. I sure don't have a card, I have a TUF backordered at B&H and ShopBLT while I try to find something sooner.


----------



## Clukos

I'm blown away by the 4090 FE









I scored 30 475 in Time Spy


AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com





So far seems to be fine with +260 core, +1500 mem.


----------



## alasdairvfr

mirkendargen said:


> They showed it on their chart, not raw sensor data if I remember right. They also might have been measuring it externally with induction clamps, not relying on the sensor data. If you expand that rail power does it have like 615w on the 12VHPWR and 10w on the PCIE slot?


This is what I have; should have expanded it before, tbh


----------



## motivman

WayWayUp said:


> i love how everyone suddenly has a card
> where are you guys getting these?
> 
> I'm not used to this. I usually just grab mine from evga. cant find a card anywhere


Microcenter was my source... I tried getting a Strix on Newegg... I must have hit F5 a hundred times, and the "add to cart" button just as much, but a damn card wouldn't add to my cart... all the while driving to Microcenter, which was an hour away from where I live... SMH, first world problems... The guy in front of me got the last TUF card they had in stock, so I had to settle for the MSI Gaming Trio... which I am very happy with. Looking back, I am glad I didn't blow an extra $400 getting the Strix.


----------



## Xavier233

The worst thing besides coil whine is a defective card. I hate coil whine


----------



## Jordyn

Speculation in the comments is that Palit cards are the badly designed ones that BZ mentions at the start.


----------



## Alelau18

alasdairvfr said:


> This is what i have, should have expanded it before tbh
> 
> View attachment 2575987


I have very similar readings (4090 TUF OC)


----------



## DokoBG

Can't wait to get my MSI Gaming Trio and match those Strix cards with unlocked 600W power limits, which are $500-600 more.


----------



## alasdairvfr

Did some Furmark testing of my own to see if I can hit or pass 600w.

I used +255 core and +1200 memory (this is pretty unstable for anything else but runs this bench like a champ)

I couldn't pass 572w, it was hovering in the 560w range. Mind you GN spent a king's ransom on their testing setup and my PSU/GPU power reporting will be off by a margin so I think ~560 +- that margin is where my card taps out. All in all I'm pretty pleased. If I could get my hands on a clamp meter I'd redo this and try again.

All things considered my 3080 TUF OC and my 3080 TI FTW 3 (both solid cards imo) ran way hotter than this one, the 3080 TI barely hit 440w
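For anyone who wants a cheap software-side log while a stress test runs, here's a rough sketch (Python, assuming `nvidia-smi` is on your PATH). Note it reads the same self-reported telemetry as the overlay tools, so it won't catch the margin of error a clamp meter would:

```python
import subprocess
import time

def parse_power(csv_text):
    """Parse 'power.draw' CSV lines from nvidia-smi into floats (watts)."""
    return [float(line.strip()) for line in csv_text.splitlines() if line.strip()]

def log_peak_power(samples=30, interval=1.0):
    """Poll nvidia-smi and return the peak reported board power in watts."""
    peak = 0.0
    for _ in range(samples):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=power.draw",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        # max over all GPUs in the machine for this sample
        peak = max(peak, *parse_power(out))
        time.sleep(interval)
    return peak
```

Start it, alt-tab into Furmark for 30 seconds, and you get the peak the driver saw over the run.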

Furmark results:










Card stats:


----------



## dr/owned

mirkendargen said:


> I think the $1600 base TUF is the value champ. It has the same PCB as the Strix with a couple power stages left blank, and some headers missing on the side (possibly I2C voltage controller headers), and will therefore share Strix block compatibility.


I believe just looking at photos of the assembled card that the MSI Gaming is the same PCB as the Suprim, probably with a couple power stages also left blank. But the total number of power stages has been irrelevant even on the 3000 series...just not really possible to ever hit > 1000W and even less so on the 4090.

I think the bullet point of the 4090 is "just buy the $1600 base model". The theory floating around is that NV gave them overkill specs to hit with their designs (for both cooler and power). Only Zotac seems to have been relatively "cheap" in mostly using NV's reference design.


----------



## ZealotKi11er

MIST3RST33Z3 said:


> Probably going to return my Zotac. It performs great, boosting to 3050 pretty regularly, but I’m doubtful that one of the “good” waterblock makers will make a block for it, based on blocks for prior generation Zotac cards.


Would suck if you got a model that clocks lower. Mine only does 2970MHz. Water will not gain much since it's already running in the low 60s.


----------



## sakete

MIST3RST33Z3 said:


> Probably going to return my Zotac. It performs great, boosting to 3050 pretty regularly, but I’m doubtful that one of the “good” waterblock makers will make a block for it, based on blocks for prior generation Zotac cards.


A FE or one of the Asus cards is a safe bet.


----------



## AvengedRobix

MIST3RST33Z3 said:


> Probably going to return my Zotac. It performs great, boosting to 3050 pretty regularly, but I’m doubtful that one of the “good” waterblock makers will make a block for it, based on blocks for prior generation Zotac cards.


wait.. for sure alphacool do a WB for amp extreme


----------



## MIST3RST33Z3

AvengedRobix said:


> wait.. for sure alphacool do a WB for amp extreme


I’d be really happy; they have one listed, just need to wait for confirmed compatibility. Really want Buildzoid or GN to do an analysis of the PCB. I have limited knowledge of power electronics components.


----------



## Roacoe717

Soon. Got ripped off $50 by Amazon since Best Buy and Newegg want $1,750. Oh well.


----------



## Arizor

Yeah honestly I would say any 4090 is pretty much equivalent to the next. So make your choice based on price (no need to go above the $1600 models really) and waterblock availability.


----------



## Kemerd

I have been able to use this cable and get the full 600W and 133% power target on my RTX 4090. https://www.amazon.com/dp/B0BF4B1T16 It is only $30.

The 8-pin (6+2) PCI-E plug works on my Corsair PSU; if you get the 16-pin version it might work for an Antec.


----------



## mirkendargen

Kemerd said:


> I have been able to use this cable and get the full 600W and 133% power target on my RTX 4090. https://www.amazon.com/dp/B0BF4B1T16 It is only $30.
> 
> 8pin(6+2) PCI-E Plug works on my Corsair PSU, if you get the 16 pin might work for an Antec.


Be careful with a cable like that and check the pinout on your PSU first. When it comes to the PSU side, it fitting doesn't equal it working because there is no standard pinout.


----------



## Kemerd

mirkendargen said:


> Be careful with a cable like that and check the pinout on your PSU first. When it comes to the PSU side, it fitting doesn't equal it working because there is no standard pinout.


Yes and no. 12VHPWR (what you want to search for) isn't special or unique to the RTX 4090. It's a spec. You don't need to spend $200 on a custom cable— this also isn't the only cable that does this. There are some newer PSUs that come with these by default, it is a next-gen spec; Nvidia is trying to future-proof. All the extra connector on the top does is tell the GPU how much current it can draw.


----------



## yzonker

GRABibus said:


> Where did you get it ?


Direct from Corsair. Someone in this thread posted it was in stock a few days ago at something like 9:00 PM CST. I jumped on it immediately.


----------



## mirkendargen

Kemerd said:


> Yes and no. 12VHPWR (what you want to search for) isn't special or unique to the RTX 4090. It's a spec. You don't need to spend $200 on a custom cable— this also isn't the only cable that does this. There are some newer PSUs that come with these by default, it is a next-gen spec; Nvidia is trying to future-proof. All the extra connector on the top does is tell the GPU how much current it can draw.


I'm not talking about the 12VHPWR side, I'm talking about the PSU side. That's a modular PSU cable not an adapter, it plugs straight into the PSU. You need to make sure your PSU has the same pinout it's designed for, because there is no standard. There's a reason there are Corsair PSU cables, EVGA PSU cables, Seasonic PSU cables, etc. People getting a new PSU and trying to reuse their existing modular cables is the #1 way computer parts get fried because they just assume if the connector fits in their new PSU it's fine.


----------



## sakete

For anyone in the Denver metro area still looking for a 4090, I just returned my TUF OC to Micro Center. It should be available as open box now, I think for $1619.


----------



## yzonker

Xavier233 said:


> I agree, though that FE's fans are wobbling weirdly and make some funky noises at certain RPMs. IMO, the cheapest AIB cards would still be better than the FE, and u get slightly better cooling


I've seen 2 YT videos where they showed the wobbly fan. That is a little concerning.


----------



## kx11




----------



## MIST3RST33Z3

sakete said:


> For anyone in the Denver metro area still looking for a 4090, I just returned my TUF OC to Micro Center. It should be available as open box now, I think for $1619.


I live in COS and am contemplating driving down there lol. Wonder if it would be available or still there if I went now. Wish I would have known, would have bought it off you and given you a bit extra


----------



## Hanks552

mattskiiau said:


> Anyone try a 4 x 8-pin with a Gaming X Trio yet? Any changes in results?


Exactly what I want to see!


----------



## Glottis

yt93900 said:


> Has anyone seen the 12pin Corsair cable out in the wild already? Seem to never be in stock.


Yeah I got one earlier this week (EU). Just keep checking their site I suppose, I've seen it go in and out of stock a few times already. Oh and btw you want 16pin, not 12.


----------



## sakete

MIST3RST33Z3 said:


> I live in COS and am contemplating driving down there lol. Wonder if it would be available or still there if I went now. Wish I would have known, would have bought it off you and given you a bit extra


Ah sorry man, I also had extended warranty and that was the only way to get a refund on that part. Keep an eye on their site, maybe open box 4090s can be reserved for pickup.


----------



## MikeGR7

Hanks552 said:


> Exactly what I want to see!


Guys you need to get over it:

Using a different cable will NOT unlock a higher power limit for you.
Many users here already reported it plus it is even recorded on multiple videos on YouTube.
eTeknix is one of them that straight up mentions it.

If you want a higher power limit you need to flash another VBIOS first, and even then you only MAY unlock it.
Plus that's only IF and WHEN NVFlash supports the 4000 series.

Just get the Gigabyte one with 600W out of the box and a sane price.

On another note, I saw someone some pages back who thought Zotac's VRM is superior to Gigabyte's, which made me lmao


----------



## slayer6288

MikeGR7 said:


> Guys you need to get over it:
> 
> Using a different cable will NOT unlock a higher power limit for you.
> Many users here already reported it plus it is even recorded on multiple videos on YouTube.
> eTeknix is one of them that straight up mentions it.
> 
> If you want a higher Power limit you need to flash another Vbios first and you MAY unlock it.
> Plus IF and WHEN nVlash supports 4000 series.
> 
> Just get the Gigabyte one with 600W out of the box and a sane price.
> 
> On another note i saw someone some pages back that thought Zotacs VRM is superior to Gigabyte and made me lmao


It is better from the pictures we have and I am not even the person who said it. You seem like a clown.


----------



## LuckyImperial

Has anyone tested whether the EVGA PerFE 12 Cable will work with the FE card? Is the pinout the same other than the sense pins? This thread is moving quick, so sorry if I missed this.


----------



## DokoBG

No thank you, I have horrible experience with Gigabyte cards - never again.


----------



## Glottis

LuckyImperial said:


> Has anyone tested whether the EVGA PerFE 12 Cable will work with the FE card? Is the pinout the same other than the sense pins? This thread is moving quick, so sorry if I missed this.


Without sense pins the GPU will either refuse to boot or run in 150W-only mode. Get a 16-pin cable/adapter.
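For reference, here's the sideband table as I understand it from the ATX 3.0 / PCIe CEM 5.0 material; treat it as an illustrative sketch, not the spec text:

```python
# 12VHPWR sideband sense pins advertise what the attached cable/PSU can supply.
# "Gnd" = pin tied to ground, "Open" = floating; mapping per my reading of the
# ATX 3.0 / PCIe CEM 5.0 table, so double-check against the actual spec.
SENSE_TO_WATTS = {
    ("Gnd", "Gnd"): 600,
    ("Gnd", "Open"): 450,
    ("Open", "Gnd"): 300,
    ("Open", "Open"): 150,  # also what a card sees with no sense pins wired at all
}

def cable_limit(sense0, sense1):
    """Max sustained power the card should draw for a given sense-pin combo."""
    return SENSE_TO_WATTS[(sense0, sense1)]
```

This is why an 8-pin-era cable with no sideband wiring gets treated as the 150W worst case (or rejected outright, depending on the card).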


----------



## dr/owned

MIST3RST33Z3 said:


> I’d be really happy; they have one listed, just need to wait for confirmed compatibility. Really want Buildzoid or GN to do an analysis of the PCB. I have limited knowledge of power electronics components.


The 4090 Amp Extreme is a lower end PCB. It's like the reference PCB from Nvidia slightly expanded. Button caps, older vrm controller (although the uP9512 is still good enough for this application it's a semi-digital part that has I2C and can be controlled by the EVC2). The other boards like FE and Strix and MSI are using the MP2891 that is full-digital (similar to what was used on the 3090 Strix and KPE) but the EVC2 doesn't have support for it yet. UP9512 was used on the 3090 TUF for example. The crappy version is the 9511 that is completely analog. In terms of total number of power stages: doesn't really matter. It's using 50A stages instead of 70A that a lot of the other cards are using but you're not going to be pushing 1000W on this thing anyways. On paper it's more "appropriately built" for the actual power consumption of the 4090. The other cards are all overbuilt.

Something like the Gigabyte Gaming is like a notch above the Amp Extreme: uP9512, more poscaps instead of button caps... still 50A stages. The Suprim line (and probably the Gaming Trio line if it's the same PCB) would be a notch higher, with 70A stages and a digital controller: basically a Strix without the price markup.

But again, I wouldn't really be thinking "oh this one is better than the other". Yeah, on paper it's better, but in reality... you might even want the uP9512 so you can adjust the voltage, if you can solder onto the microscopic I2C lines.


----------



## kx11

DokoBG said:


> No thank you, I have horrible experience with Gigabyte cards - never again.


well, i was like you but now my whole build is Aorus stuff


----------



## MikeGR7

slayer6288 said:


> It is better from the pictures we have and I am not even the person who said it. You seem like a clown.


No, the pictures show something else, but anyway, who are you again??
I didn't even quote or reply to you.


----------



## CallsignVega

Anyone know how to increase the voltage on the 4090? The voltage slider is greyed out on latest MSI Afterburner.


----------



## GosuPl

Voltage is locked on RTX 4090 right now. I will try with 2x RTX 4090 STRIX next week (separate RIGs, shame no SLI even for fun) but all RTX 4090 have locked voltage at the moment


----------



## MikeGR7

DokoBG said:


> No thank you, I have horrible experience with Gigabyte cards - never again.


Just to clarify what I said earlier: I do not promote Gigabyte, and I am well aware of the shipwreck their 3000 series was.
I just said that their designs seem vastly improved this round and the VRM/power-limit/cooler package seems very competitive.
Certainly better (I can expand on this if needed) than the MSI Trio, which hurts me a lot since I loved that card on the 3000 series.


----------



## alasdairvfr

MikeGR7 said:


> Just to clarify what i said earlier, i do not promote Gigabyte and i am well aware of the shipwreck their 3000 series was.
> Just said that their designs seem vastly improved this round and the vrm-pl-cooler package seems very competitive.
> Certainly better (can expand on this if needed) then MSI trio which hurts me a lot since i loved that card on 3000.


I can't say anything about what's under the hood, but so far I'm impressed by my Gigabyte OC card. I considered not buying it given the flak they received last gen, but it was there and I wanted a 4090, so here we are. It has a 600W BIOS, and I find the 4090s seem to have less bin discrepancy than before. The cooling on the FE and AIBs seems uniformly better than before, maybe just overengineered this time around. Reading online since I picked up this card, I'm seeing that the higher-end cards' performance advantage is so marginal this time around it's almost non-existent.

I don't know about the VRMs or caps, etc. this time around. I'm sure the techtubers will do engineering reviews of the boards, and I think it would be wise to wait a week or two for those detailed reviews to come out. But my initial take is that from FE through Strix, there's not much of a jump in performance among the exotic cards. My new card runs cooler than my ASUS TUF OC 3080 and my EVGA 3080 Ti FTW3, at least 5C cooler under torture tests. I have nothing to say about Gigabyte support, and maybe they are ****, but the product (albeit only 2 days in) appears to be sound as hell.

Again, der8auer, GN et al. will do the rounds and discuss the engineering of these boards for those looking to do the most extreme overclocking.


----------



## DokoBG

MikeGR7 said:


> Certainly better (can expand on this if needed) than MSI trio which hurts me a lot since i loved that card on 3000.


Vapor chamber and higher number of pipes doesn't mean a better cooling solution.... unless u have them side by side with actual temperature numbers.


----------



## justpete

Speaking of Gigabyte: the 4090 Gaming OC (600W capable) is available for in-store pickup. A couple dozen left as of this writing. If you're in Canada, click "Check other Stores" to see which ones. Good luck! GIGABYTE GeForce RTX 4090 GAMING OC 24G


----------



## yt93900

Weird that the Gaming OC is 600W capable but the TOTL waterforce model is not?


----------



## MIST3RST33Z3

sakete said:


> Ah sorry man, I also had extended warranty and that was the only way to get a refund on that part. Keep an eye on their site, maybe open box 4090s can be reserved for pickup.


I drove up and it was gone lol. All good, my Zotac had an awful fan buzzing noise, so I took the opportunity to return it. Now I’m a peasant again with no 4090 haha.


----------



## Jordyn

justpete said:


> Speaking of Gigabyte. The 4090 OC (600w capable) is available for in-store pickup. A couple dozen left as of this writing. If you're in Canada click the "Check other Stores" to see which ones. Good luck! GIGABYTE GeForce RTX 4090 GAMING OC 24G


Interestingly it is the same in Australia, with the GB Gaming OC the only card still in stock across the larger retailers. Seems like they had the most stock available, albeit they are ~$300AU, or ~10%, more than the cheapest AIB models.


----------



## dr/owned

yt93900 said:


> Weird that the Gaming OC is 600W capable but the TOTL waterforce model is not?


I've seen on previous generations where waterblocked models of cards use reference or less-capable PCBs.


----------



## MikeGR7

DokoBG said:


> Vapor chamber and higher number of pipes doesn't mean a better cooling solution.... unless u have them side by side with actual temperature numbers.


Well, we have actual numbers showing the Gaming OC running cooler than the MSI Suprim, which is beefier than the Trio.


----------



## DokoBG

Great, how much better is it ?


----------



## Equinox654

DokoBG said:


> Great, how much better is it ?


Probably not enough to matter. My Trio runs pretty cool. With fans maxed it was 50C in Time Spy; low 70s on auto, without being loud enough to hear over my AIO fans. That was with it boosting to 2950 avg.


----------



## MikeGR7

DokoBG said:


> Great, how much better is it ?


4C cooler on core, 3C cooler on hotspot, and 6C cooler on VRAM.
Feel free to check TechPowerUp's review.


----------



## MikeGR7

Equinox654 said:


> Probably not enough to matter. My trio runs pretty cool. Fans maxed was 50c in Timespy. Low 70s on auto with it not being loud enough to hear over my AIO fans. That was with it boosting to 2950 avg.


Have to agree with you, unless a card manages below the 50 mark it won't affect things much.
It's all theoretical advantages.


----------



## kx11




----------



## CallsignVega

GosuPl said:


> Voltage is locked on RTX 4090 right now. I will try with 2x RTX 4090 STRIX next week (separate RIGs, shame no SLI even for fun) but all RTX 4090 have locked voltage at the moment


Oh, I could have sworn I saw them unlock the voltage slider and get more voltage during a YT 4090 review.


----------



## slayer6288

MikeGR7 said:


> No the pictures show something else but anyway, who are you again??
> I didn't even quote or replied to you.


Imagine getting upset and personal because someone pointed out something you said was wrong. Time to look in the mirror and grow up. Anyways....

Anything new on the bios flashing front or are we all still waiting on an nvflash update?


----------



## Glottis

GosuPl said:


> Voltage is locked on RTX 4090 right now. I will try with 2x RTX 4090 STRIX next week (separate RIGs, shame no SLI even for fun) but all RTX 4090 have locked voltage at the moment


I saw a post from another member who claimed they were able to slightly increase voltage on their Asus card with Asus GPU Tweak III. Have you tried that?


----------



## CallsignVega

Ah, just installed MSI Afterburner 4.6.5 (Beta 2) (was on the final series). Now voltage is unlocked and I got a nice bump. Approaching 3100 MHz core at 133% power target, and my card's running 60C in Superposition. This Gigabyte Gaming OC is turning out to be a pretty sweet card. And quite quiet too.


----------



## justpete

Nice @CallsignVega, I also just ordered the same card from Canada Computers. Looking forward to swapping the Strix 3090 to my streaming/secondary rig


----------



## Jordyn

CallsignVega said:


> Ah just installed MSI Afterburner 4.6.5 (Beta2) (Was on final series). Now voltage is unlocked and got a nice bump. Approaching 3100 MHz core at 133% power target and my cards running 60C in Superposition. This Gigabyte Gaming OC is turning out to be a pretty sweet card. And quite quiet too.



What bump did you give to the core to hit that number?


----------



## ValSidalv21

CallsignVega said:


> Anyone know how to increase the voltage on the 4090? The voltage slider is greyed out on latest MSI Afterburner.


The one in the GFE Alt+Z performance menu works for my ASUS card. I mean, you can set it at 100%; not sure if that does anything or not, because I also set the power slider to 133% and didn't see the card drawing more power during 3DMark. On the default clocks, that is.


----------



## smushroomed

Has the 9900k been shown to hold the 4090 back?

What is the "reasonable top end" amd and intel cpu right now?


----------



## CallsignVega

OK, just to dispel any FUD out there: the voltage on these cards (at least my Gigabyte Gaming OC) is NOT locked. At the stock slider position the card is 1.05V; with the slider to max right it's 1.1V.


----------



## MikeGR7

slayer6288 said:


> Imagine getting upset and personal because someone pointed out something you said was wrong. Time to look in the mirror and grow up. Anyways....
> 
> Anything new on the bios flashing front or are we all still waiting on an nvflash update?


Are you joking? Everyone can go back a page and plainly see that you called me a clown...
Anyway let's move on i agree on that.

Nothing new on the VBIOS front, and not even a proper first page like other owners' threads have.

Last words from the OP were along the lines of "I'm bored of hardware..." so we may be waiting for that first page a long time lol


----------



## KedarWolf




----------



## MikeGR7

smushroomed said:


> Has the 9900k been shown to hold the 4090 back?
> 
> What is the "reasonable top end" amd and intel cpu right now?


If you play at 4K resolution with recent games (not indie stuff), then the bottleneck is still the card.
The 9900K is no slouch.

As for the latest lineups, I would go with anything 8 cores and above: 7700X / 12700K


----------



## CallsignVega

So far in Superposition I've seen max 560w draw with 1.1v and 133% power slider at 3090 MHz core and +1000 memory. I haven't even tweaked the memory yet.
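For anyone wondering where those slider percentages land in watts, a quick sanity check (assuming the 450W reference TDP; your card's base limit may differ):

```python
def power_cap(base_tdp_w, slider_pct):
    """Effective power limit in watts: the slider is a percentage of base TDP."""
    return base_tdp_w * slider_pct / 100

# For a 450 W reference 4090, the 133% max slider lands just under the
# 600 W ceiling of a single 12VHPWR connector:
print(power_cap(450, 133))  # 598.5
```

Which is why a card holding ~560W under Superposition still has a little headroom before it hits the cap.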


----------



## dr/owned

KedarWolf said:


> View attachment 2576019


Pretty much my reaction:






At that price could you fly round trip to the US and hand-carry a card back?


----------



## MikeGR7

CallsignVega said:


> So far in Superposition I've seen max 560w draw with 1.1v and 133% power slider at 3090 MHz core and +1000 memory. I haven't even tweaked the memory yet.


I suggest you start reducing the power limit slider until you start seeing "Perfcap Reason: Power" in GPU-Z.
Then start upping your memory and proportionally raise the slider until you no longer hit the perfcap.
You may end up with tuned memory at the same core frequency while not even using the full 600W capacity.


----------



## DokoBG

One thing is for sure, that STRIX is definitely not worth the premium price.


----------



## Tadaschi

Found PCB pictures of the cheaper models to check the power phases, since TechPowerUp only showed the top models' PCBs.
I don't believe PCB and power phases will make that much of a difference for the 4090 this time around.

source https://www.coolpc.com.tw/tw/shop/gpu/nvidia-rtx4090/


----------



## RaMsiTo

CallsignVega said:


> So far in Superposition I've seen max 560w draw with 1.1v and 133% power slider at 3090 MHz core and +1000 memory. I haven't even tweaked the memory yet.


That is about what I have achieved in Superposition; the 9900K weighs me down in this test, but the clock reaches 3165 MHz.











UNIGINE Superposition benchmark score (detailed score page: benchmark.unigine.com)


----------



## docta99

Arizor said:


> If anyone has their voltage greyed out in MSI AB I can confirm, at least for ASUS, their new GPU tweak 3 allows voltage to go up to 1.1, which enables over 3ghz on my TUF. Though I’d say diminishing returns past a 2.9ghz OC for any games.


Mine is greyed out; how do I unlock it so it goes above 450W?? I believe the GPU BIOS is locked??


----------



## mirkendargen

justpete said:


> Speaking of Gigabyte. The 4090 OC (600w capable) is available for in-store pickup. A couple dozen left as of this writing. If you're in Canada click the "Check other Stores" to see which ones. Good luck! GIGABYTE GeForce RTX 4090 GAMING OC 24G


Thanks for the heads up! Looks like I'm making a trip to Vancouver in the next few days...


----------



## docta99

Arizor said:


> If anyone has their voltage greyed out in MSI AB I can confirm, at least for ASUS, their new GPU tweak 3 allows voltage to go up to 1.1, which enables over 3ghz on my TUF. Though I’d say diminishing returns past a 2.9ghz OC for any games.


My voltage is greyed out in Afterburner; how do I unlock it? I believe my GPU BIOS is locked to 450W max???


----------



## mirkendargen

MikeGR7 said:


> Well we have actual numbers showing the Gaming OC being cooler than MSI suprim which is beefier than Trio.


If you look at the reviews, the Gaming OC seems to have higher default fan speeds. Its "quiet" setting is like other cards' normal settings in terms of speed and noise. So I think it's on par, not better. Which isn't a bad thing at all; we can all do whatever custom fan curves we want.


----------



## yt93900

Yep, latest Afterburner can unlock more voltage, 1.05V -> 1.095V. It is indeed voltage starved. Just did a quick run in Heaven, 3150MHz is borderline, already artifacting slightly.
I think we will be seeing 3500MHz+ core on LN2 when they uncork the voltage.
Funny enough the reported board power draw is lower than what I remember from my 3090 Strix in this test.


----------



## LunaP

8472 said:


> For those of you still on PCIE 3.0, techpowerup just published their article on performance with different PCIE generations.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA GeForce RTX 4090 PCI-Express Scaling
> 
> 
> The new NVIDIA GeForce RTX 4090 is a graphics card powerhouse, but what happens when you run it on a PCI-Express 4.0 x8 bus? In our mini-review we've also tested various PCI-Express 3.0, 2.0 and 1.1 configs to get a feel for how FPS scales with bandwidth.
> 
> 
> 
> 
> www.techpowerup.com


Thanks , was curious about this since I'm still on x299, I can live w/ a 2% margin till HEDT returns next year.


----------



## Netarangi

Arizor said:


> Can easily overclock this TUF to 3ghz and +1500 on the mem with no stability issues in game. Somehow 1sr in 3dmark speedway for 5900x owners, imagine that’ll change quickly 😂
> 
> View attachment 2575901


Not sure how the stats work lol. I'm apparently first in speed way and 4k fire strike with my 3070 and 12700kf. It must have misread my card or something because there's no way I'm first.


----------



## Xavier233

KedarWolf said:


> View attachment 2576019


Is Memory Express still pulling the final sale bullshit on these cards? That's the only reason I am not getting one from there.


----------



## tubs2x4

KedarWolf said:


> View attachment 2576019


Haha holy **** that is an expensive item. ***in taxes


----------



## inedenimadam

Poking around r/nvidia and found a guy with a Gaming X Trio and a proper ATX 3.0 PSU. He is still limited to the 106% TDP. The low-TDP cards are either going to need a BIOS flash (hopefully) or a shunt mod.


----------



## bmagnien

EK is now selling blocks for the Gigabyte Aorus Master and Gaming OC, the Asus TUF and Strix, and the FE. Am I correct that all of these have a 600W BIOS out of the box? Should make getting one a lot easier, having so many options with blocks inbound.


----------



## Shaded War

inedenimadam said:


> poking around r/nvidia and found a guy with a gaming X trio and a proper ATX3.0. He is still limited to the 106% TDP. The low TDP cards are either going to need bios flash (hopefully) or a shunt mod.


Any report of oc numbers on the Trio? Wondering if the power limit is even holding back hitting decent oc.


----------



## MikeGR7

Tadaschi said:


> Found PCB pictures of the cheaper models to check the power phases, since TechPowerUp only showed the top models' PCBs.
> I don't believe PCB and power phases will make that much of a difference for the 4090 this time around.
> 
> source https://www.coolpc.com.tw/tw/shop/gpu/nvidia-rtx4090/


My ranking on these: first place is a toss-up between GB and Asus, second place MSI, and a distant third place to Zotac.
If I was pushed hard I would give a slight edge to the GB over the Asus for first. SLIGHT.


----------



## Mad Pistol

Moved over to a 1300-watt EVGA PSU and a new case today in preparation for the 4090 arriving on Tuesday.

Do you think I have enough room?


----------



## inedenimadam

Shaded War said:


> Any report of oc numbers on the Trio? Wondering if the power limit is even holding back hitting decent oc.


2950 core / 1519 mem. Adding voltage kicks in the power perfcap quicker and results in lower scores. We need a BIOS. Leaving voltage alone, it barely ever hits 106% TDP.

Edit:
I feel like I have to correct myself a bit here. I guess depending on the workload, you may get better results with additional voltage. With voltage I hovered around 3010MHz in Time Spy, and scored higher.




View attachment 2576057


----------



## MikeGR7

Mad Pistol said:


> Moved over to a 1300-watt EVGA PSU and a new case today in preparation for the 4090 arriving on Tuesday.
> 
> Do you think I have enough room?
> 
> 
> 
> View attachment 2576053


Damn that's going to be a tight fit for a Master or a Strix, anything below that should be okay.

Edit: you can also say goodbye to that little pcie card below your gpu.


----------



## Mad Pistol

MikeGR7 said:


> Damn that's going to be a tight fit for a Master or a Strix, anything below that should be okay.


Gonna have to break out the lube...


----------



## Antsu

MikeGR7 said:


> My ranking on these is first place tossed between GB and Asus, second place MSI and distant third place to zotac.
> If i was pushed hard i whould give a slight edge to the GB over Asus for first. SLIGHT.


Don't mean to attack your opinion in any way, and I couldn't even if I wanted to as I don't know much about PCB design, just genuinely curious what makes you think that?


----------



## MikeGR7

Antsu said:


> Don't mean to attack your opinion in any way, and I couldn't even if I wanted to as I don't know much about PCB design, just genuinely curious what makes you think that?


No problem friend, I may well be mistaken. Which part of the ranking do you want me to explain?


----------



## CallsignVega

Dialed in the memory... +1580 which gives me 12,081 MHz. This card rips!
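That +1580 is consistent with Afterburner adding the offset to a roughly 10,501 MHz stock memory reading; a quick sketch of the arithmetic (the stock figure is an assumption based on typical 4090 readouts):

```python
STOCK_MEM_MHZ = 10_501  # assumed 4090 GDDR6X clock as shown by Afterburner (~21 Gbps)

def oc_mem(offset_mhz, stock_mhz=STOCK_MEM_MHZ):
    """Resulting memory clock and effective data rate for an Afterburner offset."""
    clock = stock_mhz + offset_mhz
    gbps = clock * 2 / 1000  # reported clock x2 = effective GDDR6X data rate
    return clock, gbps

# oc_mem(1580) -> (12081, 24.162), matching the 12,081 MHz readout above
```

So +1580 takes the memory from roughly 21 Gbps effective to about 24.2 Gbps.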


----------



## Antsu

MikeGR7 said:


> No problem friend, i may aswell be mistaken, tell me which part of the ranking do you want me to reiterate?


ASUS vs GB specifically, I was under the impression that ASUS has the most overbuilt VRM but maybe GB has one-upped them this time around?


----------



## MikeGR7

mirkendargen said:


> If you look at the reviews the Gaming OC seems to have higher default fan speeds. It's "quiet" setting is like other card's normal setting in terms of speed and noise. So I think it's on par, not better. Which isn't a bad thing at all, we can all do whatever custom fan curves we want.


Ofc you are correct.

But until we find a reviewer willing to manually max the fans of each model at 100%, we can only base our opinions on max performance on each card's measurements with its respective performance/gaming BIOS switch.

If a manufacturer chose a very conservative fan profile for their performance-oriented VBIOS mode, then that's bad for them with the max-performance crowd and good for them with the silence-focused crowd.

I personally rank max performance as higher priority.


----------



## KedarWolf

It might be Gigabyte is only better than ASUS with their full waterblock AIO.

When I get my Optimus Water Cooling block, we'll see.


----------



## yzonker

Got my TUF OC installed. Seems solid so far. I played CP2077 at +225/+1500 for about 30 min without a crash. Obviously need to test more, but maybe it's at least not a dud. 

I also used that VRAM mem test program and was still seeing speed increasing at +1600 with no errors. Haven't found the limit yet. 

This program, if you aren't aware of it: the "I've created an application for testing Video..." thread on www.overclock.net. A tool written in Vulkan compute to stress-test video memory for correctness, ideological successor to memtestCL. Open-source, with prebuilt binaries for Windows and Linux (supports aarch64 NVIDIA Jetson); simple to use, no parameters except optional card selection.


----------



## MikeGR7

Antsu said:


> ASUS vs GB specifically, I was under the impression that ASUS has the most overbuilt VRM but maybe GB has one-upped them this time around?


Indeed that was and is still the case with Asus but this round GB is very competitive.
In fact both models are waaaay overbuilt.

The TUF has a more advanced digital VRM controller and higher-rated inductors.
GB has a still excellent semi digital vrm controller that is more vmod friendly.
Also GB may have lower rated inductors but it has more power phases.
The above put them on a tie in my opinion.
What gives GB the SLIGHT edge is the use of more SMD capacitors instead of the through-hole ones on the TUF, plus some of them (e.g. behind the GPU socket) have more capacitance on the GB.

As i said though both are excellent and overbuilt.


----------



## Equinox654

Shaded War said:


> Any report of oc numbers on the Trio? Wondering if the power limit is even holding back hitting decent oc.


Time Spy Extreme:


















That was +200 core, +1200 mem, fans on auto, and the 106% power limit. It does hit 480 watts.
Haven't tried raising volts. Also I pulled those overclock numbers out of my ass... only my second iteration. I haven't tried to see if it will stay at 3 GHz in Time Spy yet.

Eyeballing Afterburner during my tests, it flirts with the power limit. Nothing like my 3080 XC3 HYBRID though, that card was beating on the power limit like a red-headed stepchild.
Mostly voltage limited in normal usage, though. Pretty much have to run Furmark to see it hit it hard. Maybe a path-traced game like Quake 2 or something like Cyberpunk would do it. I haven't tested with GPU-Z on in anything like that.

I don't really think you will be leaving much on the table if you get one. I will be trying to flash mine as soon as it is possible though... Just to see what it can do.


----------



## inedenimadam

Equinox654 said:


> Time Spy Extreme:
> View attachment 2576060
> 
> 
> View attachment 2576059
> 
> 
> That was +200 CORE +1200 MEM fans on auto. And the %106 power limit. It does hit 480watts.
> 
> 
> Eyeballing Afterburner during my tests. It flirts with the power limit.


Same card. Almost same results. Depending on the test/game, it can either flirt with it or run head first into it. Try updating Afterburner from the website; for some reason updating from within Afterburner doesn't give you the version with voltage control. Also, you can probably go +1500 or more on the memory, but +200 is about where I landed as well for core. 3 GHz+ is awesome. Need NVFlash to update/patch!


----------



## J7SC

justpete said:


> Speaking of Gigabyte. The 4090 OC (600w capable) is available for in-store pickup. A couple dozen left as of this writing. If your in Canada click the "Check other Stores" to see which ones. Good luck! GIGABYTE GeForce RTX 4090 GAMING OC 24G





CallsignVega said:


> Ah just installed MSI Afterburner 4.6.5 (Beta2) (Was on final series). Now voltage is unlocked and got a nice bump. Approaching 3100 MHz core at 133% power target and my cards running 60C in Superposition. This Gigabyte Gaming OC is turning out to be a pretty sweet card. And quite quiet too.





mirkendargen said:


> Thanks for the heads up! Looks like I'm making a trip to Vancouver in the next few days...


...thanks guys! After reading TPU (600W OK) and other reviews of the Gigabyte Gaming OC, and subsequently seeing _your comments_, I headed straight over to Canada Computers in Vancouver - on the way, I noticed gas prices were down by 40 c/liter... a satisfying day.

I have three other recent-gen Gigabyte GPUs, and they have never let me down and are great clockers to boot. I was going to wait until more info for the 4090 Ti / RDNA3 was out, but my HWBot days are behind me anyway - and I do game on a 48 inch 4K 120Hz OLED with Microsoft Flight Sim 2020 and Cyberpunk 2077.

...won't be until Sunday or so when I actually get to install it as there will be a lot of rewiring to do. Also, if anyone sees some water-block offerings for the Gigabyte 4090 Gaming OC, please post a link. BTW, the store still had a few left, with a few more on the pick-up shelf... Converted into US$, price was 1,618


----------



## Equinox654

inedenimadam said:


> Same card. Almost same results. Depending on the test/game, it can either flirt with it or run head first into it. Try updating Afterburner from the website, for some reason updating from withing afterburner doesn't give the version with voltage control. Also, you can probably go +1500 or more on the memory, but +200 is about where I landed as well for core. 3Ghz + is awesome. Need NVFlash to update/patch!


Yeah, I just did some digging. Checked the patch notes for NVFLASH and it looks like for Ampere and Turing it was updated around a month after release in both cases.


----------



## MIST3RST33Z3

Drove an hour and a half to micro center to return my Zotac as it had issues and I wanted one with a confirmed waterblock, got lucky and ended up getting a TUF lol


----------



## mirkendargen

J7SC said:


> ...thanks guys ! After reading TPU (600W ok) and other reviews of the Gigabyte Gaming OC and subsequently saw _your comments_, I headed straight over to Canada Computer in Vancouver - on the way, I noticed gas prices were down by 40 c / liter...a satisfying day .
> 
> I have three other recent-gen Gigabyte GPUs, and they have never let me down and are great clockers to boot. I was going to wait until more info for the 4090 Ti / RDNA3 was out, but my HWBot days are behind me anyway - and I do game on a 48 inch 4K 120Hz OLED with Microsoft Flight Sim 2020 and Cyberpunk 2077.
> 
> ...won't be until Sunday or so when I actually get to install it as there will be a lot of rewiring to do. Also, if anyone sees some water-block offerings for the Gigabyte 4090 Gaming OC, please post a link. BTW, the store still had a few left, with a few more on the pick-up shelf... Converted into US$, price was 1,618
> View attachment 2576063


I've got a card for me and a friend secured (he literally ordered it on airplane wifi on a transatlantic flight when I told him lololol). Driving from Seattle to Vancouver tomorrow!


----------



## bmagnien

J7SC said:


> ...thanks guys ! After reading TPU (600W ok) and other reviews of the Gigabyte Gaming OC and subsequently saw _your comments_, I headed straight over to Canada Computer in Vancouver - on the way, I noticed gas prices were down by 40 c / liter...a satisfying day .
> 
> I have three other recent-gen Gigabyte GPUs, and they have never let me down and are great clockers to boot. I was going to wait until more info for the 4090 Ti / RDNA3 was out, but my HWBot days are behind me anyway - and I do game on a 48 inch 4K 120Hz OLED with Microsoft Flight Sim 2020 and Cyberpunk 2077.
> 
> ...won't be until Sunday or so when I actually get to install it as there will be a lot of rewiring to do. Also, if anyone sees some water-block offerings for the Gigabyte 4090 Gaming OC, please post a link. BTW, the store still had a few left, with a few more on the pick-up shelf... Converted into US$, price was 1,618
> View attachment 2576063











EK-Quantum Vector² Master RTX 4090 D-RGB - Nickel + Plexi


EK-Quantum Vector² water block for the Aorus Master and Gigabyte Gaming RTX 4090




www.ekwb.com


----------



## J7SC

mirkendargen said:


> I've got a card for me and a friend secured (he literally ordered it on airplane wifi on a transatlantic flight when I told him lololol). Driving from Seattle to Vancouver tomorrow!


I was probably looking at your card then because the pick-up shelf with order sheets taped to each box is right behind the cashier. Shiny !



bmagnien said:


> EK-Quantum Vector² Master RTX 4090 D-RGB - Nickel + Plexi
> 
> 
> EK-Quantum Vector² water block for the Aorus Master and Gigabyte Gaming RTX 4090
> 
> 
> 
> 
> www.ekwb.com


Thanks for the link


----------



## Antsu

MikeGR7 said:


> Indeed that was and is still the case with Asus but this round GB is very competitive.
> In fact both models are waaaay overbuild.
> 
> Tuf has a more advanced digital vrm controller and higher rated inductors.
> GB has a still excellent semi digital vrm controller that is more vmod friendly.
> Also GB may have lower rated inductors but it has more power phases.
> The above put them on a tie in my opinion.
> What gives GB the SLIGHT edge is the use of more SMD capacitors instead of the through hole ones on the Tuf plus some of them eg behind the gpu socket have more capacity on the GB.
> 
> As i said though both are excellent and overbuilt.


Thanks for clarifying, good to hear that Gigabyte is basically on par this time around.


----------



## mirkendargen

I looked at PCB pictures of Gaming OC from Techpowerup and Aorus Master from der8auer's nvlink video. They're definitely the exact same PCB with a couple power stages left blank on the Gaming OC (the Windforce looked like it was also the same PCB with a few more power stages left blank), so they will definitely share block compatibility (already shows with EK). The amount of PCB sharing should be good for block availability overall this gen with each manufacturer only having one PCB it seems.

Windforce (now that I look at this one more, the screw hole pattern is different enough that you'd need to leave off enough screws that contact might not be great across the non-core part of a Gaming OC/Aorus Master block):









Gaming OC:









Aorus Master:


----------



## BTK

I’ve been trying to buy one; I’m just trying to figure out where the hell to buy one of these things.


----------



## Arizor

A few initial thoughts after messing around with the TUF since release and observing vids / other posters' info:


- Most cards can sustain an OC of 2900 MHz on the core and +1500 on the memory (12000 MHz)
- Voltage increases, much like last gen, have severely diminishing returns: I can get 3100 MHz and +1700 on the memory, but the gains are like... 1-2% for about 100 W extra
- Temps are much better all round; the GPU core barely touches 70C. With an aggressive curve (I let mine go to 70% at 55C), you can keep temps at 65C easily even when OC'd.
- Watercooling will help sustain these larger clocks and keep temps nice and cool, but for the first time I'm actually debating whether it's worth forking out the extra few hundred bucks and hours of labour, and then damaging resale value.
- On the other hand, it may well be worth holding onto your 4090 on air, then selling and going for the 4090 Ti when it releases before considering waterblocking, if you are a complete speed freak and know you're going to be tempted.
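The aggressive-curve idea above can be sketched as a simple linear interpolation between anchor points. The 55C/70% point is from the post; the other anchors are made-up assumptions for illustration, not anyone's actual settings:

```python
# Hypothetical sketch of a custom GPU fan curve, ramping to 70% duty at
# 55 C as described above. Anchor points other than (55, 70) are assumed.
def fan_duty(temp_c, points=((30, 30), (55, 70), (70, 100))):
    """Linearly interpolate fan duty (%) from (temp C, duty %) anchors."""
    if temp_c <= points[0][0]:
        return points[0][1]          # below the curve: minimum duty
    for (t0, d0), (t1, d1) in zip(points, points[1:]):
        if temp_c <= t1:             # inside this segment: interpolate
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return points[-1][1]             # above the curve: maximum duty
```

Afterburner and GPU Tweak both implement essentially this between the points you drag on their curve editors.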


----------



## Shaded War

Equinox654 said:


> Time Spy Extreme:
> View attachment 2576060
> 
> 
> View attachment 2576059
> 
> 
> That was +200 CORE +1200 MEM fans on auto. And the %106 power limit. It does hit 480watts.
> Haven't tried raising volts. Also I pulled those overclock numbers out of my ass... Only my second iteration. I haven't tried to see if it will stay at 3ghz in Time Spy yet.
> 
> Eyeballing Afterburner during my tests. It flirts with the power limit. Nothing like my 3080 XC3 HYBRID though, that card was beating on the power limit like a red headed step child.
> Mostly voltage limited though in normal usage. Pretty much have to run Furmark to see it hit it hard. Maybe a path traced game like quake 2 or something like cyberpunk would do it. I haven't tested with gpu-z on in anything like that.
> 
> I don't really think you will be leaving much on the table if you get one. I will be trying to flash mine as soon as it is possible though... Just to see what it can do.


Averaging 2950Mhz and 63C with stock fan curve seems decent. How was the fan noise?

I was seeing complaints about the lower power limit and it only coming with a 3x 8-pin adapter. I'm not the type to flash my BIOS and spend hours finding the max overclock anymore. If I can get a decent OC and a quiet fan curve, that's all I ask for nowadays.

Mine should arrive the 20th. Regret not buying the faster shipping option now.


----------



## J7SC

mirkendargen said:


> I looked at PCB pictures of Gaming OC from Techpowerup and Aorus Master from der8auer's nvlink video. They're definitely the exact same PCB with a couple power stages left blank on the Gaming OC (the Windforce looked like it was also the same PCB with a few more power stages left blank), so they will definitely share block compatibility (already shows with EK). The amount of PCB sharing should be good for block availability overall this gen with each manufacturer only have one PCB it seems.
> 
> Windforce (PCB is a slightly different shape, but all the interfering parts look to be in the same place....except maybe the 12VHPWR connector):
> View attachment 2576078
> 
> 
> Gaming OC:
> View attachment 2576079
> 
> 
> Aorus Master:
> View attachment 2576080


A quick tip... per the TPU review (pic below), I noticed that the 4090 Gaming OC backplate is reminiscent of the one they used on the Gigabyte 6900 XT Gaming OC. Gigabyte actually puts a soft *thermal pad on the back* of the GPU die area... when I got my 6900 Gaming OC, I noticed that the backplate sure got toasty (it's metal, and thus doing its job transferring heat). Adding a huge heatsink on the backplate seemed to make a difference with temps (even more so with fans blowing on the back heatsink).

The thermal pad Gigabyte includes between the back of the GPU die and the backplate seems to make an extra heatsink even more effective - and the really big heatsink even stays put just from the suction of the thermal paste (ie. regarding warranty concerns), though I used drops of crazy glue on the corners (psst, don't tell anyone).


----------



## MIST3RST33Z3

The Asus TUF is HUGE! It didn't fit in my Enthoo Elite... I had to take my distribution block off the mounting points and move it out of the way just to get the thing into the slot, haha. This generation is comical.


----------



## Madness11

Anyone here with a Palit GameRock 4090???


----------



## dante`afk

In my impatience I pulled the trigger earlier on a Strix 4090 for 2800. Now I got the Nvidia Best Buy link and bought an FE for 1720.

Someone wanted me to save 1100 😅


----------



## mirkendargen

J7SC said:


> A quick tip...per TPU review (pic below), I noticed that the 4090 Gaming OC backplate is reminiscent to the one they used on the Gigabyte 6900 XT Gaming OC. Gigabyte actually puts a soft *thermal-pad on the back *of the GPU die area...when I got my 6900 Gaming OC, I noticed that the backplate sure got toasty (its metal, and thus doing its job transferring heat). Adding a huge heatsink on the backplate seemed to make a difference with temps (even more so with fans blowing on the back heatsink).
> 
> The thermal pad Gigabyte includes between the back of the GPU die and backplate seems to make an extra heatsink even more effective - and the really big heatsink even stays put just from the suction of the thermal paste (ie. regarding warranty concerns) though I used drops of crazy glue on the corners (psst, don't tell anyone).
> View attachment 2576081


Good tip! I have a RAM waterblock attached to the backplate of my 3090 with thermal adhesive to cool the backside VRAM and....really don't want to do that again, lol. I might swap the stock thermal pads on the backplate to better ones though. I'm not as worried about some caps getting a bit toasty as I was about VRAM.


----------



## MIST3RST33Z3

dante`afk said:


> in my impatience I pulled the trigger earlier on a strix4090 for 2800. now I got the nvidia bestbuy link and bought a FE for 1720.
> 
> someone wanted me to save 1100 😅


Lol that’s crazy lucky!! Enjoy the extra 1100 and the FE!!


----------



## J7SC

mirkendargen said:


> Good tip! I have a RAM waterblock attached to the backplate of my 3090 with thermal adhesive to cool the backside VRAM and....really don't want to do that again, lol. I might swap the stock thermal pads on the backplate to better ones though. I'm not as worried about some caps getting a bit toasty as I was about VRAM.


...my 'poor' 6900 G-OC got my first-ever thermal putty treatment (spoiler) - probably used a bit much🥴- but putty is so forgiving as it conforms so easily. Not shown is the MX5 I put _on top_ of the remaining thermalright pads. Temps for that install remain superb- and I have some unopened thermal putty in the fridge...



Spoiler


----------



## Gandyman

Anyone with some insight here for me, went from 3090 strix oc to 4090 strix oc, and my pci-e bus went from 16x4.0 to 8x4.0

I reseated the card twice and checked the bios, the z690 apex lists it as an 8x device in the pcie_16_1 slot. 

any ideas? or is this normal for 4090?
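For anyone chasing the same issue, you can double-check the live link without rebooting into the BIOS via `nvidia-smi`'s PCIe query fields (`pcie.link.gen.current` and `pcie.link.width.current` are real query options). The little parser below is just a hypothetical helper sketch around its CSV output:

```python
import subprocess

# nvidia-smi command whose output looks like "4, 16" per GPU.
QUERY = [
    "nvidia-smi",
    "--query-gpu=pcie.link.gen.current,pcie.link.width.current",
    "--format=csv,noheader",
]

def parse_link(csv_line):
    """Parse one CSV line such as '4, 8' into (generation, width)."""
    gen, width = (int(field.strip()) for field in csv_line.split(","))
    return gen, width

def current_links():
    """Run nvidia-smi and return [(gen, width), ...], one entry per GPU."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    return [parse_link(line) for line in out.stdout.strip().splitlines()]
```

A card seated correctly in the top x16 slot should report `(4, 16)`; note the link can also train down at idle on some boards, so check under load.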


----------



## bmagnien

mirkendargen said:


> I looked at PCB pictures of Gaming OC from Techpowerup and Aorus Master from der8auer's nvlink video. They're definitely the exact same PCB with a couple power stages left blank on the Gaming OC (the Windforce looked like it was also the same PCB with a few more power stages left blank), so they will definitely share block compatibility (already shows with EK). The amount of PCB sharing should be good for block availability overall this gen with each manufacturer only having one PCB it seems.
> 
> Windforce (now that I look at this one more, the screw hole pattern is enough different that you'd need to leave enough screws off that contact might not be great across the non-core part of a Gaming OC/Aorus Master block):
> View attachment 2576078
> 
> 
> Gaming OC:
> View attachment 2576079
> 
> 
> Aorus Master:
> View attachment 2576080


EK's block fits the Gaming OC and Master, and specifically excludes the Windforce.


----------



## Jordyn

Gandyman said:


> Anyone with some insight here for me, went from 3090 strix oc to 4090 strix oc, and my pci-e bus went from 16x4.0 to 8x4.0
> 
> I reseated the card twice and checked the bios, the z690 apex lists it as an 8x device in the pcie_16_1 slot.
> 
> any ideas? or is this normal for 4090?


Not normal. Def should be 16x4.0.










Maybe check any power saving modes?


----------



## Taggen86

Suprim X owners, are you guys getting 108 or 116 as the max power slider in Afterburner using the gaming BIOS? Mine only gets to 108 percent (exactly in line with the TechPowerUp BIOS information for the gaming profile). It is only the default silent profile that gets to plus 16. Why does the silent profile have a higher power limit percentage? If both BIOSes are listed as max 520 W according to TechPowerUp, then 108% = 520 W on the gaming BIOS and 116% = 520 W on the silent BIOS?
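The slider percentages are consistent with the two BIOSes having different 100% (base) board-power values rather than different caps. A quick sanity check on the numbers from the post (520 W cap, 108%/116% sliders):

```python
def base_power(max_watts, max_percent):
    """Infer the 100% board-power value implied by the slider cap."""
    return max_watts / (max_percent / 100.0)

# Gaming BIOS: a 108% cap on 520 W implies a ~481 W base.
# Silent BIOS: a 116% cap on 520 W implies a ~448 W base.
gaming_base = base_power(520, 108)
silent_base = base_power(520, 116)
```

Both BIOSes would then top out at the same 520 W; the silent one just starts from a lower 100% value, so its slider has further to go.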


----------



## MIST3RST33Z3

This Asus TUF OC is awesome compared to the Zotac. The fans are dead silent. It clocks higher, more reliably. I'm doing +300 on the core, which gives me 3075 MHz. Pulling 540 W in TSE and hitting the voltage limit now. So glad I swapped the Zotac out.

Plus, there are already water blocks available for preorder for this card. The temps are already so cool, I doubt water will give any performance improvements, likely just acoustic improvements.


----------



## mirkendargen

Taggen86 said:


> Suprim x owners, are you guys getting 108 or 116 as the max power slider in afterburner using the gaming bios? Mine only gets to 108 percent (exactly in line with the techpowerup bios information for the gaming profile). It is only the default silent profile that gets to plus 16. Why does the silent profile have a higher power limit? As both bioses are listed as max 520w according to techpowerup then 108=520w in the gaming bios and 116=520 in the silent bios?


Is the gaming BIOS 480w base and the silent BIOS 450w base?



MIST3RST33Z3 said:


> This Asus TUF OC is awesome compared to the Zotac. The fans are dead silent. It clocks higher, more reliably. I’m doing +300 on the core which gives me 3075mhz. Pulling 540w in TSE and hitting the voltage limit now. So glad I swapped the Zotac out.
> 
> Plus, there are already water blocks available for preorder for this card. The temps are already so cool, I doubt water will give any performance improvements, likely just acoustic improvements.


https://www.aliexpress.us/item/3256804654341896.html You can "order", not just "preorder", heh.


----------



## Taggen86

mirkendargen said:


> Is the gaming BIOS 480w base and the silent bios 450w base?


Yes it seems so


----------



## MIST3RST33Z3

mirkendargen said:


> Is the gaming BIOS 480w base and the silent BIOS 450w base?
> 
> 
> 
> https://www.aliexpress.us/item/3256804654341896.html You can "order", not just "preorder", heh.


I don’t like Bykski, their 3090 FE active backplate block had garbage QC and their blocks aren’t as nice as EK or Optimus


----------



## Nizzen

MIST3RST33Z3 said:


> I don’t like Bykski, their 3090 FE active backplate block had garbage QC and their blocks aren’t as nice as EK or Optimus


It's out very fast with new GPUs, performance is great, and it's cheap.
QC is garbage and often there is no "how to install".
Performance per $ and how fast it's available make Bykski an overclocker's favourite. Who wants to wait months to overclock the new GPU "to the limit"?
For long-term use, there are better and more expensive options.


----------



## Clukos

My best scores so far on the FE









I scored 30 364 in Time Spy


AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com












I scored 28 606 in Port Royal


AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com












I scored 10 923 in Speed Way


AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com












I scored 15 077 in Time Spy Extreme


AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## mirkendargen

MIST3RST33Z3 said:


> I don’t like Bykski, their 3090 FE active backplate block had garbage QC and their blocks aren’t as nice as EK or Optimus


Optimus will sell you a block that's better for 4x as much in a year. EK will sell you a block that's no better, possibly worse right now for twice as much. Given those choices, I'd take the Bykski, lol. Alphacool will also probably have a similar block at a slightly higher price.

I've had plating fail on one Bykski block, I emailed them and they sent me a new one for free no questions asked, the new one has been solid for 18 months. And I have a Bykski CPU block I've been using for 4 years now. I went from a Threadripper to AM5 and could keep using it because it has mounting holes for both. It's so massive the fin area is larger than an AM5 CPU lol. I did have to dremel 1mm of acrylic off the top and bottom to get it to fit between the VRM and m.2 heatsinks on my motherboard, but it works!

That's more than good enough for me.


----------



## marti69

RaMsiTo said:


> View attachment 2575954
> 
> 
> 
> 
> View attachment 2575956
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Result not found
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com


How did you score past 28k, bro? I have the same GPU (Suprim X); even with a 3075 MHz OC my best is 27931 on Port Royal. Did you do any tweaks to the driver to get that?


----------



## marti69

Clukos said:


> My best scores so far on the FE
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 30 364 in Time Spy
> 
> 
> AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 606 in Port Royal
> 
> 
> AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 10 923 in Speed Way
> 
> 
> AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 15 077 in Time Spy Extreme
> 
> 
> AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> View attachment 2576095
> 
> View attachment 2576096
> 
> View attachment 2576097
> 
> View attachment 2576098


How do you guys get past 28k on Port Royal? I'm running 3075 MHz on a Suprim X but my best is 27935.


----------



## Gandyman

Jordyn said:


> Not normal. Def should be 16x4.0.
> 
> View attachment 2576086
> 
> 
> Maybe check any power saving modes?


Thanks for following up. Just tried a 3070 and my old 3090 back in and it says 16x but the 4090 says 8x (even in bios) I even tried a riser cable in case I wasn't pushing the massive card far enough into the socket. Always says 8x with the 4090. 

Sigh .. Guess I have to RMA


----------



## kx11

I don't know why PR won't finish 










I scored 15 978 in Time Spy Extreme


Intel Core i9-12900KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com














I scored 10 541 in Speed Way


Intel Core i9-12900KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## warrior-kid

Got a sealed PNY Verto 4090 and I'm still unconvinced I should open it and actually give it a spin. It's a 2520 MHz clocked model; it supposedly has a 4x1 cable included along with an extra backplate (I haven't opened the box so can't confirm). Anyone here have one? There are no reviews and no idea what the max power is. What do you think guys, should I return it and wait for a higher-end option? Other thoughts?


----------



## yzonker

Interesting, I seem to be coming up a little short on the PL with my TUF OC. Only about 560w?


----------



## Gandyman

warrior-kid said:


> Got a sealed PNY Verto 4090 and still unconvinced I should open it and actually give it a spin. It is a 2520 MHz clocked model, supposedly has a 4x1 cable included along with an extra backplate (not opened the box so can't confirm). Anyone here has one, no reviews, no idea what the max power is, what do you think guys, should I return it and wait for a higher end option? Other thoughts?


Power limits do nothing on the 4090 anyway. I get 3025 at 70% or 130% power limit.

(aka I would be considering cooler strength and noise levels over power limit)


----------



## morph.

heptilion said:


> what programme are you using to change colours on the 4090?


The dreaded armoury crate


----------



## Benjit

Is anyone applying an undervolt to get the best power/performance? I've managed 1850/1000mV with +1000 on RAM.


----------



## yzonker

Anybody know if there is a way to increase the scale for the AB VF curve? It seems capped at 3000 MHz. I tried PX1 and GPU Tweak 3, but they didn't seem to work either. PX1 had a lower max scale, and for some reason GPU Tweak 3 was capped at 2910 MHz (the actual running clock speed during a PR run) no matter what I did.


----------



## Azulath

parky.fp said:


> Anyone else with an MSI X570-A pro experiencing an issue getting a display signal? VGA light is red with an ASUS TUF 4090 - system boots fine with a 3090.
> 
> Bios updated to latest,1000w PSU


Similar to you, I have issues with my Gigabyte X570 Aorus Master coupled with a MSI RTX 4090 Suprim X. My 3090 Strix works without any issues, but when I put my 4090 into the slot I don't get any display output, the fans on the card do not spin up and the lights of the card do not go on.

I have tested the card with both a 1KW PSU and a 1.6KW PSU and I have used both the 12VHPWR (12+4) dongle that came with the card and the 12VHPWR (12+2) cable that came with the PSU. (PSU cables are interchangeable given that both are Seasonic Prime.)

Just to be certain, I have also updated my Mainboard's BIOS to the latest version (both, stable and beta) but the problem persists. Furthermore, I have tried a CMOS reset for good measure but to no avail...

My problem seems similar to Jay's, but he mentions neither whether the fans of the card spin up nor whether the RGB stays off entirely, which is why I'm not entirely sure about this.





Any help or input would be appreciated!


----------



## cletus-cassidy

Nizzen said:


> It's out very fast with new gpu's, performance is great, it's cheap.
> QC is garbage and often there is no "how to install"
> Performance per $ and how fast it's avaiable makes Bykski to a overclockers favourite. Who want to wait for months to overclocking the new gpu "to the limit"
> For long therm use, there are better and more expensive options.


This is the way. Buy the Bykski now and replace with Optimus later.


----------



## LunaP

Azulath said:


> Spoiler


What the... in his video he shows 3DMark scores with 3090 Tis beating the 4090 due to SLI. I thought SLI was proven not to work with the 3090 series for anything because they removed the option for it (GN did a video and saw no gains, IIRC). Did something change? The whole reason I skipped Ampere was that I was already on SLI and mine beat out the 3090 Ti, lmao. REEEEEEE, can someone share input?


That aside, it looks like more and more threads of scalpers are boasting that the GFE program gives them access to tons of cards; hoping something gets done. I wish Micro Center would ship....

Also, could we get a list of cards that are block supported, as well as which block supports what, for the OP? That'd be helpful.


----------



## Nizzen

LunaP said:


> What the....in his video he shows 3dbenchmark scores w/ 3090ti's beating the 4090 due to SLI, I thought SLI was proven not to work w/ the 3090 series for anything cuz they removed the option for it ( Cuz GN did a vid and saw no gains iirc ) did something change? Whole reason I skipped Ampere was due to that since I was already on SLI and mine beat out the 3090ti lmao REEEEEEE can someone share input?
> 
> 
> That aside looks like more and more threads of scalpers boasting the GFE program giving them access to tons of cards, hoping something gets done. I wish microcenter would ship....
> 
> Also could we get a list of cards that are block supported as well as which supports what for the OP ? That'd be helpful.


SLI with NVLink works with the 3090. 3DMark is the most supported "game", LoL.


----------



## LuckyImperial

BTK said:


> I’ve been trying to buy one I’m just trying to figure out where the to hell to buy one of these things


Join the very large club. I'm afraid it's a paper launch, again, despite Nvidia's assurances it wouldn't be.

Very frustrating.


----------



## yzonker

Not too bad. Looks like I'm just a little off from the air cooled results if you don't count the 29xxx someone posted earlier. lol Not going to match that for sure. 










Result not found







www.3dmark.com





Looks like I'd need to be able to complete +300 on the core, based on the clocks shown in 3DMark. My run was +240/+1800.

Here's the one I'm referring to (not my run),









I scored 29 509 in Port Royal


Intel Core i9-9900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## RaMsiTo

yzonker said:


> Not too bad. Looks like I'm just a little off from the air cooled results if you don't count the 29xxx someone posted earlier. lol Not going to match that for sure.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Result not found
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> I'd need to be able to complete +300 core looks like based on the clocks shown in 3DMark. My run was +240/+1800.
> 
> Here's the one I'm referring to (not my run),
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 509 in Port Royal
> 
> 
> Intel Core i9-9900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com



It's mine, after that I got another one somewhat lower.









Result not found







www.3dmark.com


----------



## yzonker

RaMsiTo said:


> It's mine, after that I got another one somewhat lower.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Result not found
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com


I think you definitely got one of the better cores. I'm just happy I got good mem this time at least. Mem sucked on my 3090. 

Are you just using the core offset slider? I'm still looking for a way to edit the VF curve above 3GHz like I posted above.


----------



## inedenimadam

warrior-kid said:


> Got a sealed PNY Verto 4090 and still unconvinced I should open it and actually give it a spin. It is a 2520 MHz clocked model, supposedly has a 4x1 cable included along with an extra backplate (not opened the box so can't confirm). Anyone here has one, no reviews, no idea what the max power is, what do you think guys, should I return it and wait for a higher end option? Other thoughts?


What we do know about 4090s is that the 450W and 600W cards perform almost identical for most gaming workloads. We also know that Nvidia has (afaik) almost always produced cards that we can cross flash with higher TDP bios. We are voltage limited more so than power limited on 4090s. PNY generally doesn’t make bad products. 
With those things in mind, if you are getting a card to play games and maybe do a little benching, then I would open that sucker up and game on. If you are getting it for benching, then scalp it and get one of the cards that are known to have the high TDP, because it’s a toss up with no reviews.


----------



## marcetps

Hi guys, anyone with a 4090 FE and a Lian Li Dynamic XL case? Is the 12vhpwr cable hitting the glass? Anyone using the gpu vertical mounted (thinking about doing this in case the 12vhpwr cable forces the glass)?


----------



## RaMsiTo

yzonker said:


> I think you definitely got one of the better cores. I'm just happy I got good mem this time at least. Mem sucked on my 3090.
> 
> Are you just using the core offset slider? I'm still looking for a way to edit the VF curve above 3GHz like I posted above.


+165 on the core. I haven't tried the curve; in the overlay I manage to hit 3165 MHz (+210), but I'm very limited by the MSI bios whenever the test draws more than 520 W.


----------



## yzonker

RaMsiTo said:


> +165 in the core, I haven't tried to use the curve, in overlay I manage to go to 3165 mhz(+210), I am very limited if the test consumes more than 520w by the msi bios


That's interesting. That card must have a lot more offset out of the box as my TUF OC is only 3030-3045 at +240. Like I posted earlier, my card doesn't seem to really make it to 600w. I saw 2 brief blips of power limit during TSE GT2. Nothing else other than Metro EE and Quake 2 RTX get it there (and Kombustor of course). I'll have to try flashing a few different bios whenever we get an updated NVFlash.


----------



## RaMsiTo

yzonker said:


> That's interesting. That card must have a lot more offset out of the box as my TUF OC is only 3030-3045 at +240. Like I posted earlier, my card doesn't seem to really make it to 600w. I saw 2 brief blips of power limit during TSE GT2. Nothing else other than Metro EE and Quake 2 RTX get it there (and Kombustor of course). I'll have to try flashing a few different bios whenever we get an updated NVFlash.


Curve by default 2940 mhz.


----------



## yzonker

RaMsiTo said:


> Curve by default 2940 mhz.
> View attachment 2576138


Wow,


----------



## warrior-kid

inedenimadam said:


> What we do know about 4090s is that the 450W and 600W cards perform almost identical for most gaming workloads. We also know that Nvidia has (afaik) almost always produced cards that we can cross flash with higher TDP bios. We are voltage limited more so than power limited on 4090s. PNY generally doesn’t make bad products.
> With those things in mind, if you are getting a card to play games and maybe do a little benching, then I would open that sucker up and game on. If you are getting it for benching, then scalp it and get one of the cards that are known to have the high TDP, because it’s a toss up with no reviews.


My main interest is to get to 8K gaming as much as possible (to use my 8K monitor fully). Currently, Assassin's Creed Odyssey is one that works really well on my 3090, but I'd like 8K to be playable in more titles. Decided to ship back the card in the end; scalping is not for me. The Suprim X may be the next target.


----------



## nycgtr

LuckyImperial said:


> Join the very large club. I'm afraid it's a paper launch, again, despite Nvidia's assurances it wouldn't be.
> 
> Very frustrating.


If you are in the US this is beyond false. I have picked up 3 fairly easily in the past 2-3 days, and that's while being model-specific as well. I passed on Zotacs, the Trio, Suprim AIO, Aorus Xtreme, and Master, all of which I had in my cart for checkout. If you use any Twitter stock alerts, they come in and out throughout the day. The FE, on the other hand, is a unicorn.


----------



## dante`afk




----------



## J7SC

nycgtr said:


> If you are in the US this is beyond false. I have picked up 3 fairly easily in the past 2-3 days. If you use any twitter stock alerts, they come in and out thru out the day. The FE on the other hand is an unicorn.


...was in a store yesterday (W.Canada) which had over ten of these on hand. Now I need to figure out how to integrate this thing. Horizontal green lines are the same width. 











dante`afk said:


> View attachment 2576143


I suspected that Asus might try to take over the 'KingPin' special positioning (including on price). Who knows, maybe we'll see a w-cooled Matrix w/XOC vbios, yeah! 4090 Ti and RDNA3 remain the wildcards, though...


----------



## nycgtr

dante`afk said:


> View attachment 2576143


I find this a bit hard to believe tbh. While a special "Matrix" SKU would not surprise me, previous Matrix versions were no different from the Strix SKU in die quality.


----------



## Carillo

Is zhrooms alive ? Out hunting 4090's ?


----------



## themad

marcetps said:


> Hi guys, anyone with a 4090 FE and a Lian Li Dynamic XL case? Is the 12vhpwr cable hitting the glass? Anyone using the gpu vertical mounted (thinking about doing this in case the 12vhpwr cable forces the glass)?


Not sure if you can see much, but this is a Suprim X 3090 Ti with the 12vhpwr cable from the Asus Thor 1200 W PSU. There is about 2 cm to the glass. It seems both cards are about the same in that dimension (height?): 140 mm.
I was wondering the same and it seems it should be ok.


Spoiler


----------



## marcetps

themad said:


> Not sure if you can see anything but this is a Suprim X 3090 Ti with the 12vhpwr cable from the Asus Thor 1200w psu. There is about 2cm to the glass. It seems both cards are about the same in that dimension (height?) 140mm.
> I was wondering the same and it seems it should be ok.
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2576146


Thank you! In that case it will not fit for sure; the connector on your 3090 Ti sits inside the card (at least 1 cm lower), while on the 4090 it is level with the edge of the card. I love my current case but maybe I'll need to get an Enthoo 719.


----------



## themad

marcetps said:


> Thank you! In that case it will nto fit for sure, the connector on your 3090ti is inside the card (at least 1cm less) and on the 4090 is i nthe same level of the card. I love my current case but maybe will need to get a 719 enthoo


Oh, really? I gotta look into that in more detail then in case I wanna go for a 4090.
Thanks for raising that.


----------



## LunaP

Nizzen said:


> Sli with nvlink is working with 3090. 3dmark is the most supported "game" LoL


Ah ok, if it's just that then ok lol, moot point then. I was gonna be upset if not.


----------



## ZealotKi11er

Looking at some of your guys OC, my Gigabyte Gaming OC at only +150 looks sad. 
Also no way I am dropping $280 USD + TAX + Shipping on a WB for 0 gains. I could buy a different case and more for that much money.


----------



## dalip1

Anyone know where I can find a review of the PNY 4090 brand? Can't find anything online and never heard about this brand. I'm kinda scared tbh


----------



## Nizzen

ZealotKi11er said:


> Looking at some of your guys OC, my Gigabyte Gaming OC at only +150 looks sad.
> Also no way I am dropping $280 USD + TAX + Shipping on a WB for 0 gains. I could buy a different case and more for that much money.


Cool story bro


----------



## akgis

The MSI Suprim Liquids are popping up in stock. I was waiting for the Strix, which apparently drops next week, but the AIO cooling seems fun. Is it worth it? The price isn't much more than a normal Suprim X, and it's below the Strix.


----------



## warrior-kid

dalip1 said:


> Anyone know where I can find a review of the PNY 4090 brand? Can't find anything online and never heard about this brand. I'm kinda scared tbh


It is a very reputable company with a large presence, especially in the US. Having said that, I've just packed up my sealed PNY Verto card to be returned; I'll go for a Suprim X replacement.


----------



## ArcticZero

dalip1 said:


> Anyone know where I can find a review of the PNY 4090 brand? Can't find anything online and never heard about this brand. I'm kinda scared tbh


PNY are a very reputable brand, and only one of a few AIB's who make Quadros. My current GPU is a 3090 which I've shunt modded, put a block on and mined and gamed with since launch at ~500+w. I realize that's a small sample size, but still. No issues whatsoever.


----------



## yzonker

I did get the GPU Tweak 3 curve editor to work.


----------



## LuckyImperial

nycgtr said:


> If you are in the US this is beyond false. I have picked up 3 fairly easily in the past 2-3 days, this with being model specific as well. I passed on zotacs, the trio, surpim aio, aorus xtreme, master all of which I had in my cart for checkout. If you use any twitter stock alerts, they come in and out thru out the day. The FE on the other hand is an unicorn.


I have stock alerts set up for a MSI Suprim Liquid X at Best Buy, NewEgg, my local PC store (Central Computer) and B&H and I've not received a single ping.

I was able to "Purchase" one on Central Computers web store on launch day (5 minutes after launch) and received this message after purchase: 
"
We are sorry to say, due to very high demand for the RTX 4090, we have sold out of all models, and unfortunately are unable to process your order. We are extremely sorry. Currently, we are trying to get ETAs from our distributors on the next shipment and will provide updates as soon as possible. Please let us know if you want to continue the order or want to cancel. Thank you again for your understanding.

-Central Computers Team"

I watched Best Buy and NewEgg sell out instantly.

Where do you live? I live in the Bay Area and these things just don't exist, and Nvidia is down the street.


----------



## Bilco

dante`afk said:


> View attachment 2576143


I hope this is complete BS... The Strix is supposed to be a premo card. Where do the TUF cards fit in? If they are using lowly binned chips in the Strix I probably won't bother with their **** in the future.

Also these cards seem to collectively run fairly cool. Do we have any data on the uplift of air cooling vs water? I'm thinking this generation might not be worth the hassle of adding a block.


----------



## EastCoast

Bilco said:


> I hope this is complete BS.... Strix is supposed to be a premo card.. Where's the TUF cards fit in? If they are using lowly binned chips in the strix I probably won't bother with their **** in the future.


LOL, there are absolutely no binned GPU dies for AIBs.


----------



## Bilco

EastCoast said:


> LOL, there is absolutely no binned gpu dies for AIBs.


Can't AIBs bin on their side, and haven't they? The KPN cards being an example, no?


----------



## Hulk1988

Does anybody know where I can get the new MSI Afterburner version with the higher voltage unlock?

Has anyone tried it already and gotten more points in benchmarks?


----------



## RaMsiTo

Hulk1988 said:


> Does anybody know where I can get the new MSI afterburner version with the higher voltage unlocker?
> 
> Did someone tried it already and got more points in Benchmark?











MSI Afterburner 4.6.5 (Beta2) Download


MSI Afterburner 4.6.2 Download - Today we release an updated this Stable revision of Afterburner, this application successfully secured the leading position on graphics card utilities.




www.guru3d.com


----------



## Baasha

LunaP said:


> What the....in his video he shows 3dbenchmark scores w/ 3090ti's beating the 4090 due to SLI, I thought SLI was proven not to work w/ the 3090 series for anything cuz they removed the option for it ( Cuz GN did a vid and saw no gains iirc ) did something change? Whole reason I skipped Ampere was due to that since I was already on SLI and mine beat out the 3090ti lmao REEEEEEE can someone share input?
> 
> 
> That aside looks like more and more threads of scalpers boasting the GFE program giving them access to tons of cards, hoping something gets done. I wish microcenter would ship....
> 
> Also could we get a list of cards that are block supported as well as which supports what for the OP ? That'd be helpful.


Did someone say SLI? 









nycgtr said:


> If you are in the US this is beyond false. I have picked up 3 fairly easily in the past 2-3 days, this with being model specific as well. I passed on zotacs, the trio, surpim aio, aorus xtreme, master all of which I had in my cart for checkout. If you use any twitter stock alerts, they come in and out thru out the day. The FE on the other hand is an unicorn.


Well if you're trying to get the Asus RoG Strix 4090 OC and the 4090 FE, the above post is patently false. The Strix has been in stock for ~ 3 mins on Wednesday at launch on NewEgg and NEVER came back in stock. Yes, there were several at Microcenter so those living close to one lucked out. For those of us trying to order online, that card has NOT been in stock since Wednesday morning.

Further, the GFE 'email notification' to buy the 4090 FE also seems like hogwash as I have 4x 3090 Ti on 3 separate rigs and NONE of the rigs show me the notification to buy the 4090 FE. 

Seems like those who bought other brands like Gigabyte, Zotac, and MSI didn't have problems as you mentioned - I too had the opportunity to buy those but chose not to since I want the Strix OC and the FE.


----------



## yzonker

Hulk1988 said:


> Does anybody know where I can get the new MSI afterburner version with the higher voltage unlocker?
> 
> Did someone tried it already and got more points in Benchmark?


This one. 









MSI Afterburner 4.6.5 (Beta2) Download


MSI Afterburner 4.6.2 Download - Today we release an updated this Stable revision of Afterburner, this application successfully secured the leading position on graphics card utilities.




www.guru3d.com


----------



## bmagnien

This looks gorgeous, can’t wait to play on OLED with a new 4090 whenever I can buy one


----------



## inedenimadam

yzonker said:


> Anybody know if there is a way to increase the scale for the AB VF curve. Seems topped at 3000mhz. I tried PX1 and GPU Tweak 3, but they didn't seem to work either. PX1 had a lower max scale and for some reason GPU Tweak 3 was capped at 2910mhz (actual running clock speed during a PR run) no matter what I did.
> 
> View attachment 2576126


Grab a lower point in the curve and shift click the line up?


----------



## bmagnien

Are the Asus Strix/TUF the only cards with 2 HDMI 2.1 ports? That could be the deciding factor for folks with multiple monitors/TVs.


----------



## Shaded War

bmagnien said:


> Is Asus strix/TUF the only cards with 2 hdmi 2.1 ports? That could be the deciding factor for folks with multiple monitor tvs


Yes, but I believe you are still limited to 4 displays. Not much advantage over other cards with a DisplayPort adapter.


----------



## bmagnien

Shaded War said:


> Yes, but I believe you are still limited to 4 displays. Not much advantage over other cards with a DisplayPort adapter.


Some TVs only take hdmi 2.1 to get 4K 120. Some of the most popular lg oleds are like this.


----------



## gamerMwM

Baasha said:


> Well if you're trying to get the Asus RoG Strix 4090 OC and the 4090 FE, the above post is patently false. The Strix has been in stock for ~ 3 mins on Wednesday at launch on NewEgg and NEVER came back in stock. Yes, there were several at Microcenter so those living close to one lucked out. For those of us trying to order online, that card has NOT been in stock since Wednesday morning.
> 
> Further, the GFE 'email notification' to buy the 4090 FE also seems like hogwash as I have 4x 3090 Ti on 3 separate rigs and NONE of the rigs show me the notification to buy the 4090 FE.
> 
> Seems like those who bought other brands like Gigabyte, Zotac, and MSI didn't have problems as you mentioned - I too had the opportunity to buy those but chose not to since I want the Strix OC and the FE.


You're right. If you've been trying to land a Strix online, nothing has dropped since launch. I think we'll all have more opportunity at the end of next week.


----------



## Rbk_3

Bilco said:


> Can't AIB's bin on their side and haven't they? The KPN cards being an example, no?


The kingpin 3090s definitely were not binned


----------



## Xavier233

Some MSI 4090 Trios on YouTube are having coil whine. Looks like it's really random this time, not tied to a specific brand or model.


----------



## yzonker

inedenimadam said:


> Grab a lower point in the curve and shift click the line up?


When you select a point a window pops up with up/down arrows. It's nice in that it moves in proper 15mhz steps. I think there's a way to move a group also, but I haven't tried it.
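Since the curve editor moves points in those 15 MHz steps, any target clock you have in mind effectively gets snapped to the nearest valid bin. A rough Python sketch of that arithmetic (the 15 MHz step size is from this thread; the helper names are made up for illustration):

```python
# NVIDIA cores clock in discrete 15 MHz steps, so a VF-curve target
# gets snapped to the nearest valid bin before it is applied.
STEP_MHZ = 15

def snap_clock(target_mhz: int) -> int:
    """Round a requested core clock to the nearest 15 MHz step."""
    return round(target_mhz / STEP_MHZ) * STEP_MHZ

def offset_for(stock_mhz: int, target_mhz: int) -> int:
    """Offset to enter in the tool to reach target from a stock curve point."""
    return snap_clock(target_mhz) - stock_mhz

print(snap_clock(3100))        # 3100 is not a valid step; snaps to 3105
print(offset_for(2940, 3165))  # +225 from a 2940 MHz stock point
```

Just a back-of-the-envelope helper; the actual curve tools do this snapping for you.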


----------



## Carillo

Bilco said:


> Can't AIB's bin on their side and haven't they? The KPN cards being an example, no?


Asus, like all other AIBs, is a business with one main goal: to make money. Why on earth would they waste hours binning chips when we buy their product anyway?


----------



## xcx xcxvgyt

Has anyone tried to flash a bios with the new nvflash v5.763?


----------



## WayWayUp

what concerns me even more is that they would even make a 4090 kingpin

That means the 4090ti is either a full year + away or may never even see the market at all


----------



## yt93900

Probably, still I think the 4090 turned out to be way more efficient than predicted and there is some room left for the full "Ti" chip, even aircooled.


----------



## yzonker

xcx xcxvgyt said:


> Is there anyone tried to flash bios with new nvflash v5.763 ?


People have reported it does not work.


----------



## 8472

Shaded War said:


> Yes, but I believe you are still limited to 4 displays. Not much advantage over other cards with a DisplayPort adapter.


Those displayport to hdmi 2.1 adapters usually have trouble with something, such as lack of support for VRR with Nvidia GPUs. 

The extra HDMI port was the sole reason that I went with Asus.


----------



## WayWayUp

could you even imagine how much asus would ask for a 4090ti kingpin? if the regular strix 4090 is $2k?

2.8k? rofl


----------



## Xavier233

WayWayUp said:


> could you even imagine how much asus would ask for a 4090ti kingpin? if the regular strix 4090 is $2k?
> 
> 2.8k? rofl


The consensus right now is that those $1900-$2000 cards are not worth it in terms of additional (if any?) performance or cooling. At least a 4090 Ti would have more performance (CUDA cores), which might justify its price increase. Basically, get the cheapest 4090 you can, or wait for a 4090 Ti, but expect to pay $2000 MSRP for an FE 4090 Ti.


----------



## alasdairvfr

Edit: I bumped it up a little higher and broke the top 100, for whatever that's worth, on a new bench.

For some reason I can't get great PR scores, but I can get a decent bench out of Speed Way; it goes over 3,100 MHz! +270/+1500 and still testing.

I'm not sure what I'm doing wrong for the PR score though. I can do +210/+1200, but if I crank up memory I have to dial back core, and vice versa. 210/1200 seems to be my peak and my limit.

Gigabyte Gaming OC card, on air

NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)












NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)


----------



## Azazil1190

Hey guys!
I have to choose between the Gigabyte Gaming OC and the TUF non-OC, and I'm confused about which to pick.
The TUF is the more elegant product, but the Giga on the other hand has very good temps and 20 power phases, and it can draw 600 W if you set the PL to 133.
Does the PL matter this generation?
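The power-limit slider is just a percentage of the card's default TDP, which is how "133" on a 450 W card works out to roughly the 600 W ceiling being discussed. A trivial sketch of that math (function name is made up):

```python
# PL sliders are a percentage of the board's default TDP, so 133% on a
# 450 W card lands at roughly the 600 W connector ceiling.
def board_power_w(default_tdp_w: float, pl_percent: float) -> float:
    """Board power target implied by a power-limit slider setting."""
    return default_tdp_w * pl_percent / 100.0

print(board_power_w(450, 133))  # ~598.5 W
print(board_power_w(450, 100))  # stock 450 W
```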


----------



## bmagnien

Azazil1190 said:


> Hey guys!
> I have to choose between gigabyte gaming oc and tuf non oc. Im so confused which to choose.
> Tuf is more elegant product but giga on the other hand has very good temps and 20power phases and it can draw 600w if you set pl at 133 .
> Does the pl matters in that generation?


TUF also has 600 out of the box, both oc and non oc


----------



## alasdairvfr

Azazil1190 said:


> Hey guys!
> I have to choose between gigabyte gaming oc and tuf non oc. Im so confused which to choose.
> Tuf is more elegant product but giga on the other hand has very good temps and 20power phases and it can draw 600w if you set pl at 133 .
> Does the pl matters in that generation?


A lot of ppl on here think the Gigabyte Gaming OC is one of the better AIB cards this time around, especially considering it's much cheaper than the Suprim X/Strix.

If the price is on par I'd go for the Gigabyte 100%. Also, I'm not sure if the non-OC TUF card has a 450 W or 600 W bios, but the Gigabyte is 600.


----------



## RetroWave78

nycgtr said:


> If you are in the US this is beyond false. I have picked up 3 fairly easily in the past 2-3 days, this with being model specific as well. I passed on zotacs, the trio, surpim aio, aorus xtreme, master all of which I had in my cart for checkout. If you use any twitter stock alerts, they come in and out thru out the day. The FE on the other hand is an unicorn.


Where? I've been looking roughly 3 times a day (Newegg, Best Buy, B&H Photo, Craigslist and eBay) and the only ones I can find are on CL and eBay, scalped for $3k. I'd prefer the FE but will settle for a TUF at this point now that a water block has been announced. I couldn't care less about the cooler; ultimately it's going under water.

Point is, none of us can find these in stock anywhere, ever, but you've passed up a bunch of cards. Where are you looking that we aren't?

Edit: 

Just checked Micro Center and all of the 4090 cards are listed as out of stock at every location.

Paper launch. 100%.


----------



## Azazil1190

alasdairvfr said:


> A lot of ppl on here thing the Gigabyte Gaming OC is one of the better AIB cards this time around... not only that but considering it's much cheaper than the SuprimX/Strix.
> 
> If the price is on par I'd go for the Gigabyte 100%. Also, I'm not sure if the non OC TUF card has a 450w or 600w bios but the Gigabyte is 600.


Yeah, the Giga Gaming is one of the best this year at cooling, but it's ugly and all plastic, though it can draw 600 W. The TUF has a premium look with that metal but runs a bit hotter, and of course I'm searching the net for info on the TUF's exact power draw. Another thing I noticed is that the Asus bioses seem to give more fps than the Giga's, but maybe that's just my impression.


----------



## Bilco

Carillo said:


> Asus like all other AIB`s is a business with one main goal, to make money. Why on earth would they waste hours binning chips when we buy their product anyway?


A potential even higher mark up and profit?


----------



## derthballs

alasdairvfr said:


> Edit i bumped up a little higher broke top 100 for whatever thats worth on a new bench
> 
> For some reason I can't get great PR scores, but I can get a decent bench out of SpeedWay, goes over 3,100 mhz! +270/+1500 and still testing
> 
> I'm not sure what I'm doing wrong for the PR score though. I can do +210/+1200 but if i crank up memory i have to dial back core, vs versa. 210/1200 seems to be my peak and my limit
> 
> Gigabyte Gaming OC card, on air
> 
> NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)
> 
> View attachment 2576159
> 
> 
> 
> 
> NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)
> View attachment 2576158


I get the same PR Score with undervolt at 2850 + 1000 on memory.


----------



## bmagnien

Azazil1190 said:


> Yeap the giga gaming is one of the best this year at the cooling but its ungly and its all plastic but it can draw 600w. The tuf has a premium look with that metal but its a bit hotter and of course im searching the net to found infows about the clear draw watts of tuf.Another thing that i noticed is that asus gpu bioses gives more fps performance vs the giga ,maybe its my idea


No need to search the net the answer is back a few pages in this thread


----------



## Carillo

Bilco said:


> A potential even higher mark up and profit?


That's total BS. Yes, the top models have stronger (overkill) power delivery, optimized bioses, and more metal instead of plastic on the coolers, but silicon is random, always has been. I have tested over 30 Ampere GPUs; the best chip I found was an MSI Ventus 3X. Turing: HP OEM. Pascal: Gigabyte Windforce. Why would they start doing that when GPUs sell better than ever?


----------



## RetroWave78

"Reduce sell in to let channel inventory correct"






Paper launch, artificial scarcity 100% confirmed.

Think about it: there are maybe 10 FEs on eBay, all going for $3k, and no one in this thread managed to get one, so maybe Nvidia sold 100-500 on launch day globally?

This is disgusting.


----------



## BeZol

alasdairvfr said:


> Edit i bumped up a little higher broke top 100 for whatever thats worth on a new bench
> 
> For some reason I can't get great PR scores, but I can get a decent bench out of SpeedWay, goes over 3,100 mhz! +270/+1500 and still testing
> 
> I'm not sure what I'm doing wrong for the PR score though. I can do +210/+1200 but if i crank up memory i have to dial back core, vs versa. 210/1200 seems to be my peak and my limit
> 
> Gigabyte Gaming OC card, on air
> 
> NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)
> 
> View attachment 2576159
> 
> 
> 
> 
> NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)
> View attachment 2576158


























Gainward Phantom GS RTX 4090 overclocked.

A "4x8pin" card, but only +11% extra power limit allowed.
I think the non-GS version gets no extra TDP option.

Time Spy Extreme is more demanding than Speed Way.

Interestingly, this card is power-limited, not voltage-limited (for now).

In Time Spy Extreme:
Graphics Test 1 goes through at around 1.05-1.075 V
Graphics Test 2 at 1.00-1.05 V

The temperatures are surprisingly low... 

In normal Time Spy I am CPU-bound in some scenes even with an all-core OC to 4.6 GHz (5950X).

Waiting for an MSI Afterburner update so I can fine-tune the voltage/frequency curve above 3 GHz (if you change it, your setting flattens out at 3 GHz and won't go above).
Waiting for a GPU-Z update so I can save the BIOS.
And the same goes for nvflash64: the "graphics adapter" is not recognized, so I can't save the BIOS and can't flash it either.

BUT

I was able to save the bios with the Gainward tool.
EVGA X1 showed that I could update the BIOS (well, I did not dare to click it  )
Asus GPU Tweak III can handle the 3+ GHz clock range, but the program behaves strangely, so I gave up on it after 3-4 hours of trial and error.


----------



## nyk20z3

LunaP said:


> Curious why some people keep convincing themselves this, like sure if you don't need it you don't need but some do, and people are mostly here because they WANT it so kinda moot point to tell people u don't need it just because their usage scenarios don't line up with theirs, but stating it like their opinion is fact feels a bit weird. Rants/Complaints aside granted we're all here for the same thing so ofc people are going to vent to others w/ similar interest.
> 
> Not saying that's what u were meaning, just seeing that phrasing here and there like people have to constantly reinforce their own beliefs lol.


It's true; what we do is already a niche market. A person who legitimately needs this card to game smoothly is probably in the below-1% category. But it won't stop people from buying them; I am still a firm believer in overkill even if it makes no sense 😉


----------



## yzonker

A little different config, closer to 600 W. This card seems to behave like many/almost all 30-series cards: depending on what you run, internal limits can be hit and it will limit below 600 W. I wish some other people would run this test for comparison. Curious if any of the other cards are better; reviews all led us to believe the FE is, for one.


----------



## bmagnien

yzonker said:


> A little different config. Closer to 600w. This card seems to be the same as many/almost all 30 series cards. Depending on what you run, internal limits can be hit and it will limit below 600w. I wish some other people would do this test for comparison. Curious if any of the other cards are better. Reviews all led us to believe the FE is better for one.
> 
> View attachment 2576177


That’s weird that it says perf cap reason is power when you’re not hitting power limit. Also, your voltage at 1.065 is lower than max of 1.1. Is there any load you can run that takes voltage to 1.1? Would be curious what your clocks, power, and thermals are at 1.1v


----------



## yzonker

bmagnien said:


> That’s weird that it says perf cap reason is power when you’re not hitting power limit. Also, your voltage at 1.065 is lower than max of 1.1. Is there any load you can run that takes voltage to 1.1? Would be curious what your clocks, power, and thermals are at 1.1v


Clocks are low because it is on the power limit


----------



## bmagnien

yzonker said:


> Clocks are low because it is on the power limit


Can you set all sensors to display max and run again? Perf cap power doesn’t make sense when you’re at 585/600. I wasn’t saying your clocks were low btw, or anything about your clocks.


----------



## J7SC

Just did a quick boot-up with a temporary mount of the Giga Gaming OC, all very janky with kitchen paper towel cores holding it up  ...haven't pushed the limits yet as the protective foil is still on the front and back (don't want to bake that on), but it is definitely a keeper. Serious benching and scoping-the-limits will have to wait a bit until proper installation, but some early figures below: ~525 W max, 1.05 V max, about 25 C ambient.


----------



## yzonker

bmagnien said:


> Can you set all sensors to display max and run again? Perf cap power doesn’t make sense when you’re at 585/600. I wasn’t saying your clocks were low btw, or anything about your clocks.


It does make sense. There are many internal limits beyond the PL you set in AB. If one is hit the card will show pwr limit. Doesn't matter where total power is at.
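The behavior described above (the limiter flag firing while total board power reads below the slider cap) follows from the card enforcing the minimum of several caps, not just the total. A hedged sketch of that idea; the rail names and wattages here are invented for illustration, not actual Ada limits:

```python
# A card flags "power limit" when ANY internal cap is hit, not just the
# total-board cap, so total power can read below 600 W while the
# limiter is still active. Rail names and caps below are hypothetical.
RAIL_CAPS_W = {
    "total_board": 600,  # the slider limit
    "12vhpwr_in":  550,  # hypothetical per-connector cap
    "core_rail":   480,  # hypothetical core-voltage-rail cap
}

def limiting_rail(draw_w):
    """Return the first rail whose draw meets its cap, else None."""
    for rail, cap in RAIL_CAPS_W.items():
        if draw_w.get(rail, 0) >= cap:
            return rail
    return None

# Total board power 585 W (below 600), yet the limiter still fires:
print(limiting_rail({"total_board": 585, "12vhpwr_in": 540, "core_rail": 480}))
```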


----------



## zware62

I would like to report something different than what most people here are trying to achieve... instead of super high speed with high power usage, the opposite: how to avoid excessive room heating 
I got the Gigabyte Windforce (probably the slowest card in the 4090 family) but I am super happy with it!
When playing ACC on triple 1440p monitors (vsync 75 Hz), GPU power consumption is roughly:
with the 3080 Ti: ~320 W
with a 3070 laptop (but on 1 monitor, 60 Hz): 120 W
with the 4090: ~160 W... it's heating the room almost like a laptop, at much higher fidelity... it's amazing!


----------



## WayWayUp

Dude is playing gta 5 in 16k res






Most insane gpu


----------



## J7SC

...per my last post above, janky for sure 




----------



## ZealotKi11er

J7SC said:


> Just did a quick boot-up with a temporary mount of the GigaGamingOC, all very janky with kitchen paper towel cores holding it up  ...haven't pushed the limits yet as the protective foil is still on the front and back (don't want to bake that on) but it is definitely a keeper. Serious benching and scoping-the-limits will have to wait a bit until proper installation, but some early figs below with ~ 525 W. max 1.05v max and about 25 C ambient.
> View attachment 2576180


Damn, +300 is impressive. I don't know if I should keep mine with only +150.


----------



## dante`afk

nycgtr said:


> If you are in the US this is beyond false. I have picked up 3 fairly easily in the past 2-3 days, this with being model specific as well. I passed on zotacs, the trio, surpim aio, aorus xtreme, master all of which I had in my cart for checkout. If you use any twitter stock alerts, they come in and out thru out the day. The FE on the other hand is an unicorn.


Can't confirm; I've been trying to get one every day since "launch", no matter which brand.

Almost paid a scalper, then got lucky last night: opened up GFE and saw that I got an invite


----------



## MIST3RST33Z3

The 


ZealotKi11er said:


> Looking at some of your guys OC, my Gigabyte Gaming OC at only +150 looks sad.
> Also no way I am dropping $280 USD + TAX + Shipping on a WB for 0 gains. I could buy a different case and more for that much money.


The +xxx number doesn’t matter. What clocks are you getting? The closer to 3100 MHz you get, the better your silicon luck, it seems.


----------



## fitnessgrampacertest

Anybody know why I can't adjust the voltage on my Asus ROG Strix 4090 through MSI Afterburner? I see that it works in Asus GPU Tweak II, but god does that user interface infuriate me. 

Also, it seems the minimum core voltage is BIOS-locked to 870 mV? Can anybody else confirm?


----------



## ZealotKi11er

MIST3RST33Z3 said:


> The
> 
> +xxx number doesn’t matter. What clocks are you getting. The closer to 3100mhz you get, the better your silicon luck it seems.


+xxx does matter. I hit 2970 MHz max. You need +300 MHz to get close to 3100 MHz while running at 1.1 V.


----------



## Thebc2

These things are insane, rock solid at +255 +1500. Gaming in 4k/120hz and she doesn’t even break a sweat. I can’t believe this is on air.











Sent from my iPhone using Tapatalk Pro


----------



## Nizzen

fitnessgrampacertest said:


> Anybody know why i cant adjust the Voltage on my Asus ROG Strix 4090 through MSI Afterburner? I see that it works in Asus GPU Tweak II, but god does the Userinterface infuriate me.
> 
> Also it seems that the minimum core voltage is bios-locked to 870mv? Can anybody else confirm?


Use newest AB beta from guru3d


----------



## yzonker

J7SC said:


> ...per my last post above, janky for sure
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2576185


The little pogo stick the TUF came with doesn't work at all for my case, so I did a careful adjustment.


----------



## J7SC

ZealotKi11er said:


> +xxx do matter. I hit 2970MHz max. You need +300MHz to get close to 3100MHz and running at 1.1v.


If you have the opportunity to exchange it, then you might want to do that. Still, unless you are into hard-benching, consistently high FPS w/o too many frame time issues is probably the key measure for many. I really won't know where my card will settle until I have got it under water. I am pleased with it from what little I've seen, also as it was a 'whopping' US$20 more than the (currently) lowest priced 4090s - and the only 4090 model in stock.



fitnessgrampacertest said:


> Anybody know why i cant adjust the Voltage on my Asus ROG Strix 4090 through MSI Afterburner? I see that it works in Asus GPU Tweak II, but god does the Userinterface infuriate me.
> 
> Also it seems that the minimum core voltage is bios-locked to 870mv? Can anybody else confirm?


...minimum core voltage on mine seems to be ~879 mV ...so far, it has not exceeded 1.05 V at all, but as posted before, I just did a few runs to make sure it's a keeper before removing the protective foil etc. My MSI AB (latest beta) voltage slider does 'move' and is not greyed out, though I am not sure what actual impact it has...


----------



## inedenimadam

Gaming X Trio pcb shots. I was curious, so I took it apart. Thought you guys might like to see it too.

Edit to add: one of the easiest disassembly jobs I’ve done on a GPU in a while. One type of screw for entire backplate, screws on spring mount are captured, and then one more type for I/O cover.
Might do a shunt mod later if bios issues prevent flashing


----------



## MikeGR7

ZealotKi11er said:


> +xxx do matter. I hit 2970MHz max. You need +300MHz to get close to 3100MHz and running at 1.1v.


It matters for your own internal testing, but it's not worth sharing with others, because each card boosts differently despite using the same offset.

Multiple parameters (ambient temperature, silicon quality, stock voltage, stock frequency, and more) vary from one user to another, so two cards with the same +100 offset can land at different final frequencies.

Do not compare your card with others based on offset, only on final boost frequency, and even then be cautious: different benchmarks may have been used, and that also changes what the "stable" final boost frequency is.
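To make that concrete, a quick back-of-the-envelope sketch (the stock boost bins below are made-up numbers, not measurements from any specific card):

```python
# Toy illustration: the same +150 offset lands at different final clocks
# because each card's stock boost bin differs (values are hypothetical).
cards_stock_boost_mhz = {
    "card_a": 2790,
    "card_b": 2850,
}
offset_mhz = 150  # identical offset applied to both cards

for name, stock in cards_stock_boost_mhz.items():
    final = stock + offset_mhz
    print(f"{name}: {final} MHz")
```

Same offset, 60 MHz apart in final frequency, which is exactly why only the final boost clock is worth comparing.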


----------



## fitnessgrampacertest

Nizzen said:


> Use newest AB beta from guru3d


Link/version #?


----------



## AngryLobster

Wow, scaling upward on these is trash. I'm seeing an additional 70-80 W for a 5.5% gain. Nvidia basically maxed it out of the box.

Going the other direction with an undervolt saves 50 W with no loss to stock performance.
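Rough perf-per-watt math on those figures (the 450 W stock draw and the normalized score below are assumptions; only the +75 W / +5.5% deltas come from the observation above):

```python
# Efficiency check: what +70-80 W for +5.5% does to perf-per-watt.
base_power_w = 450.0   # assumed stock board power
base_score = 100.0     # normalized stock performance

oc_power_w = base_power_w + 75.0   # midpoint of the 70-80 W observation
oc_score = base_score * 1.055      # the ~5.5% gain

print(base_score / base_power_w)   # perf/W at stock
print(oc_score / oc_power_w)       # perf/W overclocked (noticeably worse)
```

Under those assumptions, perf/W drops by roughly 10% for the last few percent of performance.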



Thebc2 said:


> These things are insane, rock solid at +255 +1500. Gaming in 4k/120hz and she doesn’t even break a sweat. I can’t believe this is on air.
> 
> Sent from my iPhone using Tapatalk Pro


You do realize you're at 50% GPU usage right? Put a real load on the card.


----------



## MikeGR7

inedenimadam said:


> Gaming X Trio pcb shots. I was curious, so I took it apart. Thought you guys might like to see it too.
> 
> 
> View attachment 2576197
> 
> View attachment 2576196


Thanks for sharing; the front was already posted some pages back, but the rear shot gives me new info!

Confirming the Trio has a "lite" version of the Suprim's PCB makes me tempted to rank the Trio's electrical quality above the TUF's... look at all those SMDs! (If only the Trio had better cooling and PL)

It is the first time in many, many years that all three top manufacturers (Asus, GB, MSI) are this close in electrical properties!

Truly, get the one with the better price/cooler this round.

Bonus: watercooling is nice to have but definitely irrelevant this round as well.


----------



## MikeGR7

fitnessgrampacertest said:


> Link/version #?


Previous page.


----------



## Thebc2

AngryLobster said:


> Wow scaling upward on these is trash. I'm seeing an additional 70-80w for 5.5% gain.
> 
> Going the other direction with a undervolt saves 50w with no loss to stock performance.
> 
> 
> 
> You do realize you're at 50% GPU usage right? Put a real load on the card.


I have put plenty of synthetic load on the card. This was Cyberpunk at 4K/120, which was the point I was making: 4K/120 Hz gaming and it didn’t break a sweat.


Sent from my iPhone using Tapatalk Pro


----------



## Xavier233

MikeGR7 said:


> Thanks for sharing because the front was posted already some pages back but the rear one is giving me new info!
> 
> Confirming the Trio has a "lite" version of the Suprims pcb makes me tempted to rank Trios electrical quality above Tufs... look at all those SMDs! ( If only trio had better cooling and PL)
> 
> It is the first time after many many years that all three top Manufacturers (Asus,GB,MSI) are so close in electrical properties!
> 
> Truly get the one with better price/cooler this round.
> 
> Bonus: Watercooling is nice to have but definitely irrelevant this round aswell.


And that reflects the reviews in terms of temps and performance across all cards: get the cheapest you can, as long as there is no coil whine or obvious physical defects. Right now I think Gigabyte or MSI make the most sense


----------



## th3illusiveman

Canada Computers in BC has ~10 Gigabyte 4090s in stock right now... they had ~20 when I checked last night. Unfortunately only the FE will fit in my case.... 

Definitely seems like more stock/less demand for these than the 3000 series. I'm sure supply will stabilize by mid-November in most places. Those scalpers can get FK'd.


----------



## inedenimadam

MikeGR7 said:


> Thanks for sharing because the front was posted already some pages back but the rear one is giving me new info!
> 
> Confirming the Trio has a "lite" version of the Suprims pcb makes me tempted to rank Trios electrical quality above Tufs... look at all those SMDs! ( If only trio had better cooling and PL)
> 
> It is the first time after many many years that all three top Manufacturers (Asus,GB,MSI) are so close in electrical properties!
> 
> Truly get the one with better price/cooler this round.
> 
> Bonus: Watercooling is nice to have but definitely irrelevant this round aswell.


Honestly, this is the cleanest release I can recall in several decades of watching this space. There isn't a 'bad card' across the entire AIB landscape. Sure, some variance, but no major outliers at the top or the bottom of the stack.


----------



## bmagnien

Xavier233 said:


> And that reflects the reviews in terms of temps and performance across all cards: get the cheapest you can, as long as there is no coilwhine or obvious physical defects. Right now I think Gigabyte or MSI makes most sense


MSI’s lower-end cards don’t have a 600 W BIOS out of the box. While flashing the BIOS is almost certainly going to be possible in the future, it currently isn't, and there's technically no guarantee that it will be. Just in case that matters. Gigabyte's second-to-lowest and Asus's lowest both support 600 W. Not sure about the Windforce


----------



## Jordyn

inedenimadam said:


> Honestly, this is the cleanest release I can recall in several decades of watching this space. There isn't a 'bad card' across the entire AIB landscape. Sure, some variance, but no major outliers at the top or the bottom of the stack.


To be fair at this end of the market and price point there really shouldn't be "bad cards" but compared to previous releases in recent memory the level of parity and lack of outliers in the high end is surprising alright.


----------



## Dragonsyph

If almost all the cards boost to the same performance, then why do you guys want a 600 W BIOS if they're voltage limited?


----------



## AngryLobster

Yeah I don't really get it either. Above 500w the cards hit a brick wall.


----------



## Xavier233

bmagnien said:


> MSI’s lower end doesn’t have 600w bios out of the box. While flashing bios is almost certainly going to be possible in the future, it’s currently not and technically no guarantee that it will. Just in case that matters. Gigabytes 2nd to lowest and Asus’s lowest both do support 600w. Not sure about the wind force


Which Gigabyte model has the 600w BIOS? Is it the 4090 Gaming OC?

The performance gain from 450 W to 600 W might not even be worth the extra 150 W beyond heat and more noise, IMO. Do we know factually what it is like, though?


----------



## Morteen199

Tadaschi said:


> Found the cheaper models' PCB pictures to check the power phases, since TechPowerUp only showed the top models' PCBs.
> I don't believe PCB and power phases will make that much of a difference for the 4090 this time around
> 
> source https://www.coolpc.com.tw/tw/shop/gpu/nvidia-rtx4090/


So does the 4090 TUF only have 18 power phases, while the Founders has 20? Is the TUF better or worse than the Founders Edition?


----------



## bmagnien

Xavier233 said:


> Which Gigabyte model has the 600w BIOS? Is it the 4090 Gaming OC?
> 
> The performance gain from 450w to 600w might not even be worth that extra 150w other than heat and more noise, IMO. Do we know factually what is it like though?


The GB Gaming OC and Aorus Master have 600 W. Not 100% sure about the Windforce, as the PCB is different. I’m just trying to help people with the factual differences, not making any statements about the potential implications of those facts.


----------



## bmagnien

Morteen199 said:


> so do the 4090 TUF only have 18 power phases? founders have 20? is the tuf better or worse than founders edition?


The TUF has 2 HDMI 2.1 ports. And yes, two fewer core power phases, but one more memory power phase


----------



## GassyBiz

Xavier233 said:


> Which Gigabyte model has the 600w BIOS? Is it the 4090 Gaming OC?
> 
> The performance gain from 450w to 600w might not even be worth that extra 150w other than heat and more noise, IMO. Do we know factually what is it like though?


My Gigabyte Gaming OC will easily go to 600 W.
All that’s required is 4 PCIe cables from the PSU into the adapter and at least a 1000 W PSU.
After that you can pull the power slider to 133% in MSI Afterburner.
Highest I’ve seen so far is 580 W in Furmark at about 800 fps! Though I haven’t sought out the absolute maximum wattage.
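For reference, the slider-to-watts math, assuming the Gaming OC's 450 W default board power (check your own card's default in GPU-Z; the value below is an assumption):

```python
# What Afterburner's power-limit percentage maps to in watts.
default_board_power_w = 450   # assumed default PL for this card
slider_percent = 133          # maxed slider on the Gaming OC

limit_w = default_board_power_w * slider_percent / 100
print(limit_w)  # 598.5, i.e. just under the 600 W connector spec
```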


----------



## Xavier233

bmagnien said:


> GB Gaming Oc and Auorus Master have 600. Not 100% sure about Windforce as the PCB is different. I’m just trying to help people with the factual differences, not making any statements about potential implications of those facts.


Agree with you there. The price difference between the cheapest Gigabyte card and the Gaming OC is $50 CAD, which makes the Gaming OC a much better purchase IMO. I don't know if these 4090 GPUs come binned per model or not


----------



## GassyBiz

My 4090 is only using a bus of x16 gen 3.0 despite a system capable of gen 4.0 throughout.
Z590 Maximus XIII Hero mobo, i9 11900K, pcie slot 1 x16 set to Gen 4 in latest BIOS.

Any help or ideas?


----------



## bmagnien

GassyBiz said:


> My 4090 is only using a bus of x16 gen 3.0 despite a system capable of gen 4.0 throughout.
> Z590 Maximus XIII Hero mobo, i9 11900K, pcie slot 1 x16 set to Gen 4 in latest BIOS.
> 
> Any help or ideas?


Are you using a riser? Try reseating the card.


----------



## Xavier233

GassyBiz said:


> My gigabyte gaming oc will easily go to 600W.
> All that’s required is 4 pcie cables from psu into the adapter and at least a 1000W psu.
> After that you can pull the power slider to 133% in MSI afterburner.
> Highest I’ve seen so far is 580W in Furmark at about 800 fps! Though I haven’t sought out absolute maximum wattage.


Damn, my 12900 OCed consumes 350 watts with the rest of the system (without a GPU). That puts the worst-case total at 950 watts, very close to the limit of my 1000 watt PSU.
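The headroom math, using the numbers above (worst-case figures, not typical gaming draw):

```python
# Rough PSU headroom check (values taken from the post above).
cpu_and_system_w = 350   # OC'd CPU plus the rest of the system, no GPU
gpu_worst_case_w = 600   # 4090 with the power slider maxed

total_w = cpu_and_system_w + gpu_worst_case_w
psu_w = 1000
print(total_w, psu_w - total_w)  # 950 W draw, only 50 W of headroom
```

In practice the CPU and GPU rarely peak simultaneously in games, but transient spikes can eat that 50 W fast.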


----------



## GassyBiz

bmagnien said:


> Are you using a riser? Try reseating the card.


Yes, I’ve reseated it and even tried the second x16 slot, no luck


----------



## GassyBiz

Xavier233 said:


> Damn, my 12900 OCed consumes 350 watts with the rest of the system (without a GPU). That puts the total worst-case to 950 watts. Very close to the limit of my 1000 watts PSU.


This is under unrealistic gpu stress.

Other benchmarks see a max of 518W and gaming seems even lower than that at max 4K settings


----------



## Morteen199

bmagnien said:


> TUF has 2 hdmi 2.1 ports. And yes 2 less core power phases, but one more memory power phase


so don't buy a tuf?? founders edition better ?


----------



## Wilco183

Just now pre-ordered Tuf (not OC) for $1,599 @ BHPhoto for those interested. A Zotac for pre-order also.


----------



## Wilco183

...


----------



## J7SC

bmagnien said:


> GB Gaming Oc and Auorus Master have 600. Not 100% sure about Windforce as the PCB is different. I’m just trying to help people with the factual differences, not making any statements about potential implications of those facts.


At the end of the day, it is _nice to have the option_ to crank it all the way up to 600W once properly cooled and/or during winter benching - still, for gaming I probably just leave this card on stock settings re. clocks and PL. I game on a big 4K 120 Hz so this should work fine.


----------



## ZealotKi11er

I returned my bad-overclocking Gigabyte Gaming OC 4090. Can't get myself to keep a lemon even if the perf difference is 1-2%.


----------



## Xavier233

ZealotKi11er said:


> I returned my bad overcloking Gigabyte Gaming OC 4090. Cant get myself to keep a lemon even if perf difference is 1-2%.


I would have bought it from you. 

Keep in mind the silicon lottery is applicable to all cards and models, unless they pre-binned them at the factory.


----------



## dr/owned

inedenimadam said:


> Gaming X Trio pcb shots. I was curious, so I took it apart. Thought you guys might like to see it too.
> 
> Edit to add: one of the easiest disassembly jobs I’ve done on a GPU in a while. One type of screw for entire backplate, screws on spring mount are captured, and then one more type for I/O cover.
> Might do a shunt mod later if bios issues prevent flashing


So 100% confirmed then it's the same PCB as the Suprim X (and Liquid):











Compared with the Strix vs. TUF situation, the Suprim X is better than the Strix with 2 more stages. The TUF also uses a similar PCB to the Strix with stages knocked off and probably cheaper capacitors. Can't tell if they're still 70 A power stages on the TUF:











TBH the TUF was the "darling" of the 3090 series with the PCB being miles better than the FTW3. So it's interesting that now MSI seems to have the crown.


BTW, if you're doing thermal pad replacement on the VRAM: 2.0 mm. The flat cold plates mean a 2.5 mm standoff height; the VRAM is 1 mm thick, so there's a 1.5 mm gap -> use a 2.0 mm thermal pad.
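The pad math spelled out (numbers from the note above; a pad slightly thicker than the gap compresses for good contact):

```python
# VRAM thermal-pad sizing for the flat cold plates (per the post above).
standoff_mm = 2.5   # cold-plate standoff height
vram_mm = 1.0       # VRAM package thickness

gap_mm = standoff_mm - vram_mm
print(gap_mm)  # 1.5 mm gap -> a 2.0 mm pad gives ~0.5 mm of compression
```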


----------



## KillerBee33

ZealotKi11er said:


> I returned my bad overcloking Gigabyte Gaming OC 4090. Cant get myself to keep a lemon even if perf difference is 1-2%.


Yeap, yet I'm still with my 3080Ti GOC. My first and last Gigabyte product for sure.


----------



## Xavier233

J7SC said:


> At the end of the day, it is _nice to have the option_ to crank it all the way up to 600W once properly cooled and/or during winter benching - still, for gaming I probably just leave this card on stock settings re. clocks and PL. I game on a big 4K 120 Hz so this should work fine.


If all 4090s are within 2-3% of each other in performance, it's pointless to chase the maximum power limit. For example, a good-silicon chip at 500 watts can well outperform a not-so-good silicon chip at 600 watts.


----------



## Roldo

The # of phases isn't everything.
The Gaming OC, for example, might have 20, but it's using 50 A power stages from Vishay, while the FE, also a 20-phase design, uses 70 A smart power stages from Monolithic.


----------



## RetroWave78

Heads up, B&H Photo is accepting pre-orders for 4090 Tuf, I just placed one. No guarantee as to when they will actually come in but this beats scouring everywhere else multiple times a day and it sure beats buying these for $3k on ebay. This very same pre-order is currently on ebay for like $2500 (at current auction value) so act fast! 



https://www.bhphotovideo.com/c/product/1730937-REG/asus_tuf_rtx4090_24g_gaming_tuf_gmg_gefrc_rtx.html


----------



## bmagnien

RetroWave78 said:


> Heads up, B&H Photo is accepting pre-orders for 4090 Tuf, I just placed one. No guarantee as to when they will actually come in but this beats scouring everywhere else multiple times a day and it sure beats buying these for $3k on ebay. This very same pre-order is currently on ebay for like $2500 (at current auction value) so act fast!
> 
> 
> 
> https://www.bhphotovideo.com/c/product/1730937-REG/asus_tuf_rtx4090_24g_gaming_tuf_gmg_gefrc_rtx.html


This has been up for like 3 days. Shudder to think how many orders are in by now. A pending charge was added to my card when I first ordered on the 12th, and then a second pending charge just came through yesterday on the 14th, which I’m hoping means they’re possibly reverifying for imminent shipment? Or meaningless lol.


----------



## AdamK47

Xavier233 said:


> Some MSI 4090 Trios on youtube are having coil whine. Looks like its really random this time, not for a specific brand or model


Mine has a little bit of coil whine. Can't hear it with the glass door to my 7000X closed. Impossible to hear it from where I sit.


----------



## mattskiiau

Anyone know how to zoom out or increase Freq MHZ graph?
Tried looking up keybinds but couldn't find anything.


----------



## Xavier233

AdamK47 said:


> Mine has a little bit of coil whine. Can't hear it with the glass door to my 7000X closed. Impossible to hear it from where I sit.


I have had high end cards where the coil whine was pretty much gone after a lot of benchmarking. Might be worth a try


----------



## Wilco183

RetroWave78 said:


> Heads up, B&H Photo is accepting pre-orders for 4090 Tuf, I just placed one. No guarantee as to when they will actually come in but this beats scouring everywhere else multiple times a day and it sure beats buying these for $3k on ebay. This very same pre-order is currently on ebay for like $2500 (at current auction value) so act fast!
> 
> 
> 
> https://www.bhphotovideo.com/c/product/1730937-REG/asus_tuf_rtx4090_24g_gaming_tuf_gmg_gefrc_rtx.html


3 distributors on Amazon are selling the TUF OC for same price...$3,499.99. "Only 1 left in stock - order soon" LOL.


----------



## Dragonsyph

Ain’t B&H still closed? Wonder who’s been running the website.

There are also about 240 4090s on eBay, tons at $2050-2200, and going down each day.

Hoping in a few more days they will be down under $2000 and the scalpers will stop buying stock.


----------



## RetroWave78

Wilco183 said:


> Just now pre-ordered Tuf (not OC) for $1,599 @ BHPhoto for those interested. A Zotac for pre-order also.


Same, just sent EKWB an email request to change my WB pre-order from FE to Strix / Tuf.

In regards to perceived coil whine being higher on AIBs: I postulate that the FE cooler fully encapsulates the PCB, memory, and caps, and therefore has unparalleled acoustic isolation, whereas the AIBs have many direct pathways for sound waves to escape.

Here's to hoping B&H manage to get a massive shipment to satisfy the pre-orders within the next month.

I prefer waiting a month with a pre-order to scouring all the retail outlets multiple times a day (and putting up with Newegg's atrociously annoying pop-up), and I'd much rather give B&H Photo my business than Best Buy, who clearly collaborated with NGreedia to artificially limit 4090 FE supply.

The Tuf is just as fast as the Strix and FE; as another user here mentioned recently, there are no outliers in any direction, and all of the 4090s perform within a few percentage points of each other:






That Tuf is the same MSRP as the FE, performs the same, and has a WB available; if you're going to throw whatever card you're getting under a WB, this is a great alternative to the FE. Also a 600 W power limit.

There is no difference between the OC and non-OC Tuf other than a vbios with a 40 MHz clock difference. 40 MHz means nothing with Lovelace, and it's a near certainty the OC Tuf VBIOS could be flashed to the non-OC Tuf, but why anyone would bother when the same thing can be accomplished in MSI AB is beyond me. The $1799 price of the Tuf OC exists for those who couldn't snag a non-OC Tuf at $1599. Zero binning is going on between all of the cards this time around.

That FE air-cooler though, masterpiece.


----------



## RetroWave78

Wilco183 said:


> 3 distributors on Amazon are selling the TUF OC for same price...$3,499.99. "Only 1 left in stock - order soon" LOL.


Here's to hoping B&H can fulfill our pre-order!


----------



## Dragonsyph

Hell there’s two 4090s on eBay right now for 1700 and 1800 buy now.

edit: lol, they're scams selling pictures of 4090s


----------



## inedenimadam

For people wondering why we would want to flash a BIOS or hard mod for TDP headroom: this is a Port Royal run with 106% TDP on a Gaming X Trio.


----------



## Dragonsyph

So fe plus block is best, good to know


----------



## RetroWave78

Dragonsyph said:


> Hell there’s two 4090s on eBay right now for 1700 and 1800 buy now.


This is a great deal! 


*NEW MSI Suprim GeForce RTX 4090 Picture

Please Read Description

Don't buy this if you are a REAL PERSON trying to buy a Nvidia RTX 4090. This listing is to help stop the huge bot buying problem effecting the GPU Market. This is a Picture of a MSI Suprim GeForce RTX 4090. Again, Please Don't Buy This Nvidia RTX 4090 listing if you are not a bot. Seriously, this listing is for a Picture of the MSI Suprim GeForce RTX 4090. The Picture will be delivered to your eBay Mail within 24 Hours of Purchase.

All Sales Are Final!!*


----------



## RetroWave78

Dragonsyph said:


> So fe plus block is best, good to know


Nope, Tuf + block is best.


----------



## Wilco183

RetroWave78 said:


> This is a great deal!
> 
> ​
> *NEW MSI Suprim GeForce RTX 4090 Picture
> 
> Please Read Description
> 
> Don't buy this if you are a REAL PERSON trying to buy a Nvidia RTX 4090. This listing is to help stop the huge bot buying problem effecting the GPU Market. This is a Picture of a MSI Suprim GeForce RTX 4090. Again, Please Don't Buy This Nvidia RTX 4090 listing if you are not a bot. Seriously, this listing is for a Picture of the MSI Suprim GeForce RTX 4090. The Picture will be delivered to your eBay Mail within 24 Hours of Purchase.
> 
> All Sales Are Final!!*


Too funny...dude just might scam an unsuspecting bot.


----------



## Nico67

ZealotKi11er said:


> +xxx do matter. I hit 2970MHz max. You need +300MHz to get close to 3100MHz and running at 1.1v.


+xxx comparisons are pointless; one card might need +150 to hit 3100, and another +300. Also, changing to 1.1 V will boost a lot higher too. Just keep adding core until you have issues and see what the final MHz is; 2950-3050 would be the norm.




MikeGR7 said:


> It matters for your own internal testing, it is irrelevant to be shared with others though because each card boosts differently despit using the same offset.
> 
> Multiple parameters like different ambient temperatures, silicon quality, stock voltage, stock frequency and many more vary from one user to another so a
> two cards with the same +100 offset can lead to different final frequency.
> 
> Do not compare your cards with others based on offset but only based on final boost frequency and even then be cautious because different benchmarks may have been used and that also changes what the "stable" final boost frequency is.


Exactly 

This is the same as the base TDP and %uplift debates of old.


----------



## ZealotKi11er

Nico67 said:


> +xxx comparison's are pointless, one card might need +150 to hit 3100, and another +300. Also changing to 1.1v will boost a lot higher too. Just keep adding core til you have issues and see what the final mhz is, 2950-3050 would be the norm.
> 
> 
> 
> 
> Exactly
> 
> This is the same as the base TDP and %uplift debates of old.


Again, it does not matter what the final freq is. My card only had +150 to give. It was hitting 2970 MHz, so in the low range.


----------



## bmagnien

ZealotKi11er said:


> Again it does not matter what the final freq is. My card only had +150 to give. It was hitting 2970MHz so in the low range.


What do you mean? In a given benchmark the card tuned to be able to achieve the higher average core frequency will score higher, all other factors being equal.


----------



## fitnessgrampacertest

Does anybody else have pretty much zero control over the RGBs around the end of their ROG Strix 4090? The logo RGBs work just fine, but the RGBs on the end of the card do not respond to any RGB settings at all. They just kind of randomly change colors at times for no apparent reason. I have Armoury Crate and Aura Creator installed and up to date, but no luck.

I hate Armoury Crate so much, it sucks. I'd get rid of it if I didn't need it to run my AIO


----------



## inedenimadam

The shunt mod is showing weird behaviour. Using the same OC as before the mod, I'm getting the exact same benchmark scores. As expected, I am seeing a lower wattage reading in GPU-Z, but I am also still showing perf cap PWR. It's like the shunt mod worked, but the card is smarter than the mod.

Edit to add: I shorted 4 shunts, 3 by the PCIe connector and one by the PCIe slot, using a conductive pen because I didn't want to modify the card to the point where it can't be returned to stock config for warranty
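For anyone wondering why the wattage reading drops after shorting a shunt, the parallel-resistance math looks roughly like this (both resistance values below are assumptions; the actual shunt value and the pen trace's resistance aren't known from the post):

```python
# Shorting a shunt adds a parallel current path, so the same current
# produces a smaller voltage drop across the sense resistor and the
# controller under-reports power.
r_shunt_mohm = 5.0   # hypothetical stock shunt resistance
r_trace_mohm = 5.0   # hypothetical conductive-pen trace resistance

r_eff_mohm = (r_shunt_mohm * r_trace_mohm) / (r_shunt_mohm + r_trace_mohm)
print(r_eff_mohm)                 # effective resistance: 2.5 mOhm
print(r_eff_mohm / r_shunt_mohm)  # reported power scales by this factor
```

A high-resistance pen trace would barely move the reading, which might explain why the card still hits perf cap PWR.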


----------



## MikeGR7

So to summarize, the ranking (IMHO) goes like this as far as ELECTRICAL QUALITY (not coolers or PL) is concerned:

1. MSI SUPRIM X (LIQUID)
2. FOUNDERS EDITION
3. ASUS STRIX
4. GB MASTER (WATERFORCE)
5. GB GAMING OC
6. MSI TRIO X
7. ASUS TUF
8. GB WINDFORCE
9. ZOTAC, PALIT

IF COOLING AND PL IS PRIORITY THE LIST GOES LIKE THIS:

1. GB MASTER
2. ASUS STRIX
3. GB GAMING OC
4. MSI SUPRIM X
5. ASUS TUF
6. FOUNDERS EDITION
7. MSI TRIO X
8. GB WINDFORCE
9. ZOTAC, PALIT


----------



## Nico67

ZealotKi11er said:


> Again it does not matter what the final freq is. My card only had +150 to give. It was hitting 2970MHz so in the low range.


At 1.05 V or 1.10 V? Either way it's not low, it's just on the bottom end of the range. I also noticed I was crashing in benchmarks because my RAM wasn't 100% stable, but I am still tuning Zen 4 at the same time 
Lastly, temperature bins might be an issue; mine drops one at 57 C core.


----------



## dante`afk

inedenimadam said:


> for people wondering why we would want to flash a bios or hard mod for TDP headroom. This is a port royal run with 106% TDP on a gaming X trio.
> 
> 
> 
> View attachment 2576222


so your card is hitting 450w or more?


----------



## ZealotKi11er

Nico67 said:


> At 1.05v or 1.10? either way its not low, it just on the bottom end of the range. I also noticed I was crashing in benchmarks because my Ram wasn't 100% stable, but I am still tuning Zen4 at the same time
> Lastly temperature bins might be an issue, my drops one at 57c core.


1.1v.


----------



## inedenimadam

dante`afk said:


> so your card is hitting 450w or more?


106%, or 480-ish W, at 1.1 V with ~3 GHz core


----------



## Nizzen

Morteen199 said:


> so don't buy a tuf?? founders edition better ?


Tuf is VERY good


----------



## dr/owned

inedenimadam said:


> Shunt mods are showing weird behaviour. using same OC as before the shunt mod, getting exact same benchmark scores. As expected, I am seeing a lower wattage reading in GPU-Z, but I am also still showing perf cap PWR. It's like the shunt mod worked, but the card is smarter than the mod.
> 
> Edit to add: I shorted 4 shunts. 3 by the pcie connector and one by the pcie slot. Using a conductive pen because I didn't want to modify the card to a point that it cant be returned to stock config for warranty


I dunno how much I trust the conductive pen. It's plausible the card is expecting to see certain ratios on the shunts. But then again there's also un-shuntable current sensing that happens that could also be limiting it. The 3090 had all sorts of Aux power limits in the bios.



MikeGR7 said:


> So to summarize, the ranking (IMHO) goes like this as far as ELECTRICAL QUALITY (not coolers or PL) is concerned:
> 
> 1. MSI SUPRIM X (LIQUID)
> 2. FOUNDERS EDITION
> 3. ASUS STRIX
> 4. GB MASTER (WATERFORCE)
> 5. GB GAMING OC
> 6. MSI TRIO X
> 7. ASUS TUF
> 8. GB WINDFORCE
> 9. ZOTAC, PALIT


The Gaming OC is definitely below the TUF. It "only" has 50A power stages. Although I haven't seen detail on whether the TUF has the same 70A stages the Strix has. The FE is also below the strix.


----------



## DokoBG

Doesn't really matter guys. The top chiller-cooled card at like -35C average is only about 10% ahead of the cheapest air-cooled, max-overclocked 4090... The top air-cooled boards that can actually go around 3GHz+ with the 600W bios are about ~4.5% faster than the average air-cooled card with the 480W bios... i mean, ***, these are literally sad results. Go to the Port Royal hall of fame and compare some graphics scores - extreme cooling does almost nothing for these 4090s, so it almost doesn't matter what you get. Let's hope some custom bioses can actually improve something in the future.


----------



## Roldo

Yeah Gaming OC looks pretty average PCB wise
20x50A PS, i/o filtering looks meh (still better than the barebones Zotac, Palit, Colorful reviewed on TPU)
Even cooling wise it's not anything special but then again all coolers are pretty good this time around

If TUF/Trio(X) are lite versions of Strix/Suprim X using same(ish) components they should both be better

Does it make a difference for the average user though? Not really


----------



## MikeGR7

It is very close between those models, but even though the inductance is a bit higher on the TUF, the Gaming has 2 more power phases plus (and that's more important) it uses way more SMD capacitors instead of the classic through-hole capacitors on the Asus.

Founders is also better than Strix in everything except memory phases, where it has one phase less, and that's honestly an irrelevant advantage for the Strix because the Vram could work with even a single such phase.

It sounds like a joke but it is true, Asus is not leading this year.


----------



## MikeGR7

Roldo said:


> Yeah Gaming OC looks pretty average PCB wise
> 20x50A PS, i/o filtering looks meh (still better than the barebones Zotac,Palit, Colorful reviewed on TPU)
> Even cooling wise it's not anything special but then again all coolers are pretty good this time around
> 
> If TUF/Trio(X) are lite versions of Strix/Suprim X using same(ish) components they should both be better
> 
> Does it make a difference for the average user though? Not really


Its i/o filtering is better than the TUF's, and 20 phases count for more than 16 (Trio) even with lower inductance.
Cooling-wise it's better than both the TUF and the Trio.

Whether all this gives actual results is another matter, I agree.


----------



## bottjeremy

My entry level Gigabyte Windforce 4090 on a 5900x is doing quite nice. No coil whine, card is quiet, overclocks like a champ. Been very stable at +220 +1500 106% power target. 1.1 volts. 3045mhz clock speed in games. Very happy with the 4090.






Also, have a few of the top spots for 5900x and 4090 on 3Dmark with this card.


----------



## LunaP

RetroWave78 said:


> Spoiler
> 
> 
> 
> "Reduce sell in to let channel inventory correct"
> 
> 
> 
> 
> 
> 
> Paper launch, artificial scarcity 100% confirmed.
> 
> Think about it, there are maybe 10 FE's on ebay all going for $3k and no-one in this thread managed to get one, so maybe Nvidia sold 100-500 on launch day globally?
> 
> This is disgusting.
> 
> View attachment 2576172


Yeah, as of yesterday 321 completed listings of the 4090 were up, and of those 320 were sold; today we're at 379 (per eBay).
That's not accounting for Amazon/Craigslist and other avenues; it legit sucks. I spoke to multiple Best Buy reps all week and they're now all sharing the same info: it will be a while (probably 2-3 months) before the next shipment, if any, since they don't normally stock high-end cards. They keep telling me the 4080 is next month, like ok, but this is about the 4090..


----------



## AdamK47

MikeGR7 said:


> So to summarize, the ranking (IMHO) goes like this as far as ELECTRICAL QUALITY (not coolers or PL) is concerned:
> 
> 1. MSI SUPRIM X (LIQUID)
> 2. FOUNDERS EDITION
> 3. ASUS STRIX
> 4. GB MASTER (WATERFORCE)
> 5. GB GAMING OC
> 6. MSI TRIO X
> 7. ASUS TUF
> 8. GB WINDFORCE
> 9. ZOTAC, PALIT
> 
> IF COOLING AND PL IS PRIORITY THE LIST GOES LIKE THIS:
> 
> 1. GB MASTER
> 2. ASUS STRIX
> 3. GB GAMING OC
> 4. MSI SUPRIM X
> 5. ASUS TUF
> 6. FOUNDERS EDITION
> 7. MSI TRIO X
> 8. GB WINDFORCE
> 9. ZOTAC, PALIT


Conjecture is so much fun.

We've all seen lists like these before. Clickbait list sites where rankings lack any sort of factual information to back them up.


----------



## J7SC

I don't think there's much between the Gaming OC and TUF OC...the Gaming OC has 20 + 4 stages (50A) while the TUF OC has 10 ('teamed', A?) + 4. None of these cards will run out of power unless you're doing naughty things with LN2 and custom XOC vbios. Most of these cards (including Strix, GigaGamOC, MSI Suprim X etc) also got the editor's choice awards at TPU, probably because NVidia kept a lid on too much 'divergence' by vendors' engineering teams. From that perspective, price (and availability) also matter.

BTW, it should be interesting to see what the Galax 4090 (+Ti) HoF is allowed to get away with... 

So far, 'the numbers' on my Gaming OC are very encouraging but I will still water-cool it sooner rather than later (have to really to fit in the case). I am also going to keep a very close eye on Hotspot temps, what with the 4 nm process. No issues so far but I have zero experience oc'ing with such a node at up to 600W.

Gigabyte Gaming OC PCB from TPU in the spoiler below. Incidentally, the warranty card in the box says '*Aorus*' Warranty, no mention of Gigabyte at all...as someone else already suggested earlier, they might have changed their mind relatively close to release...


Spoiler


----------



## Xavier233

Maybe the choice of VRMs between Gigabyte vs Asus is related to coil whine prevention? Less current going through them might mean less chance of it happening?


----------



## RetroWave78

J7SC said:


> I don't think there's much between the Gaming OC and TUF OC...the Gaming OC has 20 + 4 stages (50A) while the TUF OC has 10 ('teamed', A?) + 4. None of these cards will run out of power unless you're doing naughty things with LN2 and custom XOC vbios. Most of these cards (including Strix, GigaGamOC, MSI Suprim X etc) also got the editor's choice awards at TPU, probably because NVidia kept a lid on too much 'divergence' by vendors' engineering teams. From that perspective, price (and availability) also matter.
> 
> BTW, it should be interesting to see what the Galax 4090 (+Ti) HoF is allowed to get away with...
> 
> So far, 'the numbers' on my Gaming OC are very encouraging but I will still water-cool it sooner rather than later (have to really to fit in the case). I am also going to keep a very close eye on Hotspot temps, what with the 4 nm process. No issues so far but I have zero experience oc'ing with such a node at up to 600W.
> 
> Gigabyte Gaming OC PCB from TPU in the spoiler below. Incidentally, the warranty card in the box says '*Aorus*' Warranty, no mention of Gigabyte at all...as someone else already suggested earlier, they might have changed their mind relatively close to release...
> 
> 
> Spoiler


That's what I'm saying, power delivery isn't critical when the cards fall off a power efficiency cliff beyond 300w, a cliff that gets steeper the higher you go.

AD102 yields a 5% performance gain taking PT from 100% (450w) to 133% (600w, or 666.6w). Most sensible people will be bringing the core temp down and undervolting, as with Ampere.

Ideal scenario, 2750 MHz @ .950v @ 300w @ 40C core

When you bring core temp down to ~40C you can get away with frequencies that would require another 100mv @ ~70C.






In my opinion a $1600 FE or Tuf is the better option over a $2k Strix with better power delivery.

We have yet to see an unlocked voltage bios, however; at that point, having better power delivery will be of some value for those who want that extra 5% above 600w, running their card at 3300 MHz and 900w power draw.


----------



## Morteen199

so is the TUF with 18 power phases going to do worse in overclocking? Less safe with only 18 power phases to overclock? And is there any real-world difference between Strix and TUF in gaming?


----------



## dr/owned

Morteen199 said:


> so is the TUF with 18 power phases going to do worse in overclocking? Less safe with only 18 power phases to overclock? And is there any real-world difference between Strix and TUF in gaming?


No, all cards are overclocking practically the same regardless of the PCB. There's zero advantage to the Strix.

We're probably never going to get unlocked voltage from software...only the uP9512R cards (Gaming OC, Zotac) have EVC2 support now. The MP2891 cards aren't controllable yet.


----------



## MikeGR7

AdamK47 said:


> Conjecture is so much fun.
> 
> We've all seen lists like these before. Clickbait list sites where rankings lack any sort of factual information to back it up.


If you bothered to read some pages back you would find a lot of actual information from me, but you seem too bored to even write two sentences.



Xavier233 said:


> Maybe the choice of VRMs between Gigabyte vs Asus is related to CoilWhine prevention? Less current going through them might mean less chance of it happening?


I have to say that I'm hearing reports that the TUF has heavy coil whine, even more than the Trio (which seems to have very little).
We will know more in the next few days ofc.


----------



## Lune

RetroWave78 said:


> That Tuf is the same MSRP as FE and performs the same and has a WB available, if you're going to throw whatever card you're getting under a WB this is a great alternative to FE. Also 600w power limit.


Out of curiosity, how do we know that it's 600W? Is there a link for that? I managed to get an ASUS TUF non OC @ MSRP. Wasn't this a 450W / 520W card or something along those lines?


----------



## Morteen199

Asus told me this about the TUF : " The VRM is made up of *22 power phases* distributed in two main lines on each side of the socket and 4 extra phases located in the corners attached to the memory chips. The two main DC-DC zones have *18 Infineon TDA21570 MOSFETs with 70A* nominal power, thus generating a capacity of 1260A. *The 4 extra phases have 50A Vishay SiC639 MOSFETs* . The digital controller in charge of managing the energy signal will be an *MPS2103* , while the filtering capacitors used are military grade with a useful life of *20K hours and 105ºC* ." so are there 22 or 18 power phases? xD


----------



## MikeGR7

Lune said:


> Out of curiosity, how do we know that it's 600W? Is there a link for that? I managed to get an ASUS TUF non OC @ MSRP. Wasn't this a 450W / 520W card or something along those lines?


It has been confirmed by fellow members here that both models are 600W.



Morteen199 said:


> Asus told me this about the TUF : " The VRM is made up of *22 power phases* distributed in two main lines on each side of the socket and 4 extra phases located in the corners attached to the memory chips. The two main DC-DC zones have *18 Infineon TDA21570 MOSFETs with 70A* nominal power, thus generating a capacity of 1260A. *The 4 extra phases have 50A Vishay SiC639 MOSFETs* . The digital controller in charge of managing the energy signal will be an *MPS2103* , while the filtering capacitors used are military grade with a useful life of *20K hours and 105ºC* ." so are there 22 or 18 power phases? xD


It is as we know, 18 phases for the Vcore and 4 phases for the Vmem.
We mostly refer to the Vcore phases around here because they matter more than the memory ones.
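If anyone wants to sanity check Asus's 1260A figure, the nameplate math from those stage counts is trivial (purely theoretical ceilings, real sustained limits come down to thermals):

```python
# Nameplate VRM capacity from the stage counts Asus quoted for the TUF:
# 18 x 70A Infineon TDA21570 stages on Vcore, 4 x 50A Vishay SiC639 on Vmem.
vcore_capacity_a = 18 * 70   # 1260 A, matching Asus's figure
vmem_capacity_a = 4 * 50     # 200 A

# Even at the ~1.1V cap, the Vcore rail's theoretical ceiling dwarfs
# anything a 600W power limit will ever ask of it.
vcore_ceiling_w = vcore_capacity_a * 1.1

print(vcore_capacity_a, vmem_capacity_a, round(vcore_ceiling_w))  # → 1260 200 1386
```

Which is why arguing over 18 vs 24 phases is mostly academic at stock voltages.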


----------



## smushroomed

are riser cables okay to use? I'm reading reviews on Amazon and they say that most cables are pci 3.0


----------



## Lune

MikeGR7 said:


> It has been confirmed by fellow members here that both models are 600W.


Thank you for the info.


----------



## RetroWave78

Lune said:


> Out of curiosity, how do we know that it's 600W? Is there a link for that? I managed to get an ASUS TUF non OC @ MSRP. Wasn't this a 450W / 520W card or something along those lines?


Great question, I did a lot of digging before pulling the trigger today and did ascertain that Tuf, both OC and non-OC, is 600w. 100% PT = 450w and 133% PT = 600W just like FE.

The only difference between OC and non-OC Tuf is 40 MHz more core via vbios and $200.

OC Tuf isn't binned better.

It's all marketing nonsense.

Found some information:









ASUS TUF GeForce RTX 4090 Gaming review


In this review the turn goes to ASUS; they submitted that factory-tweaked and impressively cooled TUF Gaming OC edition of the GeForce RTX 4090. Loaded with a faster clock frequency, dual BIOS, and p... Overclocking




www.guru3d.com





Guru3D shows the power limit has another 33% on tap and that both the 4090 FE and Tuf drew over 500w @ +33% PT.

GeForce RTX 4090 Founders edition review

As you can see, with both cards overclocked the performance is nearly identical.

Here's Suprim Liquid X overclocked for good measure.









MSI GeForce RTX 4090 Suprim Liquid X review


So many GeForce RTX 4090 you've been able to read already. But we end our review streak with a bang, the superb SUPRIM Liquid X. A card that eats away only two slots, and is water cooled, bringing co... Overclocking




www.guru3d.com





There is a chance the non-OC Tuf is limited to 500w, but I did confirm it has 600w on tap before placing my pre-order today (I just can't remember where, and I could be mistaken), and it does come with a 4x8-pin PCI-E adapter from Asus.

Either way, my Tuf will be spending its life under a water block undervolted; I'm shooting for 100% PT FE performance @ 80% PT, or 350w.

Edit: 

I just realized the Guru3D review is for the OC version. It was listed as $1599 in the review which is what confused me.


----------



## dr/owned

smushroomed said:


> are riser cables okay to use? I'm reading reviews on Amazon and they say that most cables are pci 3.0











NVIDIA GeForce RTX 4090 PCI-Express Scaling


The new NVIDIA GeForce RTX 4090 is a graphics card powerhouse, but what happens when you run it on a PCI-Express 4.0 x8 bus? In our mini-review we've also tested various PCI-Express 3.0, 2.0 and 1.1 configs to get a feel for how FPS scales with bandwidth.




www.techpowerup.com





TLDR: it doesn't matter, PCIe 3.0 or 4.0
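The raw link math behind that, if anyone's curious (spec numbers, nothing measured):

```python
# Theoretical PCIe x16 bandwidth per generation, after line-code overhead.
# Gen 1/2 use 8b/10b encoding; Gen 3/4 use 128b/130b.
GENS = {                    # gen: (transfer rate in GT/s, encoding efficiency)
    "1.1": (2.5, 8 / 10),
    "2.0": (5.0, 8 / 10),
    "3.0": (8.0, 128 / 130),
    "4.0": (16.0, 128 / 130),
}

def x16_gb_per_s(gen: str) -> float:
    rate, eff = GENS[gen]
    # GT/s * efficiency = usable Gbit/s per lane; times 16 lanes, 8 bits per byte
    return rate * eff * 16 / 8

for gen in GENS:
    print(f"PCIe {gen} x16: {x16_gb_per_s(gen):.2f} GB/s")
```

PCIe 3.0 x16 (~15.75 GB/s) is literally half of 4.0 (~31.51 GB/s), and per TPU's scaling test the 4090 still barely loses anything at 3.0 speeds.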


----------



## Lune

RetroWave78 said:


> Great question, I did a lot of digging before pulling the trigger today and did ascertain that Tuf, both OC and non-OC, is 600w. 100% PT = 450w and 133% PT = 600W just like FE.
> 
> The only difference between OC and non-OC Tuf is 40 MHz more core via vbios and $200.
> 
> OC Tuf isn't binned better.
> 
> It's all marketing nonsense.
> 
> Found some information:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ASUS TUF GeForce RTX 4090 Gaming review
> 
> 
> In this review the turn goes to ASUS; they submitted that factory-tweaked and impressively cooled TUF Gaming OC edition of the GeForce RTX 4090. Loaded with a faster clock frequency, dual BIOS, and p... Overclocking
> 
> 
> 
> 
> www.guru3d.com
> 
> 
> 
> 
> 
> Guru3D shows power limit has another 33% on tap and that both the 4090 FE and Tuf drew over 500w @ +33% PT.
> 
> GeForce RTX 4090 Founders edition review
> 
> As you can see, with both cards overclocked the performance is nearly identical.
> 
> Here's Suprim Liquid X overclocked for good measure.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI GeForce RTX 4090 Suprim Liquid X review
> 
> 
> So many GeForce RTX 4090 you've been able to read already. But we end our review streak with a bang, the superb SUPRIM Liquid X. A card that eats away only two slots, and is water cooled, bringing co... Overclocking
> 
> 
> 
> 
> www.guru3d.com
> 
> 
> 
> 
> 
> There is a chance non-OC Tuf is limited to 500w but I did confirm it has 600w on tap before placing pre-order today (I just can't remember where and I could be mistaken) and it does come with a 4x8 pin PCI-E adapter from Asus.
> 
> Either way, my Tuf will be spending its life under a water block undervolted; I'm shooting for 100% PT FE performance @ 80% PT, or 350w.


Thanks for this. Yeah, I definitely won't be running it maxed out anyway but just wanted to know (for benchmarks etc). These cards seem so overengineered that I'll be skipping the waterblock this series due to their insane cooling; my PC is in a different room and it's cold there, so it's already like running on water 

The TUF does seem like a good card. I wanted an FE, but Germany's FE retailer (notebookbilliger) has the worst website ever created, so getting an FE card is very difficult since it's always botted and the website breaks even when there are no drops. I'll post my results when I get this thing in a few days.


----------



## RetroWave78

Lune said:


> Thanks for this. Yeah, I definitely won't be running it maxed out anyway but just wanted to know (for benchmarks etc). These cards seem so overengineered that I'll be skipping the waterblock part this series due to their insane cooling since my PC is in a different room + it's cold so it's already like running on water
> 
> The TUF does seem like a good card, wanted an FE but Germany FE has the worst website ever created (notebookbilliger) so getting an FE card is very difficult since it's always botted and the website breaks even when there's no drops. I'll post my results when I get this thing in a few days.


One correction, the Guru3D review I referred to is actually the OC version, you may want to ascertain the power limit of non-OC Tuf before pulling the trigger on one.


----------



## gamerMwM

RetroWave78 said:


> Heads up, B&H Photo is accepting pre-orders for 4090 Tuf, I just placed one. No guarantee as to when they will actually come in but this beats scouring everywhere else multiple times a day and it sure beats buying these for $3k on ebay. This very same pre-order is currently on ebay for like $2500 (at current auction value) so act fast!
> 
> 
> 
> https://www.bhphotovideo.com/c/product/1730937-REG/asus_tuf_rtx4090_24g_gaming_tuf_gmg_gefrc_rtx.html


Thanks for the heads up! Got one! And even if they are closed thru the 18th, with overnight shipping I should still get it in before end of next week. That's what I'm hoping for anyway. Next thing on my list is to order a Byski water block:



https://www.aliexpress.us/item/3256804656264743.html



The Byski block on my 3090 Asus Strix has been working great, and keeping that card cool with regular gaming overclocking for the past year, so no hesitations here especially at $122 USD


----------



## cletus-cassidy

dr/owned said:


> I dunno how much I trust the conductive pen. It's plausible the card is expecting to see certain ratios on the shunts. But then again there's also un-shuntable current sensing that happens that could also be limiting it. The 3090 had all sorts of Aux power limits in the bios.
> 
> The Gaming OC is definitely below the TUF. It "only" has 50A power stages. Although I haven't seen detail on whether the TUF has the same 70A stages the Strix has. The FE is also below the strix.


This review suggests it’s the same power stages: Asus TUF Gaming RTX 4090 OC Review en español (Análisis completo)


----------



## dante`afk

DokoBG said:


> Doesnt really matter guys. The top Chiller cooled card at like -35C average is like 10% difference to the cheapest air cooled max overclocked 4090... The top air cooled boards that can actually go around 3Ghz+ with 600W bios are about ~4.5% faster than the average air cooled card with the 480W bios... i mean, ***, this is literally sad results. Go to the Port Royal hall of fame and compare some graphics results - extreme cooling is really garbage with these 4090s so it almost doesn't really matter what you get. Lets hope some custom bioses can actually improve something in the future.


The 4090 is voltage starved. Unless you put an EVC2 on it, you won't see big numbers.


----------



## Roldo

Strix: 24x70A PS from Onsemi (FDMF3170) for GPU voltage + 4x of the same FDMF3170 for memory, MP2891 controller (for both GPU/VRAM) 

TUF: 18x70A PS from Infineon (TDA21570) for GPU + 4x50A PS from Vishay (SiC639), MP2888A controller from Monolithic Power


----------



## RetroWave78

gamerMwM said:


> Thanks for the heads up! Got one! And even if they are closed thru the 18th, with overnight shipping I should still get it in before end of next week. That's what I'm hoping for anyway. Next thing on my list is to order a Byski water block:
> 
> 
> 
> https://www.aliexpress.us/item/3256804656264743.html
> 
> 
> 
> The Byski block on my 3090 Asus Strix has been working great, and keeping that card cool with regular gaming overclocking for the past year, so no hesitations here especially at $122 USD


Sadly it's only a pre-order! It will probably be a few weeks, best case scenario, before B&H gets enough stock in to satisfy all the pre-orders, and seeing as how scalpers had already created listings off the B&H pre-order alone, we may be waiting for months depending on when exactly you pre-ordered. I don't remember seeing a quantity limit either, so scalpers could have gone willy-nilly on this listing and B&H could be sitting on our money for months.

This scalping bullshit makes me furious, and all because NGreedia have artificially limited supply due to 30 series overstock because of their shortsighted decision to sell hand-over-fist to miners. It's beyond ridiculous.

Edit: 

Yep, per the listing's quantity drop-down menu you can actually order 3, so guaranteed scalper ****s bought the whole lot of them. Maybe I should cancel my pre-order.


----------



## Morteen199

Roldo said:


> Strix: 24x70A PS from Onsemi (FDMF3170) for GPU voltage + 4x of the same FDMF3170 for memory, MP2891 controller (for both GPU/VRAM)
> 
> TUF: 18x70A PS from Infineon (TDA21570) for GPU + 4x50A PS from Vishay (SiC639), MP2888A controller from Monolith


so how much better is the Strix? Worth it over the TUF? And what's better, TUF or Gigabyte Gaming? What should I buy? xD


----------



## Lune

RetroWave78 said:


> One correction, the Guru3D review I referred to is actually the OC version, you may want to ascertain the power limit of non-OC Tuf before pulling the trigger on one.


No problem, I'll be happy with it either way. I was mostly just curious about it, not like I'd ever run 600W daily anyway. I'll be doing an undervolt or stock like you


----------



## gamerMwM

RetroWave78 said:


> Sadly it's only a pre-order! It will probably be a few weeks, best case scenario, before B&H get enough stock in to satisfy all the pre-orders, and seeing as how scalpers had already created listings of only the B&H pre-order, we may be waiting for months depending on when exactly you pre-ordered. I don't remember seeing a quantity limit either, so scalpers could have gone willy nilly on this listing and B&H could be sitting on our money for months.
> 
> This scalping bullshit makes me furious, and all because NGreedia have artificially limited supply due to 30 series overstock because of their shortsighted decision to sell hand-over-fist to miners. It's beyond ridiculous.
> 
> Edit:
> 
> Yep, listing quantity drop down menu you can actually order 3, so guaranteed scalper ****s bought the whole lot of them. Maybe I should cancel my pre-order.


That's possible, but there are indications that there could be some decent stock coming in on the Tuf series 4090. Newegg was allowing backorders the other night on the Tuf OC 4090 for a little bit and shut them off later at a certain point. I hadn't seen them do that with any other cards. Maybe the B&H situation will be the same once preorders hit a certain amount.

If I get a priority order notification from Best Buy this week thru geforce I'll probably get an FE and then cancel the Tuf 4090 if it hasn't shipped by then. If I get both in a timely fashion, I'll keep the better bin and send one back.


----------



## RetroWave78

gamerMwM said:


> That's possible, but there are indications that there could be some decent stock coming in on the Tuf series 4090. Newegg was allowing backorders the other night on the Tuf OC 4090 for a little bit and shut them off later at a certain point. I hadn't seen them do that with any other cards. Maybe the B&H situation will be the same once preorders hit a certain amount.
> 
> If I get a priority order notification from Best Buy this week thru geforce I'll probably get an FE and then cancel the Tuf 4090 if it hasn't shipped by then. If I get both in a timely fashion, I'll keep the better bin and send one back.


Best of luck to you, sadly my finances don't allow buying more than one card, I've been saving over the past 6 months and could barely swing the $1750 for the Tuf after taxes and shipping. Luckily the water block is already accounted for.

Had EKWB indicated that there was a Strix / Tuf WB to be released soon I would've just pulled the trigger on a Tuf from the beginning. The only reason I fixated on FE was the price and WB availability (and gorgeous air-cooler, but ultimately it's going under water).

I actually had a Strix in my cart at 6 am the 12th at Newegg but hesitated last minute and ultimately that was a good thing I think, if I can get a Tuf in the next month or so, because that $400 price difference is nearly what I have left in my checking account for the rest of the month.

I can't believe how expensive these things are now, this is pretty bad. I don't think I'm going to upgrade after this. 2x performance over 3090 (40k Timespy, 28k Port Royal), this thing will be future-proof for some time.


----------



## RetroWave78

Morteen199 said:


> so how much better is the Strix? Worth it over the TUF? And what's better, TUF or Gigabyte Gaming? What should I buy? xD


Not in my opinion, no. The reason the Strix is selling at $2k MSRP is that that's a bargain compared to what's available on eBay. The card absolutely does not warrant the $400 price difference unless you're keeping it on air, and even then, that's a lot of money for a nice oversized air-cooler. 

None of the dies seem to be binned this time around.


----------



## alitayyab

The latest version of Firestorm does allow +Vcore (up to 100mv). I know the Zotac isn't the best-constructed card, but it was the only one available locally (barring the Palit, which was more expensive). Raising the Vcore by 25mV in Firestorm results in Vcore rising to 1.65V (HWiNFO). Max clock freq. in Superposition is 3030MHz, varying between 3030 and 3000. Max power draw 411 watts.

Makes pointless difference in gaming though.


----------



## Roldo

Morteen199 said:


> so how much better is the Strix? Worth it over the TUF? And what's better, TUF or Gigabyte Gaming? What should I buy? xD


As long as the PCB is good enough (as in adequately designed for what it's supposed to do and thus reliable in the long term) you don't need to worry too much about it.
(unless you're into extreme overclocking)

I'd take the TUF over the Gigabyte but not because of the PCB differences, I just like the cooler and the all metal build more.

Better VRM and filtering is gonna net the average user very little in terms of OC

A good and silent cooler is more important to me than an over designed PCB

Regarding PCB quality though, Suprim X, Strix, FE seem to be the better ones for now (imo)

Honestly I feel there's too much of an emphasis on PCB quality in general.
People watch buildzoid videos/tier lists and start obsessing about buying the GPU/motherboard with the most powerful VRM, pay a premium for said product, and then run their CPU/GPU stock or with a run-of-the-mill OC that would have been handled just fine by pretty much anything from the midrange.


----------



## Pietro

MikeGR7 said:


> So to summarize, the ranking (IMHO) goes like this as far as ELECTRICAL QUALITY (not coolers or PL) is concerned:
> 
> 1. MSI SUPRIM X (LIQUID)
> 2. FOUNDERS EDITION
> 3. ASUS STRIX
> 4. GB MASTER (WATERFORCE)
> 5. GB GAMING OC
> 6. MSI TRIO X
> 7. ASUS TUF
> 8. GB WINDFORCE
> 9. ZOTAC, PALIT
> 
> IF COOLING AND PL IS PRIORITY THE LIST GOES LIKE THIS:
> 
> 1. GB MASTER
> 2. ASUS STRIX
> 3. GB GAMING OC
> 4. MSI SUPRIM X
> 5. ASUS TUF
> 6. FOUNDERS EDITION
> 7. MSI TRIO X
> 8. GB WINDFORCE
> 9. ZOTAC, PALIT


I disagree, electrically:
1. MSI SUPRIM X (LIQUID) - Vcore: 26x70A MPS mem:4x70A MPS
2. ASUS STRIX - Vcore 24x70A Onsemi, mem 4x70A Onsemi
3. FOUNDERS EDITION - Vcore: 20x70A MPS mem:3x70A MPS
4. Colorful Vulcan OC-V - Vcore: 24x55A Alpha&Omega, mem 4x55A
5. GB MASTER (WATERFORCE) - Vishay 24x50A, mem: 4x50A Vishay
6. ASUS TUF Vcore: 18x70A Infineon, mem: 4x50A Vishay
7. GB GAMING OC Vcore: 20x50A Vishay, mem 4x50A Vishay
8. MSI TRIO X - Vcore: 18x50A , mem:4x50A

If cooling is priority with being still decent electrically:
1. GB Waterforce with 360mm AIO
1.5. MSI Suprim Liquid X with 240mm AIO
3. Asus Strix
4. GB MASTER - first with 1800 rpm and 38dBA? Not really; what's the point of getting 59-60°C when the card is loud? Also check the width: at 162.8mm + connector it will only fit a few cases
5. MSI Suprim
6. Colorful Vulcan OC-V - probably not available in Europe and America; it's also quite big at 159mm width and won't fit with the connector, just like the Master
7. ASUS TUF - it has a good cooler, non-English reviews prove that, but it's a shame those cards have had the most coil-whine cases so far
8. GB GAMING OC - 60°C but the loudest card in the TPU review to get it; not so great on normalised tests, where at 35 dBA it gets 65°C vs 61°C on the Strix
9. MSI TRIO X - no vapor chamber
10. FOUNDERS EDITION

The rest are not worth it, being either poor electrically or cooling-wise compared to those on the list

Update due to new tests.


----------



## J7SC

smushroomed said:


> are riser cables okay to use? I'm reading reviews on Amazon and they say that most cables are pci 3.0


Yes - I already had a PCIe 4.0 riser cable in use for the 3090 Strix; I just transferred it & it works great with the 4090 (PCIe 4.0 x16). I use the type of riser in the spoiler (have two of those, from Amazon last year).


Spoiler


----------



## Clukos

mattskiiau said:


> Anyone know how to zoom out or increase Freq MHZ graph?
> Tried looking up keybinds but couldn't find anything.
> View attachment 2576214


Change VFCurveEditorMaxFrequency in the config where you installed Afterburner, I've set it to 4000 for example:
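For reference, the line lives in MSIAfterburner.cfg in the install folder; I'm quoting the [Settings] section name from memory, so just search the file for the key if yours is under a different header, and restart Afterburner after editing:

```ini
; MSIAfterburner.cfg - raises the V/F curve editor's frequency axis ceiling
[Settings]
VFCurveEditorMaxFrequency=4000
```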


----------



## MIST3RST33Z3

Clukos said:


> Change VFCurveEditorMaxFrequency in the config where you installed Afterburner, I've set it to 4000 for example:
> View attachment 2576246


Will it actually go up to 1.25, or is it still capped at 1.1?


----------



## Nizzen

Morteen199 said:


> so are the Tuf whit 18 power phases going do worse in overclocking? less safe whit only 18 power phases to overclock? and are there any real world difference between strix and tuf in gaming ?


Doesn't matter what card you are buying, unless you are volt-modding the card. Every model performs the same, ±2% depending on silicon lottery.


----------



## Clukos

MIST3RST33Z3 said:


> Will it actually go up to 1.25, or is it still capped at 1.1?


It's capped from the bios so it won't go above 1.1 afaik.


----------



## mirkendargen

My Canadian Gigabyte Gaming OC made it home and got #22 single GPU on Port Royal. Seems like a keeper, boosts to 3120 then tapers down to 3090 as the temperature rises. Memory can only do +1400 though, +1500 crashes.









I scored 28 202 in Port Royal


AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## MIST3RST33Z3

mirkendargen said:


> My Canadian Gigabyte Gaming OC made it home and got #22 single GPU on Port Royal. Seems like a keeper, boosts to 3120 then tapers down to 3090 as the temperature rises. Memory can only do +1400 though, +1500 crashes.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 202 in Port Royal
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


Wow, that is an amazing card! I haven't seen many people in the 3100 range.


----------



## J7SC

mirkendargen said:


> My Canadian Gigabyte Gaming OC made it home and got #22 single GPU on Port Royal. Seems like a keeper, boosts to 3120 then tapers down to 3090 as the temperature rises. Memory can only do +1400 though, +1500 crashes.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 202 in Port Royal
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


'grats  
...and speaking of the Canadian Gigabyte Gaming OC from the same store (now down to 2 available cards from the 25 or so they had initially) and shelf, I've been slowly working my way up on VRAM... don't know where the limit is yet (also re. efficiency)... going in 100 MHz steps, but hopefully I'll find out tomorrow. These cards seem to be great clockers; the only issue is hotspot temps beyond 500 W and 1.1 V... need a water-block, preferably from Phanteks or even Bykski (my last 3090 Strix EKWB block had multiple issues).


----------



## LukeOverHere

Hey guys, I ended up grabbing a Gainward Phantom RTX 4090 (GPU-RTX4090-24GB-Gainward-Phantom). I did a bit of research and found a write-up on the GS version, which performed really well; I'm hoping there is not much performance loss between the GS and non-GS versions, but hopefully this is irrelevant once we can flash the BIOS later. If there are any Australians on here, this seems to be the best-priced 4090 at $2799 AUD from TechFast. Luckily I had $200 cash back as well, so it cost me $2599 total, which is still savage, but at least slightly cheaper hahaha. Every other card I have found starts at $3000, with the Strix being $3800, which is insane... that's an additional $1000 over the Gainward card. I'm hoping the Gainward is quality; it looks good so far.


----------



## BeZol

LukeOverHere said:


> Hey Guys, I ended up grabbing a Gainward Phantom RTX 4090 (GPU-RTX4090-24GB-Gainward-Phantom). I did a bit of research and found a write up on the GS version which performed really well, I'm hoping there is not much performance loss between the GS & non-GS Versions, but hopefully this is irrelevant when we can flash the bios later..... If there is any Australians on here, this seems to be the best priced 4090 card at $2799 AUD from TechFast, Luckily I had $200 cash back as well so it cost me $2599 total which is still savage, but at least slightly cheaper hahaha . Every other card i have found starts at $3000, with the Strix being $3800 which is insane..... that's an additional $1000 over the Gainward card.... Im hoping the Gainward is quality, it looks good so far.


GS-version here:









Still tweaking (world #13 Graphics Score).
Imagine this with a Strix BIOS.
BTW my memory is stable up to +1400, depending on the GPU clock + voltage combination.


----------



## yt93900

WayWayUp said:


> could you even imagine how much asus would ask for a 4090ti kingpin? if the regular strix 4090 is $2k?
> 
> 2.8k? rofl


Wasn't the Galax HOF about 3.5-4k?


----------



## StreaMRoLLeR

Anyone able to force ReBAR active for the 4090 in 3DMark?


----------



## LukeOverHere

BeZol said:


> GS-version here:
> View attachment 2576274
> 
> 
> Still at tweaking. (world 13. Graphics Score)
> Imagine this with a Strix BIOS.
> BTW my memory is stable until +1400, depends on the gpu-clock + voltage combination.


This is awesome! I'm looking forward to messing around with this when it arrives! As a starting point I'm hoping I can grab the Phantom GS BIOS and flash it to the Phantom (non-GS) to get the minor boost if nothing better is available at the time. I completely agree about the Strix BIOS; this will be very interesting!

I can't see there being any major differences between the Phantom GS and Phantom PCBs, so I'm hoping I have not missed out on much buying the cheaper Phantom over the Phantom GS; I was balancing against the price, as it is already insane….


----------



## xcx xcxvgyt

BeZol said:


> GS-version here:
> View attachment 2576274
> 
> 
> Still at tweaking. (world 13. Graphics Score)
> Imagine this with a Strix BIOS.
> BTW my memory is stable until +1400, depends on the gpu-clock + voltage combination.


My GameRock non-OC is also stable around 3120 MHz and can bench memory at +1750, but the power limit is 450 W + 0%.

So I can't get more than 28139 in Port Royal; waiting for a BIOS mod.

That 3dmark.com run is without SystemInfo on; it's the best I can do with 450 W.

Plus, I can't update Futuremark SystemInfo for some reason and haven't been able to solve it yet.


----------



## Nizzen

tps3443 said:


> It’s kinda funny, because while I have owned my 3090KP HC for a long while now. It still feels super freaking powerful to this day. It is OCed to the absolute max.
> 
> The 4090 better be a worth while upgrade. Just saying.


3090 feels like 1050ti compared to 4090 now 🤣


----------



## xboxmingjai

I got the GB Master. Is it good?


----------



## yt93900

Nizzen said:


> 3090 feels like 1050ti compared to 4090 now 🤣


To be fair, it sometimes does 
Even without the raw benchmark numbers, the 4090 does feel a lot faster.


----------



## Nizzen

xboxmingjai said:


> I got GB master. is it good?


Same as every 4090.


----------



## BeZol

xcx xcxvgyt said:


> My gamerock non oc is also stable around 3120mhz and can bench mem+1750 but power limit is 450w+zero
> 
> So i can't get more than 28139 from port royal and waiting for bios mod
> 
> 3dmark.com without system info on that's the best i can with 450w
> 
> Plus, I can't update futuremark SystemInfo for some reason and unable to solve it yet


For me (during Time Spy Extreme testing):
1.0 V - 2940 MHz
1.02 V - 3000 MHz
1.06 V - 3075 MHz
1.1 V - 3105 MHz

and the memory is at +1400 (+1425 = crash; going to check later in +1 steps...)

My main issue now is that if I set 1.0 V / 2940 MHz, it goes to 2970 MHz... so I need to set the voltage/frequency curve to 1.0 V / 2910 MHz to actually get 1.0 V / 2940 MHz.
And this behaviour is different for every combination, so I need to watch how the GPU clock behaves...
But, for example, the 1.06 V / 3075 MHz setup is rock-stable; it doesn't overshoot by +15/30/45 MHz.

For me a Strix BIOS would mean that if I am not at the TDP limit at 1.1 V (which I don't know yet), then I could bench through TSE at a 3105 MHz GPU clock.
If higher voltage were possible, I could check how much voltage I need for 3120/3135/3150... MHz.

In Graphics Test 2, at the highest TDP demand, I drop down to 1.0 V (sometimes to 0.995 V) with the +1400 memory.

The RTX 4090 is memory-bandwidth-limited around 3200 MHz and +1500 memory, so there is not much room left with higher TDP and voltage (because of the memory).
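The overshoot compensation described above can be sketched as a tiny helper. This is a hypothetical calculation, assuming (as those numbers suggest) that Ada clocks move in 15 MHz bins and that the card lands a measured, repeatable amount above the requested curve point:

```python
# Hypothetical helper based on the observation above: NVIDIA clocks move in
# 15 MHz bins, and a requested V/F curve point can land one or more bins high.
BIN_MHZ = 15  # NVIDIA clock step size

def curve_point_for(target_mhz: int, observed_overshoot_mhz: int) -> int:
    """Return the curve frequency to request so the card settles at target_mhz,
    compensating for a measured overshoot (e.g. 30 MHz observed)."""
    request = target_mhz - observed_overshoot_mhz
    # snap to the 15 MHz grid the driver actually uses
    return round(request / BIN_MHZ) * BIN_MHZ

print(curve_point_for(2940, 30))  # -> 2910, matching the 1.0 V example above
```

The overshoot has to be re-measured per voltage point, since (per the post) it differs for every combination.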


----------



## Benni231990

From what we see now, the sweet spot is 3 GHz with low voltage.

Maybe if somebody has a golden sample, they can run 3 GHz at 1.0 V; that would be an absolute dream.


----------



## GTANY

ASUS RTX 4090 TUF review : (29) The Best Yet! Asus RTX 4090 TUF Gaming OC Review, Thermals, Power & Overclocking - YouTube 

Disappointing : the TUF memory is 12°C hotter than the Gigabyte Gaming OC. And in France, it is one of the most expensive models.


----------



## long2905

GTANY said:


> ASUS RTX 4090 TUF review : (29) The Best Yet! Asus RTX 4090 TUF Gaming OC Review, Thermals, Power & Overclocking - YouTube
> 
> Disappointing : the TUF memory is 12°C hotter than the Gigabyte Gaming OC. And in France, it is one of the most expensive models.


The Gigabyte Gaming OC shapes up to be the best one so far temp-wise.


----------



## ArcticZero

Has there been any news on custom blocks for MSI Suprim cards? I'm debating foregoing the extra HDMI port on the TUF/Strix models since the TUF seems to be the most in demand card here, and so is the Strix despite the price difference. I just want to be able to put it in my loop with a decent block.


----------



## Benni231990

I have sent my Suprim to Alphacool; they will scan the card and make a waterblock.

Maybe in 4 weeks we can all buy it.


----------



## Glottis

GTANY said:


> ASUS RTX 4090 TUF review : (29) The Best Yet! Asus RTX 4090 TUF Gaming OC Review, Thermals, Power & Overclocking - YouTube
> 
> Disappointing : the TUF memory is 12°C hotter than the Gigabyte Gaming OC. And in France, it is one of the most expensive models.


Gigabyte ran at a higher fan speed: 2300 RPM vs 2000 RPM. If fan speeds were normalized, memory temperatures would be closer. Anyway, this doesn't matter, as temps on the TUF are still excellent and well under FE temps. I would get the TUF any day simply because of the all-metal build quality and the extra HDMI 2.1 port. The Gigabyte card looks like a $500 entry model with that gaudy plastic shard. Also, I had a bad experience with Gigabyte GPUs.


----------



## yt93900

The Aorus Xtreme is also plastic-fantastic; however, I see it as a good thing - less weight on the PCI-E slot.


----------



## Daemon_xd

Der8auer posted his voltage-mod video with the Strix 4090.


----------



## yt93900

He says it hit the power limit, but didn't say a word about how high the limit actually was. Does the Strix cap at 600 W?


----------



## DokoBG

Yes


----------



## Benni231990

So now we know: if you want a higher boost you need a voltage mod AND a shunt mod, holy ****


----------



## yzonker

Well I feel a little better about this. Card is better than I thought at first.
















www.3dmark.com





For some reason Speed Way is slow on my system, though. It seems to just cap at the score below: going from +240 to +270 on the core did almost nothing, and there's no power limiting. Not sure what is going on there. I need to take a look at effective clocks maybe, although I didn't do anything different with this one than the others.
















www.3dmark.com





And some TS/TSE,









I scored 36 516 in Time Spy


Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com













I scored 18 531 in Time Spy Extreme


Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## Azazil1190

yzonker said:


> Well I feel a little better about this. Card is better than I thought at first.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Result not found
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> For some reason Speedway is slow on my system though. Seems to just cap at the score below. Going from +240 to +270 on the core did almost nothing. No power limiting. Not sure what is going on there. Need to take a look at effective clocks maybe, although I didn't do anything different with this one than the others.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Result not found
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> And some TS/TSE,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 36 516 in Time Spy
> 
> 
> Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 18 531 in Time Spy Extreme
> 
> 
> Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


Nice scores 👍
Which Asus card do you have?


----------



## Carillo

Anyone heard anything from Alphacool regarding a Strix/TUF waterblock ETA?


----------



## Pietro

GTANY said:


> ASUS RTX 4090 TUF review : (29) The Best Yet! Asus RTX 4090 TUF Gaming OC Review, Thermals, Power & Overclocking - YouTube
> 
> Disappointing : the TUF memory is 12°C hotter than the Gigabyte Gaming OC. And in France, it is one of the most expensive models.


Since Asus has a better cooler at the same noise levels, it's again them being cheap with thermal pads. At least Americans will be able to easily replace them with 15 W/mK Gelid GP-Ultimate thermal pads to reduce those temps by 15°C; in Europe that operation will void the warranty. That's one thing; the other is that a lot of people are reporting very irritating coil whine on TUFs, worse than it was in 2020/2021 with the Asus 30-series cards.



Benni231990 said:


> so now we know If you want higher boost you need a Voltage mod AND a Shuntmod holy ****


Well, posts in this thread have shown that it's all about the silicon lottery; there are plenty of cards that boost to over 3100 MHz in 3DMark after OC, and some do 3000 MHz out of the box.


----------



## dk_mic

Got me a Gaming X Trio.
What a nice upgrade from the 2080 Ti (which was a nice upgrade from the 980 Ti):










It wasn't easy to fit in the case (Meshify S2), but only because the water reservoir was in the way.
Remounted the res with some cable ties; the side panel has plenty of clearance.









Holding top 1 in Fire Strike and Time Spy for my hardware (5950X + 4090) right now 


















The card can run Heaven at 3090 or 3105 MHz, but I guess I'll run it with a 60-70% PL and some UV curve daily.

Here are some Cyberpunk FPS vs. power-limit and consumed-watt numbers:









BIOS is 480 W (106%).
The card has a dual BIOS, but with the same power limits.
Did all my tests with a Seasonic TX-750 and couldn't make it trip.
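The power-limit slider settings above can be put in watts. A back-of-envelope sketch, assuming the slider scales board power linearly and that the 106% maximum corresponds to the 480 W BIOS cap:

```python
# Convert an Afterburner power-limit slider % to watts on this card's BIOS,
# where the 106% maximum corresponds to 480 W (so 100% is ~453 W).
MAX_PERCENT = 106
MAX_WATTS = 480

def slider_to_watts(percent: float) -> float:
    """Linear mapping of the PL slider to board power in watts."""
    return MAX_WATTS * percent / MAX_PERCENT

for pl in (60, 70, 100, 106):
    print(f"{pl:>3}% -> {slider_to_watts(pl):.0f} W")
```

So the 60-70% daily range mentioned above works out to roughly 270-320 W.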

Waiting for a waterblock now


----------



## MikeGR7

Pietro said:


> I disagree, electrically:
> 1. MSI SUPRIM X (LIQUID) - Vcore: 26x70A MPS mem:4x70A MPS
> 2. ASUS STRIX - Vcore 24x70A Onsemi, mem 4x70A Onsemi
> 3. FOUNDERS EDITION - Vcore: 20x70A MPS mem:3x70A MPS
> 4. Colorful Vulcan OC-V - Vcore: 24x55A Alpha&Omega, mem 4x55A
> 5. GB MASTER (WATERFORCE) - Vishay 24x50A, mem: 4x50A Vishay
> 6. MSI TRIO X - Vcore: 18x70A MPS mem:4x70A MPS
> 7. ASUS TUF Vcore: 18x70A Infineon, mem: 4x50A Vishay
> 8. GB GAMING OC Vcore: 20x50A Vishay, mem 4x50A Vishay
> 
> If cooling is priority with being still decent electrically:
> 1. GB Waterforce with 360mm AIO
> 2. MSI Suprim Liquid X with 240mm AIO
> 3. Asus Strix
> 4. MSI Suprim
> 5. GB MASTER - first with 1800 rpm and 38dBA? - not really, what's the point of getting 59-60*C when card it's loud and check the width with 162,8mm + connector it will fit only few cases
> 6. Colorful Vulcan OC-V, probably not available in Europe and America, also is quite big 159mm width it won't fit with connector just like master
> 7. ASUS TUF it has good cooler, non English reviews prove that, but it's a shame that those cards so far had the most coil whine cases
> 8. GB GAMING OC 60 *C and the loudest card in TPU review to get that, no so great on normalised tests when on 35 dBA it gets 65 *C and Strix 61 *C
> 9. MSI TRIO X
> 10. FOUNDERS EDITION
> 
> The rest is not worth due to being either poor electically or cooling wise compared to those from list


I appreciate your input and opinion! 

The difference between your ranking and mine is that you evaluate the VRMs, while my ranking takes into account other parts like the input/output filtering capacitors (number - type - rating).

On the cooling side, I obviously didn't include the watercooled models, but since you did, I want to add the following comments:

- It seems easy to conclude that the GB Waterforce is cooler than the MSI Liquid due to the bigger rad (360 vs 240), but realize that MSI air-cools its VRM components, which obviously produce a lot of heat, so all the capacity of the 240 rad goes to core + mem. 
The GB Waterforce, on the other hand, uses the 360 rad to cool everything INCLUDING the VRM, which will of course negatively affect the core's temperature and offset the bigger rad's advantage to a degree.
So which is actually cooler will depend on fan quality and waterblock quality, and we need to wait for a direct comparison on this one for sure.

- There is no way the regular Suprim outperforms the GB Master, or even the GB Gaming, on the cooling side; MSI didn't upgrade their coolers enough this round. Regarding the supposedly higher noise on the GB models in the TPU review, people need to realize that the Gaming OC needed only 2 dBA more to outperform the Suprim on the cooling front. 
2 dBA is nothing, and even the Master's 38 dBA would not make any difference in real life compared to the 35 dBA used in the noise-normalized test unless side by side in a muted room. 
The main reason for the slight advantage on the Suprim's side up to 35 dBA is probably that it falls in the most efficient range of fan RPM.

- It is almost physically impossible for the TUF to be cooler than the GB offerings, since the cooler and fans are a lot smaller.
People get tricked by the weight of the card: IT'S MISLEADING. THE TUF AND SUPRIM WEIGH MORE BECAUSE OF THE METALLIC COSMETIC COVERS WHILE HAVING SMALLER ACTUAL HEATSINKS.

My 2 cents.
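For a sense of scale on those dBA figures, here's a quick sketch. dB is logarithmic in sound power, and a perceived doubling of loudness is usually put at around +10 dB:

```python
# Relative sound power for a given dB difference: ratio = 10^(delta/10).
def power_ratio(delta_db: float) -> float:
    return 10 ** (delta_db / 10)

print(f"+2 dB  -> {power_ratio(2):.2f}x sound power")   # ~1.58x
print(f"+3 dB  -> {power_ratio(3):.2f}x sound power")   # ~2x
print(f"+10 dB -> {power_ratio(10):.2f}x sound power")  # 10x, roughly 'twice as loud'
```

So a 2-3 dBA gap is a real increase in radiated power, but well below the ~10 dB jump most people would describe as twice as loud.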



xboxmingjai said:


> I got GB master. is it good?


Top tier! Enjoy it!



GTANY said:


> ASUS RTX 4090 TUF review : (29) The Best Yet! Asus RTX 4090 TUF Gaming OC Review, Thermals, Power & Overclocking - YouTube
> 
> Disappointing : the TUF memory is 12°C hotter than the Gigabyte Gaming OC. And in France, it is one of the most expensive models.


Read my comments above on the TUF's cooler; it's an excellent card overall with a nice metal build, but cooling-wise inferior to GB unless Asus manages to do magic.


----------



## yzonker

Azazil1190 said:


> Nice scores 👍
> Which asus card do you have?


TUF OC


----------



## ZealotKi11er

RDNA3 will be way more fun to OC with real gain from good cooling.


Daemon_xd said:


> Der8auer post his voltage mod video with STRIX 4090


So it's even more pointless to volt-mod the card. I can see the best silicon getting maybe 3150-3200 MHz with 1.2 V, but no more - and all this at probably 700-800 W.


----------



## yt93900

Can't even come close to that TUF 4090 Port Royal score; the Waterforce Xtreme is power-limited to hell in PR.















www.3dmark.com


----------



## RaMsiTo

....


----------



## yzonker

J7SC said:


> 'grats
> ...and speaking of Canadian Gigabyte Gaming OC from the same store (now down to 2 available cards from 25 or so they had initially) and shelf, I've been slowly working my way up on VRAM...don't know where the limit is yet (also re. efficiency)... going in 100 MHz steps, but hopefully find out tomorrow. These cards seem to be great clockers, only issue is hotspot temps beyond 500W and 1.1V...need a water-block, preferably from Phanteks or even Bykski (my last 3090 Strix EKWB block had multiple issues).
> View attachment 2576267


My hotspot delta is only 10-11C, not much different than my 3090 in the HKV block.


----------



## RaMsiTo

@yzonker 


you're very close


----------



## yt93900

Yeah, I seem to have no luck choosing hardware at all. I wanted to buy the best model because of the biggest cooler, and it turns out it has a crap power limit - same story with both Suprims. At least the Aorus has the tubing in the right spot, at the side and not at the front.


----------



## BeZol

ZealotKi11er said:


> RDNA3 will be way more fun to OC with real gain from good cooling.
> 
> 
> So even more pointless to volt mod the card. I can see the best silicon to get maybe 3150-3200 with 1.2v but no more. Also all this at probably 700-800w.


+ there is a memory-bandwidth limitation at around 3200 MHz and +1500 on the memory.
I don't know the exact numbers, but there are definitely some limitations at around 3000+ MHz and +1000 memory; my card was bandwidth-limited for sure.
I had to increase the memory so that I could get more score from the increased GPU clock; otherwise I scored less with a higher clock while still at +1000 on the memory.
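A rough sketch of the numbers behind that bandwidth ceiling, with assumptions stated loudly: the 4090's stock 384-bit bus at 21 GT/s gives 1008 GB/s, and I'm assuming the Afterburner offset applies to the half-rate memory clock, so +1500 adds about 3 GT/s to the effective data rate:

```python
# Back-of-envelope memory bandwidth for a 4090 (384-bit GDDR6X @ 21 GT/s stock).
# Assumption: the Afterburner memory offset is applied to the half-rate clock,
# so +1500 MHz means +3 GT/s effective (21 -> 24 GT/s).
BUS_BITS = 384
STOCK_GTPS = 21.0

def bandwidth_gbs(offset_mhz: float = 0.0) -> float:
    data_rate = STOCK_GTPS + 2 * offset_mhz / 1000  # effective GT/s
    return BUS_BITS / 8 * data_rate                  # GB/s

print(f"stock: {bandwidth_gbs(0):.0f} GB/s")    # -> 1008 GB/s
print(f"+1500: {bandwidth_gbs(1500):.0f} GB/s") # -> 1152 GB/s
```

That's roughly a 14% bandwidth gain to feed a core pushed ~20% past stock boost, which fits the observation that the core starts starving without a matching memory OC.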


----------



## Luda

Got my TUF OC to break 30k in TS, but this is on the same Windows install as my 3090 build; I just DDU'd and swapped cards. I'm hoping a fresh install, fresh paste, and finally moving the 10900K to an open loop with more heat capacity will help me move further up from 15th on the leaderboards! [+180 core and +1700 mem, for what it's worth]









I scored 30 377 in Time Spy


Intel Core i9-10900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## yzonker

Streamroller said:


> Anyone able to active force rebar for 4090 in 3dmark ?


Yes, seems to work. I picked up about 300 pts in PR.


----------



## Xavier233

Pietro said:


> I disagree, electrically:
> 1. MSI SUPRIM X (LIQUID) - Vcore: 26x70A MPS mem:4x70A MPS
> 2. ASUS STRIX - Vcore 24x70A Onsemi, mem 4x70A Onsemi
> 3. FOUNDERS EDITION - Vcore: 20x70A MPS mem:3x70A MPS
> 4. Colorful Vulcan OC-V - Vcore: 24x55A Alpha&Omega, mem 4x55A
> 5. GB MASTER (WATERFORCE) - Vishay 24x50A, mem: 4x50A Vishay
> 6. MSI TRIO X - Vcore: 18x70A MPS mem:4x70A MPS
> 7. ASUS TUF Vcore: 18x70A Infineon, mem: 4x50A Vishay
> 8. GB GAMING OC Vcore: 20x50A Vishay, mem 4x50A Vishay
> 
> If cooling is priority with being still decent electrically:
> 1. GB Waterforce with 360mm AIO
> 2. MSI Suprim Liquid X with 240mm AIO
> 3. Asus Strix
> 4. MSI Suprim
> 5. GB MASTER - first with 1800 rpm and 38dBA? - not really, what's the point of getting 59-60*C when card it's loud and check the width with 162,8mm + connector it will fit only few cases
> 6. Colorful Vulcan OC-V, probably not available in Europe and America, also is quite big 159mm width it won't fit with connector just like master
> 7. ASUS TUF it has good cooler, non English reviews prove that, but it's a shame that those cards so far had the most coil whine cases
> 8. GB GAMING OC 60 *C and the loudest card in TPU review to get that, no so great on normalised tests when on 35 dBA it gets 65 *C and Strix 61 *C
> 9. MSI TRIO X
> 10. FOUNDERS EDITION
> 
> The rest is not worth due to being either poor electically or cooling wise compared to those from list


Which cards do you think have the least coil whine so far?


----------



## bmagnien

yzonker said:


> Yes, seems to work. I picked up about 300 pts in PR.


What does forcing ReBAR on in 3DMark entail? I never knew it wasn't running if it was enabled on the system overall.


----------



## Baasha

Is the rumor true that Asus is making a 'halo' version of the Strix 4090 a la Kingpin? Any news on that?


----------



## Xavier233

MikeGR7 said:


> I appreciate your input and opinion!
> 
> The difference between your ranking and mine is that you evaluate on the VRMs but my ranking takes into account other parts like input - output filtering capacitors (number - type - rating).
> 
> On the cooling side, i obviously didn't include the watercooled models but since you did i want to add the following comments:
> 
> - It seems easy to conclude that GB waterforce is cooler than Msi liquid due to the bigger rad (360 vs 240) but realize that Msi aircools it's vrm components that obviously produce a lot of heat, so all the capacity of the 240 rad goes to core + mem.
> GB waterforce on the other hand uses the 360 rad to cool everything INCLUDING the VRM which will of course negatively affect the cores temperature and offset the bigger rads advantage to a degree.
> So which is actually cooler will depend on fan quality, waterblock quality and we need to wait for a direct comparison on this one for sure.
> 
> - There is no way the regular Suprim outperforms GB Master and not even GB gaming on the cooling side, MSI didn't upgrade their coolers enough this round. In regards to the supposed higher noise on GB models on TPU review people need to realize that the gaming OC needed only 2 dBA more to outperform the Suprim on the cooling front.
> 2 dBA is nothing and even the Masters 38dBA whould not make any difference in real life compared to the 35dBA used in the noise normalized test unless side by side in a muted room.
> The main reason for the slight advantage on Suprims side up to 35dBA is propably it falls in the most efficient range of fans rpm.
> 
> - It is almost impossible physically for the Tuf to be cooler than GB offerings since the cooler and fans are a lot smaller.
> People get tricked by the weight of the card: ITS MISLEADING, TUF AND SUPRIM WEIGH MORE BECAUSE OF THE METALIC COSMETIC COVERS WHILE HAVING SMALLER ACTUALL HEATSINKS.
> 
> My 2 cents.
> 
> 
> 
> Top tier! Enjoy it!
> 
> 
> 
> Read above my comments on the tufs cooler, excellent card overall with nice metal build but cooling wise inferior to GB unless ASUS manages to do magic.


I'm starting to think that the liquid-cooling solutions are useless for the 4090. Noise-wise, you still have 2 rad fans + the fan on the card itself, so a total of 3 fans spinning fast. Since rads by design hold a lot of heat themselves, they act like a heat sponge, IMO more so than water, and can hold high temps after a while. So if we have to spin those 3 fans fast anyway, what is the real difference between the liquid-cooled and the air-cooled cards?


----------



## yzonker

bmagnien said:


> What’s forcing rebar on in 3DMark entail? Never knew it wasn’t running if it was enabled on the system overall


You use NvidiaProfileInspector to force ReBAR to activate for all games/benches. You can probably find a guide if you Google it; this thread at least gives the steps.






Here is how to enable REBAR on non whitelisted apps/games


Made own thread for more visibility. 1 - Download Nvidia Inspector. Probably at least version 2.3 ideal. 2 - In Nvidia Inspector, look for the icon that kind of looks like a hammer 2nd from the right, if you hover over it, it says "Show unknown setting from Nvidia Predefined Profiles"...




forums.overclockers.co.uk


----------



## ArcticZero

Seeing the results everyone is having on this thread is making me seriously consider just tearing down my loop altogether and getting a nice air cooler for my 5950X. The primary reason I even built a loop around my system was to cool the rear memory chips effectively on my 3090 without having to put a ton of heatsinks on top of the backplate 😅


----------



## RetroWave78

Xavier233 said:


> Am starting to think that the liquid cooling solutions are useless for the 4090. noise wise, you still have 2 rad fans + the fan on the card itself, so a total of 3 fans spinning fast. Since Rads by design hold a lot of heat themselves, they act like a heat sponge (absorbent), IMO more so than water can hold high temps after a while. So if we have to spin those 3 fans fast anyway, what is the real difference between the liquid coolers vs the air-cooled cards?


You can dump the heat outside of the case. That's an anemic radiator, and in no way should it represent how good liquid cooling is. A 4090 will run in the high 30s at 300-350 W with an undervolt, given two medium-to-thick 360/420 rads shared with the CPU. My 4090, sharing a loop with a 12900K @ 5.1 GHz all-P-core (250 W), runs at 42°C at 40% fan speed on the rads. The entire system is near dead silent.

Pushing the heat out of the case allows for better system memory overclocking. In an experiment I did last year, I compared temps between setting up the rads as exhaust (negative pressure) with the rear 140mm fan as intake, and the rads as intake (as per consensus / conventional wisdom). The difference with pushing the heat out of the case: VRAM temps lower by 5°C (about 5°C lower across all the components on the card, including power delivery), system memory temps lower by 10-15°C, and PCI-E M.2 temps lower by 20°C:


----------



## yzonker

ArcticZero said:


> Seeing the results everyone is having on this thread is making me seriously consider just tearing down my loop altogether and getting a nice air cooler for my 5950X. The primary reason I even built a loop around my system was to cool the rear memory chips effectively on my 3090 without having to put a ton of heatsinks on top of the backplate 😅


I'll most likely block mine, if for no other reason than to justify the silly amount of money I have in my loop. Lol

Seriously though, the air cooler is so good that I don't feel in any hurry to get a block. It's really only audible at 500 W or more, and even then not loud at all. This is so much better than the 30 series in many ways.

I set up a temp sensor inside the case below the video card and use it to control my case fans.


----------



## MikeGR7

yt93900 said:


> Yeah I seem to have no luck choosing hardware at all, wanted to buy the best model because of the biggest cooler, turns out it has a crap power limit - same story with both SUPRIMs. At least the Aorus has the tubing in the right spot, at the side and not at the front.


If you're from Europe i am more than willing to buy this card from you in case you're disappointed!



Xavier233 said:


> Am starting to think that the liquid cooling solutions are useless for the 4090. noise wise, you still have 2 rad fans + the fan on the card itself, so a total of 3 fans spinning fast. Since Rads by design hold a lot of heat themselves, they act like a heat sponge (absorbent), IMO more so than water can hold high temps after a while. So if we have to spin those 3 fans fast anyway, what is the real difference between the liquid coolers vs the air-cooled cards?


It still has a temperature advantage, but compared to high-end air models like the Asus Strix and GB Master the gap is smaller than ever.

Mostly in the 5-8°C range.


----------



## Xavier233

RetroWave78 said:


> You can dump the heat outside of the case. That's an anemic radiator and in no way should represent how good liquid cooling is. 4090 will run in the high 30's @ 300-350w with an undervolt with two medium to thick 360 / 420 rads shared with the CPU. My 4090, shared with 12900k @ 5.1 GHz all P-Core (250w) runs at 42C with at 40% fan speed on the rads. The entire system is near dead silent.
> 
> Pushing the heat out of the case allows for better system memory overclocking. In an experiment I did last year I compared temps between setting up rads as exhaust (negative pressure) with rear 140mm fan as intake and with rads as intake (as per consensus / conventional wisdom). The difference was lower VRAM temps by 5 c and, lower mem temps by 10c and lower PCI-E M.2 by 15-20C pushing the heat out of the case:


I already have a 360mm rad for my 12900K through an AIO at the top of my case. So if I wanted a liquid-cooled 4090, the only places for that 240mm rad would be the front of the case or the other side panel. In both placements I'm still dumping hot air outside the case. But air-cooled cards ALSO dump hot air outside the case, so I'm not really seeing the benefit now. Plus, those blower fans on the liquid-cooled cards are not the quietest, nor are the twin fans on the rad itself. As far as the GPU goes, the only difference is about 10°C cooler on the core, but does 60°C vs 70°C really matter if the overall noise level is the same?


----------



## Xavier233

MikeGR7 said:


> If you're from Europe i am more than willing to buy this card from you in case you're disappointed!
> 
> 
> 
> It still has a temperature advantage but compared to high end air models like Asus Strix and GB Master the gap is smaller than ever.
> 
> Mostly in the 5 - 8c range.


Exactly. So 8°C versus the extra $$, and it's STILL not dead silent with the 2 rad fans + the fan on the card itself. In the end, it's still 3 fans (liquid-cooled) vs 3 fans (air-cooled).


----------



## Xavier233

He covers all this here:


----------



## RetroWave78

Xavier233 said:


> I already have a 360mm rad for my 12900K through an AIO on the top of my case. So if I wanted to get a liquid cooled 4090, the only places for that 240mm rad would be: to dump air in the front of the case, or, through the other side panel. In both placements, am still dumping hot air outside the case. But air cooled cards ALSO have to dump hot air outside the case, so am not really seeing the benefit now. Plus those blower fans on the liquid cooled cards are not the quietest, nor the twin fans on the rad itself. As far as the GPU goes, the only difference is about 10C cooler on the GPU core, but does 60C vs 70C really matters if the "overall" noise level is the same?


Set up the front rad as exhaust and turn your rear 140mm around as intake. How are air-cooled cards able to dump heat outside of the case, unless it's a test bench / Thermaltake P5?

MSI cheaped out with the rad; a 240x25 rad (slim by water-cooling standards) isn't going to cool better than a Vulcan or Strix air cooler of comparable surface area (basically you have the equivalent of a 240mm-rad air cooler attached directly to the GPU - have fun with the GPU sag). They should have gone with a 360mm rad like Gigabyte, but I understand why they went with a 240: many cases cannot accommodate a 360 rad, and people gravitate towards the Suprim X Liquid because it's technically the shortest 4090 available. You could probably squeeze it into an SFF case.


----------



## DokoBG

Looking at the way these things clock, spending money on a custom water block for these cards is really not worth it this generation.


----------



## energie80

Water cooling is a lifestyle 😅


----------



## Xavier233

RetroWave78 said:


> Set up the front rad as exhaust and turn your rear 140mm around as intake. How are air-cooled cards able to dump heat outside of the case unless it's a test bench / Thermaltake P5?
> 
> MSI cheaped out with the rad; a 240x25 rad (slim by water-cooling standards) isn't going to cool better than a Vulcan or Strix air cooler of comparable surface area (basically you have the equivalent of a 240mm-rad air cooler attached directly to the GPU; have fun with the GPU sag). They should have gone with a 360mm rad like Gigabyte, but I understand why they went with 240: many cases cannot accommodate a 360 rad, and people gravitate towards the Suprim X Liquid because it's technically the shortest 4090 available. You could probably squeeze it into an SFF case.


Ya, that could be one way to do it, but it still does not address the "overall" noise levels of 3 fans for liquid vs 3 fans for air-cooled.

With an air-cooled card, it naturally dumps some of that hot air through the back of the card, but you also have the 2x140mm front fans + the rear 120mm fan continuously bringing in cold air and pushing it out the back anyhow. With that, the air inside the case is actually cool to the touch.


----------



## yt93900

I think the tube placement on the Suprim is the dealbreaker; I've already had trouble fitting a 2080 Ti KP because of the tubing at the front.
The Aorus is ideal if you want the rad at the front or top of the case. I have the 360 sitting as intake and the side panel off; dumping 400W+ into the case feels suicidal. I know it would work except for prolonged benchmarking, but it's still unnecessary heat buildup on the MB, CPU (420 AIO) and RAM.

There is also one more model that has a waterblock w. integrated pump and tube sockets on the side...something from Inno3D?

+
Bummer, some photos on the web showed it without tubing but it seems to have an ordinary AIO:





Discover with us the INNO3D GeForce RTX 4090 ICHILL BLACK with its AIO water cooling - Graphics cards

A fifth custom GeForce RTX 4090 has arrived at the farm, a new model sold by an NVIDIA partner..., article 83584

www.cowcotland.com




Still it has a thicker rad than the Suprim.


----------



## dante`afk

DokoBG said:


> Looking at the way these things clock, spending money on a custom water block for these cards is really not worth it this generation.


this applies to any model; there are no good OCers and bad ones. Just get the cheapest model and be done.


----------



## Pietro

Xavier233 said:


> Am starting to think that the liquid cooling solutions are useless for the 4090. noise wise, you still have 2 rad fans + the fan on the card itself, so a total of 3 fans spinning fast. Since Rads by design hold a lot of heat themselves, they act like a heat sponge (absorbent), IMO more so than water can hold high temps after a while. So if we have to spin those 3 fans fast anyway, what is the real difference between the liquid coolers vs the air-cooled cards?


Look here: an Aorus Xtreme Waterforce (360mm AIO) overclocked to 3000MHz with an undervolt, pulling 400-410W in Cyberpunk with ray tracing at 46-48°C on the core and up to 59°C hotspot. You're not getting that on a Strix, Suprim or Master:


----------



## mirkendargen

ArcticZero said:


> Seeing the results everyone is having on this thread is making me seriously consider just tearing down my loop altogether and getting a nice air cooler for my 5950X. The primary reason I even built a loop around my system was to cool the rear memory chips effectively on my 3090 without having to put a ton of heatsinks on top of the backplate 😅


I'm a special snowflake, but I can't get a block fast enough, lol. My pumps/triple180 radiator are in the basement, so I'm used to an *actually* silent computer. I just have case fans running at a super low speed for the mobo/m.2/etc., and an AX1600i that's super quiet. Even using the stock power limit and voltage, a 4090 is definitely *not* anything close to the silent I'm used to on air, lol.


----------



## cletus-cassidy

RetroWave78 said:


> You can dump the heat outside of the case. That's an anemic radiator and in no way should it represent how good liquid cooling is. A 4090 will run in the high 30s @ 300-350W with an undervolt and two medium-to-thick 360/420 rads shared with the CPU. My 4090, shared with a 12900K @ 5.1 GHz all P-core (250W), runs at 42C at 40% fan speed on the rads. The entire system is near dead silent.
> 
> Pushing the heat out of the case allows for better system memory overclocking. In an experiment I did last year, I compared temps between setting up rads as exhaust (negative pressure) with the rear 140mm fan as intake, and with rads as intake (per consensus / conventional wisdom). Pushing the heat out of the case lowered VRAM temps by 5C (about 5C lower across all the components on the card, including power delivery), system memory temps by 10-15C, and the PCI-E M.2 by 20C:


I think one exception to your findings is if you have water-cooled RAM. I echo what folks have said above and before: water cooling does not make financial sense, but it does yield (perhaps slightly) better performance.


----------



## stefxyz

It's always worth it for silence (unless you're unlucky with coil whine…). Of course, for silence you need something bigger than a 360, but with 2x 480mm rads and the new Noctuas in tandem with a double pump it's basically inaudible.
I do agree, however, that this generation it's not as bad as it used to be. I still can't believe anyone could live with a 2080 Ti or 3090 FE card; it sounds like my hair dryer.


----------



## RetroWave78

Xavier233 said:


> Ya that could be one way to do it, but it still does not address the "overall" noise levels of 3 fans for liquid vs 3 fans for air-cooled.
> 
> With an air cooled card, it naturally dumps some of that hot air through the back of the card, but, u also have 2x140mm front fans + the rear 120mm fan which continuously brings in cold air and dumps it in the back anyhow. With that, the air inside the case is actually cool to the touch


I've had AIO-cooled cards; the Kraken G15 + Corsair H55 was OK, but I tried to jury-rig EVGA's 120x25mm cooling solution onto a 1080 Ti early on, and that thing was ridiculously noisy, with a lot of the noise coming from the pump itself when the card was under load.

Optimum Tech didn't observe a noisy pump. 

Water cooling doesn't yield "slightly" better temps; it absolutely destroys air cooling, and once you get the cost of the rads, pumps and CPU block out of the way, the cost of liquid cooling is only the cost of a new block for each new component.

A 4090 FE / TUF under a full water block @ $1850 ($1600 + $250) will annihilate a Strix on air @ $2k, because you can get away with aggressive undervolting once you bring the core down near 40C, to say nothing of the memory modules, which also benefit from running cooler (see Tech Yes City's recent undervolting video).

It isn't a "slight" performance difference. Also, all of that heat is expelled from the case. As I show in my testing in my previous post, expelling the heat from the case resulted in -5C across GPU core, memory and power delivery, -10-15C across system memory, and -20C across a Sabrent Rocket 2TB PCI-E M.2, and that's dead silent, near inaudible. The air coolers make noise and are nowhere near as quiet as a loop with gobs of water at 35% fan speed.


----------



## RetroWave78

cletus-cassidy said:


> I think one exception to your findings is if you have water cooled RAM. I echo what folks said above and what others have said before. Water cooling does not make financial sense. But it does yield (perhaps slightly) better performance.


My RAM isn't and was not water cooled during that testing. Go run your 4090 at 450W for 15 minutes and open the case; that 20C increase in temp within the case saturates all of the other components: the motherboard, the SSDs, the PSU (if intaking from inside the case, like my ASUS ROG Thor).

As I show in my testing in my previous post, expelling the heat from the case resulted in -5C across GPU core, memory and power delivery, -10-15C across system memory, -20C across Sabrent Rocket 2TB PCI-E M.2, and that's dead silent, near inaudible 35% fan speed on the rads.


----------



## StreaMRoLLeR

yzonker said:


> Yes, seems to work. I picked up about 300 pts in PR.


-


----------



## MikeGR7

yt93900 said:


> I think the tube placement on the Suprim is the dealbreaker; I've already had trouble fitting a 2080 Ti KP because of the tubing at the front.
> The Aorus is ideal if you want the rad at the front or top of the case. I have the 360 sitting as intake and the side panel off; dumping 400W+ into the case feels suicidal. I know it would work except for prolonged benchmarking, but it's still unnecessary heat buildup on the MB, CPU (420 AIO) and RAM.
> 
> There is also one more model that has a waterblock w. integrated pump and tube sockets on the side...something from Inno3D?
> 
> +
> Bummer, some photos on the web showed it without tubing but it seems to have an ordinary AIO:
> 
> 
> 
> 
> 
> Discover with us the INNO3D GeForce RTX 4090 ICHILL BLACK with its AIO water cooling - Graphics cards
> 
> A fifth custom GeForce RTX 4090 has arrived at the farm, a new model sold by an NVIDIA partner..., article 83584
> 
> www.cowcotland.com
> 
> 
> 
> 
> Still it has a thicker rad than the Suprim.


Damn that card looks insanely good!
It clearly sports an Arctic AIO; you can see the BioniX fans!

Thanks for bringing this one up, i bet it outperforms the other two thermally!


----------



## Xavier233

DokoBG said:


> Looking at the way these things clock, spending money on a custom water block for these cards is really not worth it this generation.





RetroWave78 said:


> I've had AIO cooled cards, Kraken G15 + Corsair H55 was ok, but I tried to jury rig EVGA's 120x25mm cooling solution to a 1080 Ti early on and that thing was ridiculously noisy with a lot of the noise coming from the pump itself when the card was under load.
> 
> Optimum Tech didn't observe a noisy pump.
> 
> Water cooling doesn't yield "slightly" better temps, it absolutely destroys air cooling and once you get the cost of the rads, pumps and CPU block out of the way the cost of liquid cooling is only the cost of a new block for a new component.
> 
> A 4090 FE / Tuf under full water block @ $1900 ($1600 + $250) will annihilate a Strix on air @ $2k because you can get away with aggressive undervolting once you bring the core down near 40C, to say nothing of the memory modules which also benefit from running cooler (see Tech Yes City's recent undervolting video).
> 
> It isn't a "slight" performance difference. Also, all of that heat is expelled from the case. As I show in my testing in my previous post, expelling the heat from the case resulted in -5C across GPU core, memory and power delivery, -10-15C across system memory, and -20C across a Sabrent Rocket 2TB PCI-E M.2, and that's dead silent, near inaudible. The air coolers make noise and are nowhere near as quiet as a loop with gobs of water at 35% fan speed.


I just got rid of my water loop, and I dont want to go back to it, no matter what. 

If I go the liquid 4090 route, the 3 fans on it (rad + the one on the card itself) have to be much quieter than an air-cooled 4090. The difference between 50C and 70C is not what matters to me (or to most), but rather: is the liquid-cooled 4090 *quieter* than air-cooled 4090s overall? If it's within 2-3 dB all things considered, then nah, I will stick with an air-cooled 4090, adjust the fan curves, and let the GPU run a bit hotter.

Anyone know what the max GPU temp allowed is on this gen?


----------



## MikeGR7

Pietro said:


> Look here: an Aorus Xtreme Waterforce (360mm AIO) overclocked to 3000MHz with an undervolt, pulling 400-410W in Cyberpunk with ray tracing at 46-48°C on the core and up to 59°C hotspot. You're not getting that on a Strix, Suprim or Master:


Here is a 4090 Suprim X hovering around 50C without an undervolt.
Not doubting the water model would win, but I don't think by more than 5-7C from what I see.


----------



## Roldo

MikeGR7 said:


> Here is a 4090 Suprim X hovering around 50C without an undervolt.
> Not doubting the water model would win, but I don't think by more than 5-7C from what I see.


I've seen many of their vids.
They're running a custom fan curve with the fans usually between 75-85%, far from the default one.
Here's another example where you can see the fan speeds:


----------



## MikeGR7

Here is another one for those, like me, waiting for the 13900K/4090 combo.
AMD had its fun dealing with the lower-clocked Alder Lake parts; how about a fairer comparison:


----------



## dante`afk

Nice fake channel and benchmark.

Gotta love how they emerge out of nothing every release.


----------



## originxt

Anyone see the TIM used for the FE 4090? Honeywell PTM7950. They claim it's almost as good as LM, but I'm curious about its actual performance compared to a good thermal paste like KPx. If we want to waterblock, do we need to find the same material to repaste?


----------



## inedenimadam

DokoBG said:


> Looking at the way these things clock, spending money on a custom water block for these cards is really not worth it this generation.


Nothing about custom loops is sensible. It's a fool's game building custom loops when air cooling gets 90% of the performance for 10% of the price. But for many of us, that doesn't matter. We want quiet, cool, and pretty. You can't take the money with you when you die, so might as well enjoy the hobby, you know?


----------



## inedenimadam

originxt said:


> Anyone see the TIM used for the FE 4090? Honeywell PTM7950. They claim it's almost as good as LM, but I'm curious about its actual performance compared to a good thermal paste like KPx. If we want to waterblock, do we need to find the same material to repaste?


I used KPx on the die when I was messing around with the shunt resistors. No noticeable change in thermal performance, but I honestly wasn't looking at thermals beyond a quick glance to see if I screwed up. Whatever TIM is being used on the Gaming X Trio is not only really good but very well applied at the factory. The factory TIM and application are as good as my best spread with KPx.


----------



## MikeGR7

dante`afk said:


> Nice fake channel and benchmark
> 
> gotta love how they emerge out of nothing every release


Yeah, yeah, every time I see people say the same thing, and every time it turns out legit.
Same thing with that Mike Benchmark channel that had a 4090 running with OSD and telemetry visible days before anyone else: everyone said fake, and then release came and, guess what, it was perfectly legit and in line with all the rest.

You guys need to stop being overly suspicious; there are more parts in circulation weeks before the review NDA lifts, and not everything belongs to JayzTwoCents.

The video looks perfectly normal to me.


----------



## MIST3RST33Z3

I have noticed that the lower my temps are, the better stability I get at higher clocks. I am excited to get the Optimus block to see how high I can push the clocks and still have it be stable while gaming. I can hit 3075 while playing Cyberpunk maxed out at 5120x1440, but as temps approach 60C it comes down to 3050.

If I can keep the card below 50C, I think I can hit 3100 and have it stable in games.
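The clock droop described above matches the temperature-stepped boost behaviour people have reported on recent NVIDIA cards. Here's a rough sketch of that model; the ~15 MHz step per ~5C bin above ~45C is a community-observed assumption, not an official spec, and the function name and thresholds are mine:

```python
# Rough model of temperature-stepped GPU boost. ASSUMPTIONS: ~15 MHz lost
# per temperature bin, bins ~5 C wide starting around 45 C. These numbers
# are community-observed on Ampere/Ada, not published by NVIDIA.
def boost_clock(max_clock_mhz: float, temp_c: float,
                bin_start_c: float = 45.0, bin_width_c: float = 5.0,
                step_mhz: float = 15.0) -> float:
    """Return the modelled sustained clock at a given core temperature."""
    bins = max(0, int((temp_c - bin_start_c) // bin_width_c))
    return max_clock_mhz - bins * step_mhz

print(boost_clock(3075, 40))  # cool core: full 3075 MHz
print(boost_clock(3075, 58))  # approaching 60 C: 3045 MHz, two bins down
```

With those assumed numbers, a card holding 3075 MHz at 40C steps down to ~3045 MHz near 60C, which lines up with the 3075 → 3050 droop reported above.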


----------



## cletus-cassidy

RetroWave78 said:


> My RAM isn't and was not water cooled during that testing. Go run your 4090 at 450W for 15 minutes and open the case; that 20C increase in temp within the case saturates all of the other components: the motherboard, the SSDs, the PSU (if intaking from inside the case, like my ASUS ROG Thor).
> 
> As I show in my testing in my previous post, expelling the heat from the case resulted in -5C across GPU core, memory and power delivery, -10-15C across system memory, -20C across Sabrent Rocket 2TB PCI-E M.2, and that's dead silent, near inaudible 35% fan speed on the rads.


I believe you, as I've done it both ways. However, since I DO WC my RAM (in addition to CPU and GPU), doing intake for my rads (I run 1680mm of rads) gives better performance for me.


----------



## cletus-cassidy

originxt said:


> Anyone see the TIM used for the FE 4090? Honeywell PTM7950. They claim it's almost as good as LM, but I'm curious about its actual performance compared to a good thermal paste like KPx. If we want to waterblock, do we need to find the same material to repaste?


I can't speak to its performance on desktop parts, but I replaced the TIM in my Blade 14 (on both the 5800HX and the 3080) and get better temps than with high-performance paste. More importantly, there is no pump-out effect because it's a pad. It's also a bear to buy. I have some left and can try it on my 4090 when I get my Bykski block (a holdover until I get the Optimus).


----------



## Daemon_xd

MikeGR7 said:


> Damn that card looks insanely good!
> It clearly sports an Arctic cooling aio you can clearly see the Bionix fans!
> 
> Thanks for bringing this one up, i bet it outperforms the other two thermally!


There is actually a review of the card on the same site, but it is in French. I read the Google-translated version, and this card seems pretty good.





Test INNO3D GEFORCE RTX 4090 ICHILL BLACK: an AIO for Ada Lovelace! - Introduction, page 1

Let's move on to the RTX 4090 ICHILL BLACK offered by INNO3D, which has the particularity of featuring AIO water cooling; the card... - Introduction, page 1

www.cowcotland.com


----------



## Nizzen




----------



## MikeGR7

Daemon_xd said:


> There is actually a review of the card on the same site but it is in french. I read google translated version and this card seems pretty good
> 
> 
> 
> 
> 
> Test INNO3D GEFORCE RTX 4090 ICHILL BLACK: an AIO for Ada Lovelace! - Introduction, page 1
> 
> Let's move on to the RTX 4090 ICHILL BLACK offered by INNO3D, which has the particularity of featuring AIO water cooling; the card... - Introduction, page 1
> 
> www.cowcotland.com


57C at full load with fans @700 RPM = yes please!!
Safe to say with fans @1500 it's easily below 50C.


----------



## J7SC

stefxyz said:


> It’s always worth it for silence (unless u r unlucky with coil whine…). Of course for silence you need something bigger than a 360 but with 2 480mm rads and the new noctuas in tandem with a double pump it’s basically inaudible.
> I do agree however that this generation it’s not as bad as it used to be. I still can’t believe anyone could live with a 2080ti or 3090 fe card it sounds like my hair dryer.


When water blocks for my GPU from Phanteks & others are available, I will plug it into my current loop (3x D5s, 1320x63 triple-core rads) = sunk costs already. This build has a side-by-side 2x ATX mobo in one case and is a 'work and play' build, with rads for the second, similar-sized loop also about 8 feet away. There is no way I can fit the behemoth 4090 air cooler in there unless I come up with a 'Tardis' solution. I don't want to hear the systems at all whether working or playing, and w-cooling delivers that, apart from the sustained OC gains. For gaming at stock settings, these cards probably won't need w-cooling, but for serious non-sub-zero benching, they definitely do, IMO. A 500W+ 4nm hotspot is very hard to cool by air in normal ambient conditions.
---
I enjoyed the der8auer vid linked above on volt-modding the Strix with the ElmorLabs EVCxx, as he had already telegraphed that last week, per prior posts. Also: the *Strix* may be expensive, but it has additional voltage check points even over the TUF OC. Unless you are into continuous benching for the 3DMark HoF or HWBOT and are thinking about sub-ambient if not sub-zero, most 4090 models will deliver the same out-of-the-box experience, with what appears to be a 4-5% 'random' spread... but the Strix is currently the best option for more committed benchers since there is no KPE (not sure what Galax will bring, though). W-cooling will definitely help to unlock performance beyond a certain core voltage / PL for most models. I can just see the day when XOC vBIOSes get leaked into the wild... 🥴 Hopefully they keep some fail-safes intact.
---
Interestingly enough, the reviews of various vendor models show PCBs with some power stages missing even on upper-midrange models. Apart from adding those in for super-duper 4090 (Ti) follow-ups, they could also increase from 50A/70A power stages to 90A and beyond. So we may yet see an Asus Matrix, MSI Lightning & co...
---
I posted screenies last night showing that my Gigabyte Gaming OC was sustaining over 3100 MHz core; hotspot limits are the 'bad guy', as there was a lot of core voltage and PL left on the table. I now have VRAM up to +1459, and so far no crashes _and_ increased scores. I suspect I'm close on VRAM, though.


----------



## yzonker

J7SC said:


> When water blocks for my GPU from Phanteks & others are available, I will plug it into my current loop (3x D5s, 1320x63 triple-core rads) = sunk costs already. This build has a side-by-side 2x ATX mobo in one case and is a 'work and play' build, with rads for the second, similar-sized loop also about 8 feet away. There is no way I can fit the behemoth 4090 air cooler in there unless I come up with a 'Tardis' solution. I don't want to hear the systems at all whether working or playing, and w-cooling delivers that, apart from the sustained OC gains. For gaming at stock settings, these cards probably won't need w-cooling, but for serious non-sub-zero benching, they definitely do, IMO. A 500W+ 4nm hotspot is very hard to cool by air in normal ambient conditions.
> ---
> I enjoyed the DerBauer vid linked above on volt-modding the Strix w/ ElmorLabs EVCxx as he had already telegraphed that last week, per prior posts. Also: The *Strix* may be expensive, but it has additional voltage checking points even over the TUF OC. Unless you are into continuous benching for 3DM HoF or Hwbot and are thinking about sub-ambient if not sub-zero, most 4090 models will deliver the same-out-of-the box experience with what appears to be a 4% - 5% 'random' spread...but the Strix is the currently best option for more committed benchers since there is no KPE (not sure what Galax will bring, though). W-cooling will definitely help to unlock performance beyond a certain CPUv / PL for most models. I can just see the day when XOC vbios get leaked into the wild...🥴hopefully, they keep some fail-safe intact
> ---
> Interestingly enough, the reviews of various vendor models show PCBs with some power stages missing even on upper-midrange models. Apart from adding those in for super-duper 4090 (+TI) follow-ups, they could also increase from 50A, 70 A to 90 A and beyond. So we may yet see an Asus Matrix, MSI Lightning & co...
> ---
> I posted screenies last night that my Gigabyte Gaming OC was sustaining over 3100 MHz core - hotspot limits are the 'bad guy' as there was a lot of core-V and PL left on the table. I now got VRAM up to +1459, and so far no crashes _and_ increased scores. I suspect that I'm close on VRAM, though.


You know, I haven't seen any performance regression from mem OC like we saw with the 30 series. Performance seems to keep increasing until it artifacts and/or crashes.


----------



## mirkendargen

yzonker said:


> You know, I haven't seen any performance regression from mem OC like we saw with the 30 series. Performance seems to keep increasing until it artifacts and/or crashes.


I need to play with the performance implications of the ECC checkbox that's shown up in NVCP now.


----------



## MikeGR7

yzonker said:


> You know, I haven't seen any performance regression from mem OC like we saw with the 30 series. Performance seems to keep increasing until it artifacts and/or crashes.


Yes, because ECC is now turned off by default and has a toggle in the drivers.
I much prefer this classic approach since it's much easier to actually find the stable frequency.


----------



## WayWayUp

Even more than temp, power limit, or voltage, it seems like memory is the big bottleneck.
I'm thinking a 4090 Ti could be an even bigger increase than most already suggest, especially with more headroom.


----------



## ZealotKi11er

WayWayUp said:


> Even more than temp, power limit, or voltage, it seems like memory is the big bottleneck.
> 
> I'm thinking a 4090 Ti could be an even bigger increase than most already suggest, especially with more headroom.


Why would a 4090 Ti be a bigger increase if memory is the bottleneck?


----------



## J7SC

mirkendargen said:


> I need to play with the performance implications of the ECC checkbox that's showed up in NVCP now.


...I was just about to ask: what ECC setting do you prefer (so far, I have left it at the default)?


----------



## mirkendargen

ZealotKi11er said:


> Why would a 4090 Ti be a bigger increase if memory is the bottleneck?


The 4090's GDDR6X is rated at 21 Gbps; 24 Gbps GDDR6X exists and could be used on a 4090 Ti.


----------



## yzonker

mirkendargen said:


> I need to play with the performance implications of the ECC checkbox that's showed up in NVCP now.


Someone tested that and saw, not surprisingly, a pretty significant performance drop with it on.

Edit: I would test it, but every time I set it and reboot, it's unchecked again.


----------



## Zogge

Guys, each time I reboot and start Windows, my Palit GameRock OC 4090 is not found (unknown display device) in Device Manager, and after 15 sec or so it is found and the driver loads. It works fine otherwise. What could this be?

I reinstalled drivers, no luck. Latest BIOS on the X570 Formula, 64GB CL14 memory, 5950X, PCIe x16 4.0 mode, Win 11 with latest updates.


----------



## ZealotKi11er

mirkendargen said:


> The 4090's GDDR6X is rated at 21 Gbps; 24 Gbps GDDR6X exists and could be used on a 4090 Ti.


You can OC the 4090 +1500, which gives you 24 Gbps already.
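For anyone checking that math: assuming the stock 21 Gbps GDDR6X shows up as 10501 MHz in MSI Afterburner and the offset is applied to that displayed clock (both community-reported figures, not NVIDIA documentation), a +1500 offset does land right at 24 Gbps:

```python
# Sanity-check the "+1500 offset ~= 24 Gbps" claim.
# ASSUMPTIONS: stock displayed memory clock of 10501 MHz (half the 21 Gbps
# effective rate) and a 384-bit bus; both are community-reported 4090 specs.
STOCK_DISPLAYED_MHZ = 10501
BUS_WIDTH_BITS = 384

def effective_gbps(offset_mhz: int) -> float:
    # Double data rate: effective Gbps per pin is twice the displayed clock.
    return (STOCK_DISPLAYED_MHZ + offset_mhz) * 2 / 1000.0

def bandwidth_gbs(offset_mhz: int) -> float:
    # Total bandwidth in GB/s = per-pin rate * bus width / 8 bits per byte.
    return effective_gbps(offset_mhz) * BUS_WIDTH_BITS / 8

print(f"stock: {effective_gbps(0):.1f} Gbps -> {bandwidth_gbs(0):.0f} GB/s")
print(f"+1500: {effective_gbps(1500):.1f} Gbps -> {bandwidth_gbs(1500):.0f} GB/s")
```

That works out to roughly 1008 GB/s stock versus 1152 GB/s at +1500, about a 14% bandwidth bump.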


----------



## originxt

cletus-cassidy said:


> I can't speak to its performance on desktop parts, but I replaced the TIM in my Blade 14 (both on the 5800HX and 3080) and get better temps than high performance paste. More importantly, there is no pump out effect because it's pad. It's also a bear to buy. I have some left and can try on my 4090 when I get my Bykski block (holdover until I get Optimus).











Honeywell PTM7950 SP Super Highly Thermally Conductive PCM Pad


Buy Honeywell PTM7950 SP Super Highly Thermally Conductive PCM Pad for $6.29 with Free Shipping Worldwide (In Stock)




www.moddiy.com





Curious what sizes I should buy. Might be willing to try on my 10980xe or 3090.


----------



## Antsu

ZealotKi11er said:


> You can OC the 4090 +1500, which gives you 24 Gbps already.


Don't want to be rude, but this isn't the Nvidia reddit... Yes, you can OC to match that, but those 24 Gbps chips can also be overclocked, presumably by a similar amount.


----------



## WayWayUp

ZealotKi11er said:


> You can OC the 4090 +1500, which gives you 24 Gbps already.


And that's still a limit.
Look at the 3DMark Port Royal HoF: scores are scaling with memory. We also have users in this thread reporting no improvements in performance until they were able to OC memory even higher, despite higher core clocks.
A 4090 Ti's memory will be better/faster, and let's not pretend it can't be overclocked as well.


----------



## kx11

So the new 7200 MHz RAM sticks actually do something.











I scored 16 116 in Time Spy Extreme

Intel Core i9-12900KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11

www.3dmark.com


----------



## kx11




----------



## yzonker

Zogge said:


> Guys, each time I reboot and start Windows, my Palit GameRock OC 4090 is not found (unknown display device) in Device Manager, and after 15 sec or so it is found and the driver loads. It works fine otherwise. What could this be?
> 
> I reinstalled drivers, no luck. Latest BIOS on the X570 Formula, 64GB CL14 memory, 5950X, PCIe x16 4.0 mode, Win 11 with latest updates.


Did you run DDU in safe mode?


----------



## Luda

yzonker said:


> You know, I haven't seen any performance regression from mem OC like we saw with the 30 series. Performance seems to keep increasing until it artifacts and/or crashes.












it would seem that might be because ECC is off by default?


----------



## zhrooms

Hulk1988 said:


> Will something happen soon? No information on the front page


Well, there's basically nothing to add: all cards seem to be the reference PCB with similar power limits, so it doesn't matter which card you get if you're just gaming.. sad but true. Also stupid expensive.. and no stock. I will add the (very few) cards available when I _feel_ like it; I'll have it done before the 4080 release date at least. Also, the 4080 12GB was cancelled.. this entire 40-series launch has been a disaster, premature for sure.
1. Barely any cards available, both stock and actual models from the few remaining NVIDIA partners now that EVGA is gone (partners didn't release all of their models either, or with extremely low availability on some).
2. Extremely expensive; it's just madness at this point.
3. Long delay on lower-end cards; only the flagship released, with, again, next to no availability.
4. One of the "lower end" cards even outright cancelled.
5. No ATX 3.0 power supplies out (there are a few select ones at some specific stores, but most are already sold out).
6. Who actually needs this flagship card? I certainly don't; it's way too fast for anything at 1440p or 1440p UW, a literal waste of money, and even for 4K it's sketchy now that we have DLSS, and at the same time there are barely any 4K or 4K UW monitors.
7. Coolers are severely oversized; low demand for AIOs and water blocks.
8. Still DP 1.4. That's an actual problem moving forward: HDMI 2.1 is not enough for high-refresh-rate 4K, so be prepared to replace the 4090 at a huge loss in 2 years if you want the new monitors.
9. Not that it matters, but it's silly that it's still PCIe 4.0 on top of still being DP 1.4.

So yeah, excuse me for not making this my top priority. Setting it all up probably takes 6-18 hours (not an hour, if anyone thought that), even if it's mostly copy and paste at this point.
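On the DP 1.4 / HDMI 2.1 point, here's a quick back-of-the-envelope check, assuming uncompressed 10-bit RGB and ignoring blanking overhead (DSC changes the picture entirely, and the effective-throughput figures are the commonly cited post-encoding numbers):

```python
# Raw video bandwidth needed for 4K at various refresh rates, versus the
# effective (post line-coding) throughput of DP 1.4 and HDMI 2.1.
# ASSUMPTIONS: 10-bit RGB (30 bits/pixel), no blanking overhead, no DSC.
DP14_EFFECTIVE_GBPS = 25.92    # 32.4 Gbps raw, 8b/10b coding
HDMI21_EFFECTIVE_GBPS = 42.67  # 48 Gbps raw, 16b/18b coding

def uncompressed_gbps(w: int, h: int, hz: int, bpp: int = 30) -> float:
    return w * h * hz * bpp / 1e9

for hz in (120, 144, 240):
    need = uncompressed_gbps(3840, 2160, hz)
    print(f"4K@{hz}: {need:5.1f} Gbps needed "
          f"(fits DP 1.4: {need <= DP14_EFFECTIVE_GBPS}, "
          f"fits HDMI 2.1: {need <= HDMI21_EFFECTIVE_GBPS})")
```

Under those assumptions, 4K120 and 4K144 squeeze through HDMI 2.1 but not DP 1.4, while uncompressed 4K240 (~60 Gbps) exceeds both, which is the concern raised in point 8.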


----------



## Pietro

Xavier233 said:


> Which cards do you think have the least coilwhine so far?


Gigabyte has had the fewest cases of coil whine so far, Asus the most, but there were also a lot of TUF cards sold, so who knows in the end. So maybe Gigabyte has done a good job with the Gaming OC and Master (if it will fit in your case; seriously, the card is huge, and not only long but also thick, with the biggest width), but I think it's still a lottery.


----------



## mirkendargen

zhrooms said:


> Well, there's basically nothing to add: all cards seem to be the reference PCB with similar power limits, so it doesn't matter which card you get if you're just gaming.. sad but true. Also stupid expensive.. and no stock. I will add the (very few) cards available when I _feel_ like it; I'll have it done before the 4080 release date at least. Also, the 4080 12GB was cancelled.. this entire 40-series launch has been a disaster, premature for sure.
> 1. Barely any cards available, both stock and actual models from the few remaining NVIDIA partners now that EVGA is gone (partners didn't release all of their models either, or with extremely low availability on some).
> 2. Extremely expensive; it's just madness at this point.
> 3. Long delay on lower-end cards; only the flagship released, with, again, next to no availability.
> 4. One of the "lower end" cards even outright cancelled.
> 5. No ATX 3.0 power supplies out (there are a few select ones at some specific stores, but most are already sold out).
> 6. Who actually needs this flagship card? I certainly don't; it's way too fast for anything at 1440p or 1440p UW, a literal waste of money, and even for 4K it's sketchy now that we have DLSS, and at the same time there are barely any 4K or 4K UW monitors.
> 7. Coolers are severely oversized; low demand for AIOs and water blocks.
> 8. Still DP 1.4. That's an actual problem moving forward: HDMI 2.1 is not enough for high-refresh-rate 4K, so be prepared to replace the 4090 at a huge loss in 2 years if you want the new monitors.
> 9. Not that it matters, but it's silly that it's still PCIe 4.0 on top of still being DP 1.4.
> 
> So yeah, excuse me for not making this my top priority. Setting it all up probably takes 6-18 hours (not an hour, if anyone thought that), even if it's mostly copy and paste at this point.
> View attachment 2576374


Actually, almost none of the cards for sale use the reference PCB, to the point that no one's actually sure what the reference PCB even is, lol. Hence the first post should be updated...

Who cares about ATX 3.0 PSUs? Dispelling all the FUD around that is exactly what should be in the first post.

Why make the post if you think the card is silly, don't want one, and don't want to add any info?


----------



## tps3443

Nizzen said:


> 3090 feels like 1050ti compared to 4090 now 🤣


Yeah, but I play at 2560x1440p. A 3090 Kingpin overclocked to the max is plenty fast for playing games.

I get around 12,350 in Time Spy Extreme with my cooling, and I put that down steadily in games. I believe the RTX 4090 is about 50% faster than that, right? But when you're already getting 80-165+ fps in even the most demanding titles, sometimes it's smarter to use your brain and not your wallet.

The RTX4090 sounds fun though, and I will eventually grab one. 

I will let the excitement pass first though.
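As a quick sanity check on that 50% figure, here's the back-of-the-envelope math in Python. Both inputs are just the numbers from this post (my 12,350 baseline and the claimed uplift), not measurements of an actual 4090:

```python
kp_3090_tse = 12350            # my Time Spy Extreme score on the 3090 KP
claimed_uplift = 0.50          # "about 50% faster", going by early reviews

# Projected 4090 Time Spy Extreme score under that assumption
projected_4090 = kp_3090_tse * (1 + claimed_uplift)
print(round(projected_4090))   # 18525
```

So if the 50% figure holds, a 4090 would need to land around 18.5k in TSE to justify the jump.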


----------



## Xavier233

Pietro said:


> Gigabyte has had the fewest cases of coil whine so far and Asus the most, but a lot of TUF cards were sold, so who knows in the end. Maybe Gigabyte did a good job on the Gaming OC and Master (if it fits in your case; the card is seriously huge, not just long but also thick, and the widest of the bunch), but I think that's still a lottery.


If you recall, Gigabyte also had the least coil whine (or actually none) on the 3090 Ti; just their choice of VRMs, IMO, from the looks of it. MSI has some cases of coil whine as well, but yeah, Asus seems to have the most, so I will avoid them this time around.


----------



## yzonker

Luda said:


> View attachment 2576373
> 
> 
> it would seem that might be because ECC is off by default?


Did you try clicking it? On my machine, when I turn it on, it says I need to reboot for it to take effect, but after the reboot it's off again.


----------



## Xavier233

tps3443 said:


> Yeah, I play 2560x1440P though. 3090 Kingpin overclocked out to the max is plenty fast for playing video games.
> 
> I get around 12,350 in Timespy extreme, with my cooling, I put that down in games steadily. I believe the RTX4090 is about 50% faster than that right? But when you are already getting 80-165+ fps in even the most demanding titles. Sometimes it’s smarter to use your brain and not your wallet.
> 
> The RTX4090 sounds fun though, and I will eventually grab one.
> 
> I will let the excitement pass first though.


I agree. The advantages of the 4090 are running cooler and quieter, higher 1% low FPS, and DLSS 3. Again, it all depends on budget, of course.


----------



## Blameless

mirkendargen said:


> 4090 GDDR6X is rated at 21Gbps, 24Gbps GDDR6X exists and could be used on 4090ti's.


That's a ~14% bandwidth increase to support a very similar increase in potential functional units. If the fully enabled AD102 part is clocked higher on the core than the current 4090, it might be more memory bandwidth limited. Then again, it will have more enabled L2 cache as well.


----------



## mirkendargen

Blameless said:


> That's a ~14% bandwidth increase to support a very similar increase in potential functional units. If the fully enabled AD102 part is clocked higher on the core than the current 4090, it might be more memory bandwidth limited. Then again, it will have more enabled L2 cache as well.


The fact that Micron is making it means it's almost certainly going to be used on a 4090 Ti/Super/Titan/whatever, because I think Nvidia is Micron's only GDDR6X customer.

I guess they might also put it on the cards with lower-end chips and narrower buses to partly compensate for the narrower bus, but it's a bad look if, on a spec sheet, a 4070 has faster VRAM than a 4090.


----------



## Blameless

mirkendargen said:


> The fact that Micron is making it means it's almost certainly going to be used on a 4090ti/super/Titan/whatever, because I think Nvidia is the only Micron GDDR6X customer.


I think it's pretty much a given that the 4090 Ti or whatever will use 24Gbps GDDR6X, but it's also extremely likely that the core is still going to get proportionally faster than the memory.



mirkendargen said:


> I guess they might put it in the cards with lower chips and narrower busses to counter the narrower bus somewhat also, but it's a bad look if on a spec sheet a 4070 has faster VRAM than a 4090.


The 4080 16GB has been rumored to have 23Gbps memory (probably with 24Gbps rated ICs) for quite some time and I wouldn't be surprised if this was accurate. It would not be the first time that higher speed memory was put on a lower tier card to deliver the bandwidth needed for the targeted segment.


----------



## zhrooms

mirkendargen said:


> Actually almost none of the cards for sale are the reference PCB, to the point no one's actually sure what the reference PCB is, lol. Hence why the first post should be updated...


The few people I've talked to mentioned briefly that they appear to be ref, but now that you've pointed it out I checked the TPU reviews, and they are clearly not ref. They look similar but vary in the typical custom-PCB fashion, each featuring its brand's special touches like fan, RGB, and LN2 connectors. The FE/Strix/Suprim look like they use the same voltage controller (MP2891), and I'm going to assume they all do, so in a sense they are very close to the same: same single power connector on top of the same voltage controller, with the only difference being the number of power stages, which should be irrelevant as long as you don't push the card on LN2. If you look at the back of the PCB you can clearly see that something like 30% is the same (MLCC only on the FE, but that doesn't matter). Going off this one-minute bit of research: yes, they are "custom" for sure, the entire shape, height, and width are different, along with each card's features (connectors), but they do look a little less custom than what we've seen before. What makes me not want to call them custom is that the VRM is more or less the same; no one cares about two fewer power stages (or well, you shouldn't), and I don't see why you wouldn't be able to flash any BIOS to any card. In that respect they look the same. So I think my point of "buy whichever is cheapest" still stands, for gamers.


mirkendargen said:


> Who cares about ATX 3.0 PSU's? Dispelling all the FUD around that should be exactly what's in the first post.


I care? No clue what people are saying about ATX 3.0; I just want one cable instead of four. Obviously nothing else changes, but it's also "nice" to have the newer ATX spec in general, which includes minor things like improved modern standby for Windows. I recently swapped in a PSU from 2009, so I'm literally on ATX 2.3 right now; it's about time for a change.


mirkendargen said:


> Why make the post if you think the card is silly, don't want one, and don't want to add any info?


I didn't say I think it's silly; it's clearly not "silly": extreme performance, and the most efficient GPU to date thanks to TSMC 5nm.
I didn't say I didn't "want" one; I just refuse to pay the "silly" price, which for a Strix OC is (in Sweden, with 25% VAT): $2476 USD / €2530 EUR / £2192 GBP / $3416 CAD / $3963 AUD.
I didn't say I didn't want to add the data; it takes more time than you think, I just don't see the urgency.


----------



## bmagnien

Guys, Wish.com started selling 4090s, grab them before they sell out:


__ https://twitter.com/i/web/status/1581768300219969537


----------



## Roldo

zhrooms said:


> looks like the FE/Strix/Suprim use the same controller (MP2891), going to go ahead and assume they all use it, so in a sense they are very close to the same, considering they have the same one power connector on top of the same voltage controller, so the only difference is the amount of power stages


Shouldn't assume that.
You took the three cards with probably the best PCBs.
The Gaming OC, GameRock, AMP Extreme, and Vulcan use a uPI uP9512U, and worse power stages too.

Edit:

Gamerock PCB breakdown

TLDW: a big downgrade from the FE, according to him:
"I really don't like it. If you can avoid this PCB, avoid it. It shouldn't fail during the warranty period but if I was spending $1600+ on a GPU this would make me very upset cause...it's bad"


----------



## dante`afk

zhrooms said:


> Well, there's basically nothing to add; all cards seem to be ref PCB with similar power limits, so it doesn't matter which card you get if you're just gaming. Sad but true, also stupidly expensive, and no stock. I will add the (very few) cards available when I _feel_ like it; I'll have it done before the 4080 release date at least. Also, the 4080 12GB was cancelled. This entire 40-series launch has been a disaster, premature for sure.
> 1. Barely any cards available, both stock and actual models by the few NVIDIA partners now that EVGA is gone (partners didn't release all of their models either, or had extremely low availability on some).
> 2. Extremely expensive; it's just madness at this point.
> 3. Long delay on lower-end cards; only the flagship released, with, again, next to no availability.
> 4. One of the "lower end" cards even outright cancelled.
> 5. No ATX 3.0 power supplies out (there are a few select ones at some specific stores, but most are already sold out).
> 6. Who actually needs this flagship card? I certainly don't; it's way too fast for anything at 1440p or 1440p UW, a literal waste of money, and even for 4K it's sketchy now that we have DLSS, while there are barely any 4K monitors, or 4K UW.
> 7. Coolers are severely oversized; low demand for AIO and water blocks.
> 8. Still DP 1.4. That's an actual problem moving forward: HDMI 2.1 is not enough for high-refresh-rate 4K, so be prepared to replace the 4090 at a huge loss in 2 years if you want the new monitors.
> 9. Not that it matters, but it's silly that it's still PCIe 4.0 on top of still being DP 1.4.
> So yeah, excuse me for not making this my top priority; setting it all up probably takes 6-18 hours, not an hour if anyone thought that, even if it's mostly copy and paste at this point.
> View attachment 2576374


no offense, but you shouldn't have opened a thread if you're not in the mood to pamper it.


yes, no one needs a 4090; we're all just a bunch of hardware enthusiasts who want the newest piece of hardware to play around with. once every two years the community gets together like little kids under the Christmas tree for this.


----------



## N19htmare666

yt93900 said:


> Can't even come close to that TUF 4090 Port Royal score, the Waterforce Xtreme is power limited to hell in PR.
>
> Result not found
> www.3dmark.com


Is this because you increased the voltage, or does it not make any difference?
Are you planning to switch the BIOS to raise the power limit?


----------



## WayWayUp

I was hoping Kingpin would go to one of the larger AIBs.


----------



## J7SC

dante`afk said:


> no offense but you shouldn't have opened a thread if you are not in the mood of pampering it.
> 
> 
> yes no one needs a 4090, we're all just a bunch of hardware enthusiasts who want to have the newest piece of hardware to play around with.* once every 2 years the community gets together like little kids below the Christmas tree for this.*


...at times, it can also appear like 'rutting season' in the forest...


----------



## Roldo

bmagnien said:


> Guys Wish.com started selling 4090s grab them before they sell out:
> 
> 
> __ https://twitter.com/i/web/status/1581768300219969537


Leadtek, that takes me back...
(had a Ti 4200 from them, in 2002 I think?)


----------



## bmagnien

Roldo said:


> Leadtek, that takes me back...
> (had a Ti 4200 from them, in 2002 I think?)


Heard Kingpin works for Leadtek now


----------



## Glerox

What clock speeds are attainable on the core and VRAM with the stock coolers on most 4090s?

I usually order the Nvidia flagship on day 1, but this time I'm waiting for AMD's answer. Also, since I'm building an SFF system for my living room, I'm not sure a 450W GPU is the best idea.
I'm curious what the TDP of AMD's flagship will be.


----------



## MikeGR7

Roldo said:


> Shouldn't assume that.
> You took the three cards with probably the best PCB.
> Gaming OC, Gamerock, Amp Extreme, Vulcan use a uPI uP9512U and worse PS too


Just wanted to point out that the use of the uP9512U is actually the best part of those cards: it's a very well-known and well-tested component, and one that could push 1kW BIOSes on the 3000 series.

The problem with those cards is mainly the other components: low phase count, low-quality power stages (quality; the problem is quality, not capacity), low-quality solid capacitors, and low-quality I/O filtering. Which brings me to the other point:

Gigabyte's cards have none of the above problems this round and should not be lumped in with the likes of Zotac and Palit.

I hate to defend them again (because some may get the wrong idea), but the reason I do is that we "destroyed" them in the 3000-series forums (with good reason), and it's unfair not to give credit when a company steps up its product quality.


----------



## Nico67

J7SC said:


> W-cooling will definitely help to unlock performance beyond a certain CPUv / PL for most models. I can just see the day when XOC vbios get leaked into the wild...🥴hopefully, they keep some fail-safe intact


bound to be something for the Strix eventually, and a custom water-cooled card would come into its own then


One thing I don't see people considering is the power-excursion filtering that the better cards might offer. It's probably not an issue around 300-400W, but at 520-550W you might trigger power limiting and need an ATX 3.0 power supply with excursion overhead so it doesn't trip. Nvidia was pretty vocal about how much they improved the FE in this regard.

On the other hand, Asus cards do seem to have a very compact component layout compared to MSI, for example; that may assist in better power delivery, but it's likely to make the VRAM hotter because the VRM is closer. Water cooling will probably be more helpful in those conditions.


----------



## yt93900

N19htmare666 said:


> Is this because you increased the voltage or does it not make any difference?
> Are you planning on switching the bios to raise the power limit?


No difference; at some point it just doesn't scale at all because it's power limited. Worst case, it crashes right at the start, when the frequency peaks way above the stable limit.


----------



## bmagnien

@yzonker how are you going to hold out on me about the rebar trick in PR until the 3090 generation is already over? Who else knew about this?! Just crushed my previous best:








I scored 15 918 in Port Royal

AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 3090 x 1, 32768 MB, 64-bit Windows 11

www.3dmark.com





a bit of a swan song as I await the opportunity to purchase a 4090...


----------



## bottjeremy

Just undervolted my 4090 and I'm kind of floored. Definitely try it: raise the voltage until your card's effective clocks match your core clock, with VSync/G-Sync on.

Max settings, no RT, DLSS Quality, 1440p. Battlefield 2042.
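For anyone dialing this in, here's a rough sketch of the check I mean, in Python. It's purely illustrative: the readings would come from a sensor tool (e.g. HWiNFO's "GPU Effective Clock" vs the requested core clock), and the 15 MHz tolerance is just my assumption, not an official figure.

```python
def clock_stretching(requested_mhz, effective_mhz, tolerance_mhz=15):
    """Return True if the effective clock lags the requested core clock
    by more than the tolerance, i.e. the voltage at this point on the
    V/F curve is too low and the card is "clock stretching"."""
    return (requested_mhz - effective_mhz) > tolerance_mhz

# Hypothetical readings while tuning one point on an undervolt curve:
print(clock_stretching(2745, 2610))  # True: big gap, bump the voltage a notch
print(clock_stretching(2745, 2738))  # False: within tolerance, volts are enough
```

Once the two clocks track each other under load, that voltage point is good; repeat at your target frequency and you get the efficiency without losing real performance.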


----------



## mirkendargen

Roldo said:


> Leadtek, that takes me back...
> (had a Ti 4200 from them, in 2002 I think?)


I had a Leadtek 5900U. Worst GPU I've ever owned lol. I still remember it came with a game called "Big Mother Truckers" too.


----------



## yzonker

bmagnien said:


> @yzonker how are you going to hold out on me about the rebar trick in PR until the 3090 generation is already over? Who else knew about this?! Just crushed my previous best:
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 15 918 in Port Royal
>
> AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 3090 x 1, 32768 MB, 64-bit Windows 11
>
> www.3dmark.com
> 
> 
> 
> 
> 
> a bit of a swan song as I await the opportunity to purchase a 4090...


We talked that stuff to death over in the 3090 thread. You must have fallen asleep during our endless watercooling talk.


----------



## ts2fe

I want to get the Suprim X, but the reports of coil whine are turning me off. Is the Gigabyte Gaming OC generally pretty safe in terms of not having coil whine?


----------



## Xavier233

ts2fe said:


> I want to get the Suprim X but the reports of coil whine is turning me off.. Is the Gigabyte Gaming OC generally pretty safe in terms of not having coil whine?


Can you return whichever one you buy?

I'm willing to wait and grab the 4090 from somewhere with easy returns rather than get one now and not be able to return it due to coil whine.


----------



## mirkendargen

ts2fe said:


> I want to get the Suprim X but the reports of coil whine is turning me off.. Is the Gigabyte Gaming OC generally pretty safe in terms of not having coil whine?


IMO it's tough to tell what the coil-whine situation is like without a waterblock; coil whine you could hear over the fans would be pretty damn extreme, and I certainly can't hear anything over the fan on my Gaming OC.

I just tried blasting the fans at 100% to cool the card down to room temperature, setting them to 0%, and running FurMark for a few seconds before it heat-soaked the cooler. I couldn't hear any coil whine.


----------



## LunaP

Here's hoping for more stock updates tomorrow. I keep seeing the card fluctuate between unavailable/sold out and "not available in your area", so at least it FEELS promising lol.


----------



## lordkahless

I think it's interesting that the FE on Best Buy always says "Unavailable Nearby". That tells me stock exists but isn't being released; it might be withheld for the GeForce Experience sweepstakes.


----------



## LunaP

lordkahless said:


> I think its interesting that the FE on Best Buy always says Unavailable Nearby. That tells me that stock exists but its not being released. It might be withheld for the Geforce Experience Sweepstakes.


I'm in Phx, so I'm hoping at least one store gets stock since it's big enough. No messages yet; unsure if I have to leave GFE running or if it has a background service listening. I'm uninstalling it the moment I get one, though.


----------



## joneffingvo

ts2fe said:


> I want to get the Suprim X but the reports of coil whine is turning me off.. Is the Gigabyte Gaming OC generally pretty safe in terms of not having coil whine?


Mine works like a dream, and I can't hear it at all, even when the fans are spinning.


----------



## Xavier233

joneffingvo said:


> Mine works like a dream and I can’t hear it at all even when fans are spinning


What about under load? Say GPU temp of 60C?


----------



## joneffingvo

Xavier233 said:


> What about under load? Say GPU temp of 60C?


Mine stays around 50-55°C under load; my case fans are louder than the actual GPU fans.


----------



## zhrooms

Roldo said:


> Shouldn't assume that.
> You took the three cards with probably the best PCB.
> Gaming OC, Gamerock, Amp Extreme, Vulcan use a uPI uP9512U and worse PS too


Unlucky, I just clicked the first three random cards I saw on TPU.



Roldo said:


> Gamerock PCB breakdown, big downgrade from FE
> "I really don't like it. If you can avoid this PCB, avoid it. It shouldn't fail during the warranty period but if I was spending $1600+ on a GPU this would make me very upset cause...it's bad"


Well, that's the one issue I have with BZ (and many others who make the same mistake): they generalize without realizing it, and there are plenty of people out there who can't think for themselves or grasp the underlying meaning. There are several conflicting things to point out in what you just quoted:

"*I* really don't like it."
= "I" as in BZ.. as far as I know, he barely plays games (if at all) and doesn't buy expensive hardware himself (mostly it's stuff sent to him); he essentially just talks about the hardware and does some overclocking on it. Correct me if I'm wrong, but that's what he literally said himself late last year, when I last watched his videos during the 12th-Gen release. Point is, his opinion (and point of view) is *unique* and should not matter to you; it certainly does not to me, since I use my computer in a completely different way than he does (I benchmark, overclock, and play games).

"If *you* can avoid this PCB, avoid it."
= "You" as in ME.. as I'm watching the video, he's talking directly to me, and his reasons for avoiding it have no legs to stand on; to me, these criticisms have no impact on anything I do with my computer, including overclocking. See specifics below.

"It *shouldn't* fail during the warranty period"
= "shouldn't" as in.. there's enough of a risk of it happening that he has to use the word "shouldn't". Jesus christ, it sounds like the card should be returned as defective after you buy it even though it works. Obviously it's not going to die within 2-3 years; that's absurd. I would personally say, for obvious reasons, "It *will not* fail during the warranty period." That's a significantly more accurate statement, and it won't scare people away from a perfectly good flagship GPU that, no matter the model, will likely last a decade if not far longer.

"but if *I* was spending $1600+ on a GPU this would make *me* very upset"
= "I/me" as in BZ.. uhm, am I supposed to care if he gets "very upset"? Why would I get "very upset" buying a $1600 GPU with the highest efficiency ever in a GPU (TSMC 5nm), an oversized cooler making it run ridiculously cool for the power it consumes (if you choose to push it), and the largest performance jump since the 1080 Ti, delivering 70% or more over a 3090? I have nothing to be upset about at all. A slightly different power delivery will not make any difference whatsoever; if no one told you what VRM your card had, you would have no way of knowing. That's how little it really matters.

"cause...it's *bad*"
= "bad" as in bad to him, as a PC enthusiast and overclocker with ridiculously high standards that far exceed most of ours, and especially most gamers' (he has his vision of a perfect VRM). So the word "bad" here means something completely different to me and my standards. If I buy a Palit card with the uP9512U, I will enjoy overclocking it just as much as I would a Strix; there'd be *no difference*. The software is the same, the overclocking potential is the same, it's the same GPU and VRAM, just a slightly different power delivery, and that's not going to matter for casual overclocking, especially on the stock air cooler. So what he means by "bad" is that it's not optimal for his vision of what a 4090's power delivery should look like; that's it. He used the word "bad", but it doesn't mean the VRM is bad in a general sense, not at all. I can obviously see that, but enough people out there can't; they go on tirades calling the VRM "bad", making it sound like it won't last the 2-3 year warranty period (which he literally also happened to hint at), which is borderline insane to believe. Just think about it for two seconds: NVIDIA and its partners would intentionally create a mass-market graphics card, the most expensive one yet, that wouldn't last the warranty period? They'd have hundreds of thousands to millions of people returning their cards, getting new ones back for free, or even their money back? Is that what I'm supposed to believe? Just no. If you buy the card with the "worst" PCB and VRM, there is absolutely no reason to think it won't last 10 years; from that standpoint the VRM is "excellent". But that is not what BZ is saying, and that is the issue.

So, I highly recommend you listen to and watch his videos, as they contain very useful data, but I don't recommend you listen to his opinion, as his standards and needs are likely very different from yours. (Here's his channel: Actually Hardcore Overclocking on YouTube)



dante`afk said:


> no offense but you shouldn't have opened a thread if you are not in the mood of pampering it.


Eh, it's easy for you to say that when you haven't put more than a hundred hours into it and then need to put another, what, 6-18 hours into it without getting anything in return. I do these threads.. I did.. elevate these threads far above what anyone else had done before me, because I saw a need and felt like helping out; as the saying goes, "I did this out of the kindness of my heart".
Telling me to "hurry up" because I happened to create the thread.. I am offended. You are welcome to pay me to speed up the process if that's what you want. Or, at the very least, start by finding PCB data on every card, including images, and DM it to me; just because I started the thread doesn't mean the community can't help gather information on the latest flagship models.



dante`afk said:


> yes no one needs a 4090, we're all just a bunch of hardware enthusiasts


Eh? Lots of people need a 4090, including me; even people still running 1080p monitors "need" it, since they can do 4K DSR + DLSS at 144Hz for perfect image quality. And no, the vast majority of people here are not hardware enthusiasts, they are gamers. The 2080 Ti thread blew up to millions of views because of the affordability of the 2080 Ti, along with RTX coming out; an enormous number of gamers sought guidance on overclocking for more gaming performance. Most of the people here right now still talk about basic overclocking, and almost no one is running water (or talks about doing it). Back on Turing the model variance was absolutely wild: a stupid number of models, and a 100W power-limit difference was common, which absolutely decided whether overclocking was worth it, especially on AIO or water. Those days are over. On the 3090 we got far higher power limits, and now we have a new single connector that supports up to 600W while the cards average far lower consumption than that (making it matter less whether you have 500 or 600). Either way, it's absolutely clear that the need for guidance when selecting a 4090 is far lower than on previous architectures and flagships; it's safe to say your choice of model matters the least of any flagship in the past decade.



dante`afk said:


> who want to have the newest piece of hardware to play around with. once every 2 years the community gets together like little kids below the Christmas tree for this.


That I completely agree with. Too bad Christmas was cancelled this time because of the ridiculous prices (and availability); no presents under the tree this year, including for me. I refuse to pay that much, and the card will be replaced by a 4090 Ti soon enough anyway; it will also depreciate to around half its value in two years, so rest in peace, your wallet. The next flagship will probably be $1800. Let's go!


----------



## LukeOverHere

zhrooms said:


> Unlucky, I just clicked the first three random cards I saw on TPU.
> 
> 
> Well, that's the one issue I have with BZ (and many others who make the same mistake), they generalize without realizing it, there are so many people out there that can't think for themselves or grasp the underlying meaning, there are several conflicting things to point out in what you just quoted:
> 
> "*I* really don't like it."
> = "I" as in BZ.. as far as I know he barely or do not play games, and do not buy expensive hardware himself (mostly stuff being sent to him), he essentially just talks about them and do some overclocking (on the stuff he gets sent), correct me if I'm wrong but that's what he literally said himself late last year when I last watched his videos during the 12th Gen release, point is, his opinion (and point of view) is *unique *and should not matter to you, it certainly does not for me, I use my computer in a completely different way than he does (I benchmark, overclock and play games).
> 
> "If *you* can avoid this PCB, avoid it."
> = "You" as in ME.. as I'm watching the video, he's talking directly to me, his reasons for avoiding it has no legs to stand on, to me, these criticisms has no impact on anything I do with my computer, including overclocking, see specifics below.
> 
> "It *shouldn't* fail during the warranty period"
> = "shouldn't" as in.. there's enough of a risk of it happening that he has to use the word "shouldn't", like jesus christ, it sounds like the card should be returned as defective after you buy it even though it works, obviously it's not going to die within 2-3 years, that's absurd, I would personally say for obvious reasons "It *will not* fail during the warranty period", this is a significantly more accurate statement and will not scare away people from going for a perfectly good flagship GPU that no matter the model, will last likely last a decade if not far longer.
> 
> "but if *I* was spending $1600+ on a GPU this would make *me* very upset"
> = "I/me" as in BZ.. uhm, am I supposed to care if he gets "very upset"? Why would I get "very upset" buying a $1600 GPU that had the highest efficiency ever in a GPU, TSMC 5nm with an oversized cooler, making it run ridiculously cool at the power it consumes (if you choose to push it), largest performance jump since 1080 Ti, delivering me 70% or higher performance than a 3090? I have nothing to be upset about at all, a slightly different power delivery will not make any difference what so ever, if no one told you what VRM your card had you would have no way of knowing, that's how little it really matters.
> 
> "cause...it's *bad*"
> = "bad" as in bad to him.. as a PC enthusiast and overclocker with ridiculously high standards that far exceed most of ours (he has his vision of a perfect VRM), and especially gamers. So the word "bad" here means something completely different to me and my standards. If I buy a Palit card with uP9512U, I will enjoy overclocking it just as much as I would on a Strix, there'd be *no difference*, the software is the same, the overclocking potential is the same, it's the same GPU and VRAM, just slightly different power delivery, it's not going to make any difference casually overclocking, especially on the stock air cooler. So what he means by bad, is that it's not optimal to his vision of what a power delivery should look like on a 4090, that's it, he used the word bad, but it doesn't mean in a general sense that the VRM is bad, not at all. I can obviously see that, but there are enough people out there that can't, that verbatim go on tirades saying that stuff, calling the VRM "bad", making it sound like it will not last the 2-3 year warranty period (which he literally also happened to hint at..), which is borderline insane to believe, like just think about it for 2 seconds, so NVIDIA and its partners would intentionally create a graphics card for the "mass market" that would be extremely expensive (most expensive one yet) that.. wouldn't last the warranty period? So they'd have hundreds of thousands to millions of people return their graphics cards, getting new ones back for free, or even their money back, is that what I'm supposed to believe? Just no, if you buy the card with the "worst" PCB and VRM, there is absolutely no reason to think that it won't last 10 years, the VRM from that standpoint is "excellent" (it lasting a decade). But that is not what BZ is saying, and that is the issue.
> 
> So, I highly recommend you to listen and watch his videos, as it contains very useful data, but I don't recommend you listen to his opinion, as his and your standards and needs are likely very different. (Here's his channel: Actually Hardcore Overclocking - YouTube)
> 
> 
> Eh, well it's easy for you to say that when you haven't put more than a hundred hours into it and need to put another what, 6-18 hours into it, without getting anything in return, I do these threads.. I did.. elevate these threads far and above what anyone else had done before me, because I saw a need for it and I felt like helping out, as the saying goes; "I did this out of the kindness of my heart".
> Telling me to "hurry up" because I happened to create the thread.. I am offended. You are welcome to pay me to speed up the process if that's what you want. Or at the very least, start by finding PCB data on every card, including images, and DM it to me, just because I started the thread doesn't mean the "community" can't help gather the information on the latest flagship models.
> 
> 
> Eh? Lots of people need a 4090, including me, even people still running 1080p monitors "need it" as they can do 4K DSR DLSS 144Hz for perfect image quality. And no, the vast majority of people here are not hardware enthusiasts, they are gamers. The reason for the 2080 Ti thread blowing up to millions of views was because of the affordability of the 2080 Ti, along with RTX coming out, there were an enormous amount of gamers who sought guidance overclocking for further gaming performance, the people here right now, most of them still talk about basic overclocking, almost no one is running water (or talk about doing it). Back on Turing the model variance was absolutely wild, stupid amount of models and 100 power limit difference was common, that absolutely decided if it was worth overclocking or not, especially on AIO or Water, those days are over, on 3090 we got far higher power limits and now we have a new single connector that supports up to 600W, while the cards average far lower power consumption than that (making it matter less if you have 500 or 600), either way it's absolutely clear that the need for guidance when selecting a 4090 is far lower than on previous architectures and flagships, it's safe to say that which one you choose, matter the least out of all the flagships in the past decade.
> 
> 
> That I completely agree on, too bad this time, Christmas was cancelled.. because of the ridiculous prices (and availability). No presents under the tree this year, including for me, I refuse to pay that much, and it's going to be replaced with a 4090 Ti soon enough, it will also depreciate in value to around half in two years, so rest in peace your wallet. Next flagship will probably be $1800, let's go!


Changing the topic as it's getting heavily off-track: your efforts are 100% noticed by many people on the forum, you just don't get enough appreciation, which sucks. I for one made much use of the 3080 content you maintained, it was extremely valuable, and I subscribed to this exact post because of it; in fact, I signed up to this site because of your posts. So don't feel unappreciated, we just need to do a better job of recognizing that certain people put hundreds of hours into some of these posts. Let's not get toxic, people, this is supposed to be fun and informative.


----------



## Roldo

zhrooms said:


> stuff


Just to say, my editing my post to add the GameRock PCB breakdown had nothing to do with your post. It wasn't intended as a follow-up to my reply.
I just didn't want to post twice in a row.


----------



## smushroomed

J7SC said:


> Yes - I already had a riser PCIe 4.0 cable in use for the 3090 Strix, just transferred it & it works great with the 4090 (PCIe. 4.0 X16). I use the type of riser in the spoiler (have two of those, from Amazon last year).
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2576244


I assume the right angle is the one I want for a "normal" setup, simply moving the card vertical?

Also I'm reading 15cm is good, do you agree?


----------



## Arizor

smushroomed said:


> I assume the right angle is the one I want for a "normal" setup, simply moving the card vertical?
> 
> Also I'm reading 15cm is good, do you agree?


Yep, I also bought this for my 3090 Strix and transferred it to my 4090, works a charm. 15cm is the one.


----------



## statman28

Does anyone know of a longer support bracket for the 4090? I have a Caselabs SMA8 case and need 210mm (8.2in) of height. I can't seem to find one that long.


----------



## Jordyn

Lego? My case didn't have the right alignment for the bracket that came with the Gaming OC, so this is my current workaround.


----------



## DokoBG

That is why I like the support bracket of the MSI Gaming Trio; it screws to the rear ports of the case.


----------



## J7SC

smushroomed said:


> I assume the right angle is the one I want for a "normal" setup, simply moving the card vertical?
> 
> Also I'm reading 15cm is good, do you agree?


I only know the 20 cm ones (work perfectly, including on the 4090) and had run tests with normal (mobo) mount and these (2x) risers - no difference in scores . You can get them in other angle(s), sizes, and in black of course. I used the exact one I showed for 2x vertical mounts in a TT Core P8.

---

Starting to understand clocks on my 4090 Gigabyte Gaming OC a bit better. The air-cooler is good for normal ops and gaming, but it desperately needs water-cooling for extended benching sessions - I keep on bumping into max Hotspot temps before my known max clocks at 1.1V. I did not use the 'full power' option in the NV tab yet and let it downclock since Hotspot temps are the issue already, but once w-cooled, I'll try. 5950X is ok for Port Royal, but DDR5 and IPC improvements would help...still, 21st (last time I checked - it changes so quickly ) at 3DM Port Royal HoF.


----------



## 8472

He got access to a 13900K and ran Time Spy and Firestrike.


----------



## Shaded War

bottjeremy said:


> Just undervolted my 4090 and I'm kind of floored. Definitely try. Increase volts until your card's effective clocks match your core clock. VSync/Gsync on.
> 
> Max settings, no RT, DLSS Quality, 1440P. Battlefield 2042.
> View attachment 2576393


Two sources I've seen so far have shown you don't want to undervolt; they showed lower FPS even at the same clock. 

That 48C looks insane though, can't wait to see how my Trio does. Regret not going with a faster shipping method, but I was trying to check out as fast as possible and didn't change from the default.


----------



## jcue123

Does anyone know what the PNY 4090 PCB consists of? From the pictures on the Best Buy page it looks similar to the Palit PCB. Just wondering if it would be worth it to return the GPU when it gets here.


----------



## statman28

Jordyn said:


> Lego? My case didnt have the right alignment for the the bracket that came with the Gaming OC so this is my current workaround.
> 
> View attachment 2576415


Awesome! Did not think of that. Cheers


----------



## Roldo

jcue123 said:


> does anyone know what the PNY 4090 PCB consists of. From the pictures on the best buy page it looks similar to the palit pcb. just wondering if it woudl be worth it to return the gpu when it gets here.


I think you can expect entry-level to lower-midrange cards to use some variation of that PCB:
50/55A DrMOS (some will have more, others fewer; I think the Windforce even uses 14?), uP9512U controller, etc.


----------



## ReightNineSeven

Having some odd behavior with my 4090 Suprim Liquid X. No matter what game I test, performance seems to be noticeably lower than all benchmarks I am seeing online. 40 - 60 FPS in Control with DLSS off at 4k. GPU Usage consistently stays around 99%, so it is definitely using the resources available.

I noticed that my Wattage seems to be hard capped at 350w. All videos I see online show it hitting above 400 - 450 in the same scenarios. My performance also seems to match the 350w setting shown here:

__ https://twitter.com/i/web/status/1579842910086123521
Has anyone run into this? I've verified all my PCI-E Power Cables are seated properly, reseated them, and reinstalled drivers. Regardless of this troubleshooting, I can't seem to get this to go above 350w.


----------



## StreaMRoLLeR

ts2fe said:


> I want to get the Suprim X but the reports of coil whine is turning me off.. Is the Gigabyte Gaming OC generally pretty safe in terms of not having coil whine?


Yeah, kinda disappointed. Also saw 3 users in YT comments saying their Suprim X Liquid and Air were coil whining. My friend has a 4090 TUF non-OC that holds 3100 with zero coil whine. How many in this thread experience coil whine on the Suprim?


----------



## Arizor

ReightNineSeven said:


> Has anyone run into this? I've verified all my PCI-E Power Cables are seated properly, reseated them, and reinstalled drivers. Regardless of this troubleshooting, I can't seem to get this to go above 350w.


Sounds like an adapter issue, as you suspect. Maybe grab one of the PSU cables from CableMod or similar to go direct from PSU to GPU?


----------



## RaMsiTo

Streamroller said:


> Yeah kinda disappointed. Also saw 3x user from YT comments that their SpX Liquid and Air were coil whining. My friend have 4090 TUF non OC with 3100 hold and zero coil whine. How many in this thread experience cw for suprim ?


0 coil suprim x , good clock.


----------



## derthballs

I'm undervolting. I get the same PR scores with a -160 offset and then up to 2850 on core for boost at 1V (950mV crashed) + 1000 mem as I did overclocking at +125/800 without undervolting. Plus, I kept hitting power limits with the manual overclock, whereas the undervolt never hits power limits now.


----------



## pewpewlazer

Shaded War said:


> Two sources I'v seen so far have shown you don't want to undervolt. Showed lower FPS even with the same clock.
> 
> That 48c looks insane though, cant wait to see how my Trio does. Regret not going faster shipping method, but I was trying to check out as fast as possible and didn't change from default.


I'm assuming one "source" is that Optimum Tech doofus... what's the second source? Hopefully not another clickbait titled YouTube video from someone who doesn't even know how to undervolt correctly.


----------



## 8472

__ https://twitter.com/i/web/status/1581907952193019904


----------



## Nizzen

ZealotKi11er said:


> RDNA3 will be way more fun to OC with real gain from good cooling.
> 
> 
> So even more pointless to volt mod the card. I can see the best silicon to get maybe 3150-3200 with 1.2v but no more. Also all this at probably 700-800w.


So you have RDNA3?


----------



## Nd4spdvn

Streamroller said:


> How many in this thread experience cw for suprim ?


Did not yet hear cw on my Suprim X, but mostly benched with higher fans rpms so far. Mine is an open bench build, no case.


----------



## Madness11

Why all blame palit ?? I got one , can beat 28k at port royal , core 3030-2985 . Memory 12000 , temps around 64 on auto .


----------



## StreaMRoLLeR

RaMsiTo said:


> 0 coil suprim x , good clock.
> 
> View attachment 2576430


That is amazing. Gratz friend. Whats your fan rpm ?


----------



## StreaMRoLLeR

Madness11 said:


> Why all blame palit ?? I got one , can beat 28k at port royal , core 3030-2985 . Memory 12000 , temps around 64 on auto .


Did you watch this video ?


----------



## Carillo

Madness11 said:


> Why all blame palit ?? I got one , can beat 28k at port royal , core 3030-2985 . Memory 12000 , temps around 64 on auto .


As long as your card works and you’re happy , it doesn’t matter what Buildzoid or anyone else says


----------



## spcysls

Does anyone know where to hook up the EVC2 to the Zotac Amp Extreme?


----------



## RaMsiTo

Streamroller said:


> That is amazing. Gratz friend. Whats your fan rpm ?














I need a bios with 600w.... 22k graphics score!!









I scored 15 005 in Time Spy Extreme


Intel Core i9-9900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## StreaMRoLLeR

RaMsiTo said:


> View attachment 2576439
> 
> 
> 
> 
> I need a bios with 600w.... 22k graphics score!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 15 005 in Time Spy Extreme
> 
> 
> Intel Core i9-9900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> View attachment 2576440


Jesus, you are the TOP 2 place owner xD. Don't waste your time here and go play the lottery ASAP. Your internal clocks must be through the roof. Dat 29.5k score.


----------



## Carillo

if anyone wonders , the TUF fits in Torrent Compact case 🤣


----------



## J7SC

Carillo said:


> if anyone wonders , the TUF fits in Torrent Compact case 🤣
> 
> View attachment 2576442
> 
> View attachment 2576441


...more like the Torrent Compact fits the Tuf  

---

...up to an even 1500 MHz VRAM now - score still went up slightly, so a bit more testing tomorrow with smaller increments; VRAM tuning is time consuming. 
I really look forward to a water block for my G-Gaming OC; loop is otherwise ready to go


----------



## morph.

Some quick 3d mark results:








I scored 34 588 in Time Spy


Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com













I scored 17 538 in Time Spy Extreme


Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com













I scored 27 942 in Port Royal


Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com













I scored 10 790 in Speed Way


Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## yzonker

J7SC said:


> ...more like the Torrent Compact fits the Tuf
> 
> ---
> 
> ...up to an even 1500 MHz VRAM now - score still went up slightly, so a bit more testing tomorrow with smaller increments; VRAM tuning is time consuming.
> I really look forward to a water block for my G-Gaming OC; loop is otherwise ready to go


You're losing the most by being 10-15C warmer than a lot of the scores that have been posted.


----------



## Gilgam3sh

I'm waiting to get my PNY 4090 delivered, anyone here using it and have some feedback on it? thanks 






GeForce RTX 4090 24GB XLR8 VERTO EPIC-X RGB Triple Fan | pny.com


The GeForce RTX 4090 24GB XLR8 GPU (powered by NVIDIA Ada Lovelace) features enhanced RT and Tensor Cores, 8K HDR, & the world's fastest G6X memory.




www.pny.com


----------



## YouKnowSedri

One quick question: is there any difference in PL between the Asus TUF and the TUF OC?


----------



## morph.

YouKnowSedri said:


> One quick qestion, there is any diff in PL with Asus Tuf vs Tuf oc?


Just the peak boost core clock; essentially, a vBIOS flash or manual OC should equal or surpass that clock speed.


----------



## Carillo

My TUF OC is maxing out [email protected] PR with fans ramped up. Not a mhz more


----------



## ZealotKi11er

8472 said:


> He got access to a 13900k and ran time spy and firestrike.


Nothing burger. Maybe it will help me, since I don't OC with the 12900K.


----------



## Glottis

Carillo said:


> My TUF OC is maxing out [email protected] PR with fans ramped up. Not a mhz more


Stock voltage?


----------



## bottjeremy

Shaded War said:


> Two sources I'v seen so far have shown you don't want to undervolt. Showed lower FPS even with the same clock.
> 
> That 48c looks insane though, cant wait to see how my Trio does. Regret not going faster shipping method, but I was trying to check out as fast as possible and didn't change from default.


Performance degradation occurs when you don't have enough voltage to cover the clocks you are asking for. You can verify your undervolt is valid by looking at HWiNFO64 and checking that your effective clocks meet your clock rate. To resolve any effective-clock shortfall, start incrementally bumping your _Voltage, mV_ and then re-verify. I wanted my 4090 to have a game clock of 2700MHz, so these were my settings: +248 @ 925mV.



















Don't read past this line unless you need an undervolt tutorial:
*_*
My process to get there.
Step 1:
Pull core clock slider all the way to left to drop the curve. Click the checkbox to confirm.








Step 2:
Drag up the curve to desired frequency.








Step 3:
Click the check button and then test that effective clocks match your desired frequency.


----------



## Carillo

Glottis said:


> Stock voltage?


1100mV


----------



## Xavier233

joneffingvo said:


> mine stays around 50-55c under load my case fans are louder than the actual gpu fans





Carillo said:


> As long as your card works and you’re happy , it doesn’t matter what Buildzoid or anyone else says


Uhhh, yeah, it matters. If the components on the PCB are not up to par for a 4090, ignoring that while paying a high price is not really the best idea in the world when there are other cards with better components.


----------



## Xavier233

Carillo said:


> if anyone wonders , the TUF fits in Torrent Compact case 🤣
> 
> View attachment 2576442
> 
> View attachment 2576441


Any coil whine on your TUF?


----------



## Panchovix

Comparison table of some VRMs, took from reddit (here

__
https://www.reddit.com/r/nvidia/comments/y5x5pg/_/isn62q2
)


| Card | GPU Phases | GPU Controller | GPU MOS | MOS Rating (A) | Total GPU Current (A) | Mem Phases | Mem Controller | Mem MOS | MOS Rating (A) | Total Mem Current (A) | Total Phases | Source |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| MSI Suprim X (& Liquid) | 26 | Monolithic MP2891 | Monolithic MP86957 | 70 | 1820 | 4 | Monolithic MP2891 | Monolithic MP86957 | 70 | 280 | 30 | Techpowerup |
| ASUS Strix OC | 24 | Monolithic MP2891 | OnSemi FDMF3170 | 70 | 1680 | 4 | Monolithic MP2891 | OnSemi FDMF3170 | 70 | 280 | 28 | Techpowerup |
| Nvidia Founders | 20 | Monolithic MP2891 | Monolithic MP86957 | 70 | 1400 | 3 | Monolithic MP2891 | Monolithic MP86957 | 70 | 210 | 23 | Techpowerup |
| Colorful Vulcan OC-V | 24 | uPI uP9512U | Alpha & Omega AOZ5311NQI-03 BLN3 | 55 | 1320 | 4 | uPI uP9512R | Alpha & Omega AOZ5311NQI-03 BLN3 | 55 | 220 | 28 | Techpowerup |
| Zotac AIRO | 24 | uPI uP9512U | Alpha & Omega AOZ5311NQI-03 BLN3 | 55 | 1320 | 4 | uPI uP9512R | Alpha & Omega AOZ5311NQI BLN0 | 50 | 200 | 28 | Techpowerup |
| ASUS TUF OC | 18 | Monolithic MP2888A | Infineon TDA21570 | 70 | 1260 | 4 | uPI uP9512Q | Vishay SIC638A | 50 | 200 | 22 | Hardware Unboxed |
| Gigabyte Gaming OC | 20 | uPI uP9512U | Vishay SIC653A | 50 | 1000 | 4 | uPI uP9512R | Vishay SIC653A | 50 | 200 | 24 | Techpowerup |
| MSI Gaming Trio X | 18 | Monolithic MP2891 | OnSemi NCP303151 | 50 | 900 | 4 | Monolithic MP2891 | OnSemi NCP303151 | 50 | 200 | 22 | CoolPC |
| Palit GameRock OC | 16 | uPI uP9512U | OnSemi NCP302150 | 50 | 800 | 3 | uPI uP9512R | OnSemi NCP302150 | 50 | 150 | 19 | Techpowerup |
| Zotac Trinity OC | 14 | uPI uP9512U | Alpha & Omega AOZ5311NQI-03 BLN3 | 55 | 770 | 4 | uPI uP9512R | Alpha & Omega AOZ5311NQI BLN0 | 50 | 200 | 18 | CoolPC |
| Gigabyte Windforce | 14 | ? | Vishay SIC653A | 50 | 700 | 4 | ? | Vishay SIC653A | 50 | 200 | 18 | profesionalreview |

At the same power load, the less capacity (number of phases × DrMOS current rating) a card has, the more heat its VRM generates. Cards at the top of this list have the most power capacity; cards at the bottom have the least.

I'm planning to get a 4090 in the next months, so I hope this also helps someone to choose a card
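As a rough sanity check on the "Total GPU Current" column, here's a minimal sketch of the phases × rating arithmetic. The card subset is transcribed from the GPU-rail columns of the table above, and the simple product is only a headline figure: datasheet ratings don't capture what each stage actually delivers.

```python
# Naive VRM capacity comparison: total GPU-rail current ~= phases * per-stage rating.
# GPU-rail figures transcribed from the table above (datasheet ratings, not measured).
cards = {
    "MSI Suprim X":       (26, 70),
    "ASUS Strix OC":      (24, 70),
    "NVIDIA Founders":    (20, 70),
    "ASUS TUF OC":        (18, 70),
    "Gigabyte Gaming OC": (20, 50),
    "Palit GameRock OC":  (16, 50),
    "Gigabyte Windforce": (14, 50),
}

def total_current(phases: int, amps_per_stage: int) -> int:
    """Headline capacity estimate: phase count times per-stage current rating."""
    return phases * amps_per_stage

# Rank from most to least headroom; at the same board power, less headroom
# means each stage carries more current and runs hotter.
ranked = sorted(cards.items(), key=lambda kv: total_current(*kv[1]), reverse=True)
for name, (phases, amps) in ranked:
    print(f"{name:20s} {phases:2d} x {amps}A = {total_current(phases, amps)} A")
```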


----------



## Azulath

Azulath said:


> Similar to you, I have issues with my Gigabyte X570 Aorus Master coupled with a MSI RTX 4090 Suprim X. My 3090 Strix works without any issues, but when I put my 4090 into the slot I don't get any display output, the fans on the card do not spin up and the lights of the card do not go on.
> 
> I have tested the card with both a 1KW PSU and a 1.6KW PSU and I have used both the 12VHPWR (12+4) dongle that came with the card and the 12VHPWR (12+2) cable that came with the PSU. (PSU cables are interchangeable given that both are Seasonic Prime.)
> 
> Just to be certain, I have also updated my Mainboard's BIOS to the latest version (both, stable and beta) but the problem persists. Furthermore, I have tried a CMOS reset for good measure but to no avail...
> 
> My problem seems similar to Jay's, but he does neither mention if the fans of the card spin up, nor if RGB stays off entirely, which is why I'm not entirely sure on this.
> 
> 
> 
> 
> 
> Any help or input would be appreciated!


There is another video on the Strix situation which explains everything in more detail:





So, given that his Strix' fans spin up and the RGB goes on, while my Suprim X does not react in any way, I suppose the problem is different and my 4090 is just broken...


----------



## PisaroOne

Azulath said:


> There is another video on the Strix situation which explains everything in more detail:
> 
> 
> 
> 
> 
> So, given that his Strix' fans spin up and the RGB goes on, while my Suprim X does not react in any way, I suppose the problem is different and my 4090 is just broken...


I have the same problem with my Gainward RTX 4090. No fan spinning, only rainbow lights and a red VGA LED... MB is a Gigabyte Aorus Pro B550.


----------



## bmagnien

Panchovix said:


> Comparison table of some VRMs, took from reddit (here
> 
> __
> https://www.reddit.com/r/nvidia/comments/y5x5pg/_/isn62q2
> )
> 
> 
> *(VRM comparison table snipped; see the post above)*
> 
> At the same power load, the less capacity (# of phases * DrMOS Current) a card has, the more heat it generates. Cards at the top of this list have the most power capacity, and cards at the bottom of this list have the least power capacity.
> 
> I'm planning to get a 4090 in the next months, so I hope this also helps someone to choose a card


One note about this: it's not as simple as multiplying the number of phases by the MOSFET's current rating, as the actual deliverable current may not match the stated rating (Buildzoid expands on this in his FE breakdown). So the 70A OnSemi FDMF3170 on the Strix may actually deliver more than the 70A Monolithic MP86957 on the FE and Suprim. So take the "Total GPU Current" ranking with a grain of salt.


----------



## bmagnien

Is it correct that all the MSIs came without a 600w bios option, and with 3x1 adapters instead of 4? If not, which MSI models have 600w bios/4x1 adapters?


----------



## ZealotKi11er

Nizzen said:


> So you have rdna3?


Maybe I do.

I just see it comparing RTX 30 OC vs RDNA2. My 6900 XT can OC from an average game clock of ~2300MHz to ~2900MHz; my 3080 Ti only went from 1900MHz to 2100MHz.
Overclocking GeForce is pointless.


----------



## RaMsiTo

bmagnien said:


> Is it correct that all the MSIs came without a 600w bios option, and with 3x1 adapters instead of 4? If not, which MSI models have 600w bios/4x1 adapters?


suprim x 4x1 520w
suprim x liquid 4x1 530w
msi trio 3x1


----------



## bmagnien

RaMsiTo said:


> suprim x 4x1 520w
> suprim x liquid 4x1 530w
> msi trio 3x1


That's what I thought. Thanks! So odd that the Suprim X seems to have the most overbuilt PCB but launched with one of the lowest max power limits.


----------



## Azulath

PisaroOne said:


> I have the same problem with my Gainward RTX 4090. No fan spinning, only rainbow lights and a red VGA LED... MB is a Gigabyte Aorus Pro B550.


Well, this does seem more in line with Jay's problem than mine. At least I'm assuming that, since the card does not even light up.

However, I'm curious if this is some issue with the Gigabyte MB; we are both on AM4, after all...


----------



## MikeGR7

Panchovix said:


> Comparison table of some VRMs, took from reddit (here
> 
> __
> https://www.reddit.com/r/nvidia/comments/y5x5pg/_/isn62q2
> )
> 
> 
> *(VRM comparison table snipped; see the post above)*
> 
> At the same power load, the less capacity (# of phases * DrMOS Current) a card has, the more heat it generates. Cards at the top of this list have the most power capacity, and cards at the bottom of this list have the least power capacity.
> 
> I'm planning to get a 4090 in the next months, so I hope this also helps someone to choose a card





bmagnien said:


> One note about this, is that it's not as simple as multiplying # of phases by mos current rating, as the actual current may not match the stated rating (Buildzoid expands on this on his FE breakdown). So essentially, the 70A OnSemi FDMF3170 on the Strix may be higher than the 70A Monolithic MP86957 on the FE and Suprim. SO take 'total gpu current' ranking with a grain of salt.


Also, the generated heat can be offset by better cooling, so a "lower-capacity" card, for example the GB Gaming OC, may actually end up with better real-life temperatures than a Suprim X simply because the GB's cooler is better.

Plus, beyond a certain threshold of current capacity, it is far more relevant to evaluate the I/O filtering used and the properties/type of the capacitors.

For example, the Trio has much better filtering than the Zotac AIRO and is therefore a better card PCB-wise (crappy power limit though, but that's another subject).


----------



## RaMsiTo

I have seen 3 suprim x and the afterburner curve is different in each one.

2955 1.1v
2880 1.1v
2835 1.1v

different quality of silicon?


----------



## bottjeremy

Panchovix said:


> Comparison table of some VRMs, took from reddit (here
> 
> __
> https://www.reddit.com/r/nvidia/comments/y5x5pg/_/isn62q2
> )
> 
> 
> *(VRM comparison table snipped; see the post above)*
> 
> At the same power load, the less capacity (# of phases * DrMOS Current) a card has, the more heat it generates. Cards at the top of this list have the most power capacity, and cards at the bottom of this list have the least power capacity.
> 
> I'm planning to get a 4090 in the next months, so I hope this also helps someone to choose a card


Just a data point here. I ran the Furmark stress test for exactly 10 minutes on my Gigabyte Windforce at stock settings. Temps seem fine. I would not be as concerned with the component makeup this generation; the coolers are so overbuilt that temps are kept in check. My memory overclocks to 24000MHz with no issues, core to over 3000MHz with no issues either. Heat is kept well in check while gaming.


----------



## Spiriva

Finally got the Strix delivered. (Yes, it's my son's bed; I don't like Paw Patrol that much.)

It boosted to 2940mhz with core voltage at 100% and power limit at 120%, core clock at +0


----------



## ReightNineSeven

Arizor said:


> sounds like an adapter issue, as you expect. Maybe grab one of the PSU cables from cable mod or similar to go direct from psu to gpu?


I will take a look at the CableMod options! I'm running it with an EVGA Supernova 1300W GT, which is brand new, so I can't imagine the PSU itself is the issue. Tried swapping out the 8-pin PCIe cables, but zero change in behavior as well.

Hoping this is just an issue with the adapter, rather than with the GPU itself. What I find odd however is that I am experiencing zero stability issues. Everything seems to work fine, I'm just locked down to 350w.


----------



## ZealotKi11er

RaMsiTo said:


> I have seen 3 suprim x and the afterburner curve is different in each one.
> 
> 2955 1.1v
> 2880 1.1v
> 2835 1.1v
> 
> different quality of silicon?


That seems strange. Both the Gigabytes G OC i tried boosted the same at stock 1.05v


----------



## MikeGR7

RaMsiTo said:


> I have seen 3 suprim x and the afterburner curve is different in each one.
> 
> 2955 1.1v
> 2880 1.1v
> 2835 1.1v
> 
> different quality of silicon?


Different stability-testing methods are the most logical explanation, though the silicon lottery is just as possible.


----------



## mirkendargen

RaMsiTo said:


> I have seen 3 suprim x and the afterburner curve is different in each one.
> 
> 2955 1.1v
> 2880 1.1v
> 2835 1.1v
> 
> different quality of silicon?


We should maybe start a spreadsheet for people to record this. My Gaming OC for example is 2895.


----------



## bottjeremy

Found a good torture-test game for the 4090: Rise of the Tomb Raider, max settings, DX12, SSAA 4x, no DLSS, 1440p. Let me know what you guys are seeing with your 4090s from a power perspective. My card is limited to 106%.


----------



## yzonker

mirkendargen said:


> We should maybe start a spreadsheet for people to record this. My Gaming OC for example is 2895.


Mine was only 2805. Doh. 









[Official] NVIDIA RTX 4090 Owner's Club

Is anyone applying an undervolt to get the best power performance? I've managed 1850/1000mv with +1000 on ram.

www.overclock.net


----------



## cletus-cassidy

bmagnien said:


> One note about this: it's not as simple as multiplying # of phases by MOS current rating, as the actual current may not match the stated rating (Buildzoid expands on this in his FE breakdown). So essentially, the 70A OnSemi FDMF3170 on the Strix may be higher than the 70A Monolithic MP86957 on the FE and Suprim. So take 'total gpu current' rankings with a grain of salt.


Perhaps at a minimum we could note which are smart power stages? Not perfect, but not a terrible proxy for power stage quality.


----------



## ReightNineSeven

ReightNineSeven said:


> Having some odd behavior with my 4090 Suprim Liquid X. No matter what game I test, performance seems to be noticeably lower than all benchmarks I am seeing online. 40 - 60 FPS in Control with DLSS off at 4k. GPU Usage consistently stays around 99%, so it is definitely using the resources available.
> 
> I noticed that my Wattage seems to be hard capped at 350w. All videos I see online show it hitting above 400 - 450 in the same scenarios. My performance also seems to match the 350w setting shown here:
> 
> __ https://twitter.com/i/web/status/1579842910086123521
> Has anyone run into this? I've verified all my PCI-E Power Cables are seated properly, reseated them, and reinstalled drivers. Regardless of this troubleshooting, I can't seem to get this to go above 350w.
> 
> View attachment 2576424



After doing some more digging, I'm leaning towards my GPU having shipped with the wrong BIOS. The Suprim Liquid X has a switch to swap between a gaming mode and a quiet mode. I tried both between restarts and there is zero difference in thermals and power draw.

I don't believe it's the 12VHPWR adapter, as that doesn't have a state corresponding to 350w; I am in between the 300W state and the 450W state. Right now a GPU BIOS issue is the only theory I have, but I'm still waiting on MSI support to respond.

Would anyone have any additional ideas on this?


----------



## GQNerd

ReightNineSeven said:


> After doing some more digging, I'm leaning towards maybe my GPU shipping with the wrong BIOS. The Suprim Liquid X has a switch to swap between a gaming mode and a quiet mode. Tried both between restarts and there is zero difference in thermals and power draw.
> 
> I don't believe its the 12vhpwr adapter as that doesn't have a state corresponding to 350w. I am inbetween the 300W state and the 450W state. Right now a GPU BIOS issue is the only theory I have, but I'm waiting on MSI support to respond still.
> 
> Would anyone have any additional ideas on this?


what's your PSU?..


----------



## kmellz

Anyone seen anything about the VRM on Gainward RTX 4090 Phantom / GS cards? Guru3d sadly didn't publish anything about that :/
And also the PNY cards


----------



## ReightNineSeven

Miguelios said:


> what's your PSU?..


EVGA Supernova 1300w GT. Tried different PCI-E Power cables as well but no change there.


----------



## HeadlessKnight

I backordered the pictureless regular ASUS TUF from B&H on Saturday. Not positive I will get anything from them, but we'll see. Once they return from their holidays I bet they will cancel most of the preorders 😂. I almost managed to grab a Zotac Trinity. But Zotac and MSI are a big no-no this generation.


----------



## Spiriva

I ran a few tests: put the core to +130 (3030 - 3045mhz) and the memory to +500. I didn't try any higher yet.


----------



## GQNerd

ReightNineSeven said:


> EVGA Supernova 1300w GT. Tried different PCI-E Power cables as well but no change there.


Okay.. I've gone through a few EVGA PSUs and run into some strange stuff before... So, this may sound like a dumb question, but I have to ask: are you using the original PCIE cables that came with the PSU? (Are ALL 4 original, or did you mix and match any?)

I've had problems when not using the original cables, and had to order more from EVGA.
(This was on the EVGA Supernova P2 1200w, and I'm currently on the Supernova P2 1600w)


----------



## AngryLobster

My Suprim X liquid stock VF curve is 2865/1.1v. Not really all that special, tops out 3015mhz.


----------



## jim2point0

zhrooms said:


> The reason for the 2080 Ti thread blowing up to millions of views was because of the affordability of the 2080 Ti, along with RTX coming out, there were an enormous amount of gamers who sought guidance overclocking for further gaming performance, the people here right now, most of them still talk about basic overclocking, almost no one is running water (or talk about doing it).


I appreciated the help I got here and on Discord for when I ran into issues with my 2080TI. I still have that with an AIO in my living room PC.

To this day I still check the owner threads for a breakdown of all the different models when new cards are released. True, it's disappointing that there's less reason to care regarding the 4090s. BUT while increased power limits are not really necessary, I think one reason people might want to check the thread or look up info is that some AIBs actually have a lower TDP than the FE. Some are capped at 450-500w, whereas the FE can go to 600w. Still mostly going to be voltage limited but I'd rather have an FE knowing that.

Sadly for me it's about picking one that will fit my case, and I have some unique space restrictions since it's a vertical mount only for GPUs. I'm not buying a new case. So this is one of those rare scenarios where card dimensions also matter to me. The FE is pretty much the only model I can use other than the water-cooled versions, but I'd prefer the FE anyway. It seems damn near impossible to get, though.

This is a very disappointing launch so far 😔


----------



## ReightNineSeven

Miguelios said:


> Okay.. I've gone through a few EVGA PSU's and ran into some strange stuff before... So, this may sound like a dumb question, but I have to ask. Are you using the original PCIE cables that came with the PSU? (Are ALL 4 original, or any that you mixed and matched?)
> 
> I've had problems when not using the original cables, and had to order more from EVGA.
> (This was on the EVGA Supernova P2 1200w, and I'm currently on the Supernove P2 1600w)


All 4 are the originals from the PSU, yeah. I picked the PSU up brand new right before getting the 4090. Tempted to try to return it for a replacement, but I'm not fully convinced it's a PSU issue given the current behavior.


----------



## GQNerd

ReightNineSeven said:


> All 4 are the originals from the PSU, yeah. Just picked the PSU up brand new right before getting the 4090. Tempted to try and return it for a replacement, but I'm not fully convinced its a PSU issue with the current behavior


Ok.. You mentioned something about thinking the power limits for your VBIOSes were incorrect; did you check in GPU-Z? (Should be 530w for the Gaming vbios, I believe)..

If it all looks good.. gotta follow the process of elimination.


You should install your GPU in a friend's computer and see if you have the same issue...
If it works fine in friend's PC, the issue MAY be your Power Supply.
If you have the same problem on friend's PC, it's the CARD.

Good luck Bro


----------



## yzonker

ReightNineSeven said:


> EVGA Supernova 1300w GT. Tried different PCI-E Power cables as well but no change there.


You might post a screenshot of GPUZ with Control running so we can see all of the sensors. HWINFO with everything expanded in the GPU section would be better yet.


----------



## kx11

Anyone want to take a crack at this situation???


----------



## changboy

I just saw them sell out of around 15 MSI RTX 4090 Trios online!
I didn't buy one!


----------



## BigMack70

Anyone know how much performance is left on the table at 4k when using a 9900ks instead of a 12900k?


----------



## Xavier233

BigMack70 said:


> Anyone know how much performance is left on the table at 4k when using a 9900ks instead of a 12900k?


Easiest way to know: run a game on your PC, then compare it with a YouTube vid of the same game at 4K on a 12900K.


----------



## J7SC

yzonker said:


> You're losing the most by being 10-15C warmer than a lot of the scores that have been posted.


...I know  ...we have a very warm period here, w/ambient around 24 C. I am however satisfied that this card has great potential once w-cooled.


----------



## J7SC

Carillo said:


> 1100mV


...3 GHz is nothing to sneeze at; what temps (incl. Hotspot, VRAM) during a full bore test run w/ max PL etc ?


----------



## LuckyImperial

ReightNineSeven said:


> I will take a look at the CableMod options! Running it with a EVGA Supernova 1300w GT which is brand new, so I can't imagine the PSU is the issue itself. Tried swapping out the 8 pin PCI E cables but zero change in behavior as well.
> 
> Hoping this is just an issue with the adapter, rather than with the GPU itself. What I find odd however is that I am experiencing zero stability issues. Everything seems to work fine, I'm just locked down to 350w.


The adapter offers 450W/600W depending on which sense pins are getting signals, so the fact that you're capped at 350W doesn't lead me to believe the adapter is causing your issues.

This new style of adapter is bittersweet. It's cool they offer it, and they built some logic into it, but it sucks having to fight through teething issues. I'm sure a lot of people would rather just have four 8-pin headers on the card.

I'll be watching your issue for a resolution. It seems like a BIOS problem to me, or maybe a VRM issue.

I'm still trying to get a hold of an MSI Liquid X for my SFF build. Crossing my fingers for some new inventory to start flowing in this week.
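For context on why 350W matches no adapter state: the 12VHPWR sideband sense pins only advertise four defined power levels (per my reading of the ATX 3.0 / PCIe 5.0 connector spec; treat the exact mapping as an assumption). A quick sketch:

```python
# 12VHPWR sideband sense pins -> initial power limit advertised to the card.
# Mapping per my reading of the ATX 3.0 spec: each sense pin is either
# grounded or left open, giving exactly four defined levels.
SENSE_LIMITS_W = {
    ("gnd", "gnd"): 600,
    ("gnd", "open"): 450,
    ("open", "gnd"): 300,
    ("open", "open"): 150,
}

def advertised_limit(sense0: str, sense1: str) -> int:
    """Return the advertised wattage for a (SENSE0, SENSE1) pin state."""
    return SENSE_LIMITS_W[(sense0, sense1)]

if __name__ == "__main__":
    print(advertised_limit("gnd", "gnd"))  # both grounded -> 600
    print(350 in SENSE_LIMITS_W.values())  # 350W matches no defined state
```

Which is exactly the point above: a card stuck at 350W is sitting between the 300W and 450W states, so a mis-seated sense pin alone doesn't explain it.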


----------



## Nizzen

3 ghz , 1.05v +1000 memory. Not forced rebar. 
Looks like average result.


----------



## Zogge

kmellz said:


> Anyone seen anything about the VRM on Gainward RTX 4090 Phantom / GS cards? Guru3d sadly didn't publish anything about that :/
> And also the PNY cards


I would really like to know as well, as I am thinking about returning my Palit GameRock OC for a Gainward Phantom. (Edit: after seeing the Buildzoid video; but perhaps I should keep it anyway...?) Anything else is difficult to get in Sweden right now; maybe a Zotac could be snagged, but that is it.

The Palit GameRock OC is rock stable at 3000mhz in all games, though. Is that good or bad? +100 voltage, 111% power, +165 core, +1000 mem. 61-68 degrees on the GPU after a 1 h gaming session.


----------



## Madness11

Zogge said:


> I would really like to know as well as I am thinking about returning my palit gamerock oc for a gainward phantom. (Edit: after seeing the buildzoid video, but perhaps I should keep it anyway...?) Anything else is difficult to get in Sweden right now, maybe a Zotac could by snagged also but that is it.
> 
> The palit gamerock oc is rock stable at 3000mhz in all games though is that good or bad ? +100 voltage, 111% power, +105 core, +1000 mem. 61-68 degress in gpu after a 1 h gaming session.


It's good, but I'd guess an average result. My Palit manages +185 core, +1500 memory, and temps are good. Up to you :) but my friend got unlucky with a Strix; it crashes anywhere past 3000 (silicon lottery).


----------



## Nizzen

3060mhz core, 1300mhz mem. This is the max for my 4090 Tuf.

Need to force rebar soon to cross 28k


----------



## J7SC

Nizzen said:


> 3060mhz core, 1300mhz mem. This is the max for my 4090 Tuf.
> 
> Need to force rebar soon to cross 28k
> View attachment 2576537


Nice ! - what ambient and Hotspot temps ?


----------



## Nizzen

J7SC said:


> Nice ! - what ambient and Hotspot temps ?


59c max on core. Hotspot I don't know.


----------



## ZealotKi11er

It seems the cards with higher core do less memory. I think higher memory is better this gen. 
2970/1500 > 3054/1250


----------



## BigMack70

Xavier233 said:


> Easiest way to know: run a game on your PC, compare it with a vid on youtube on 4K using a 12900K


Just got my Gigabyte Windforce 4090 installed... let the testing begin!

First impression - WOW this card is huge. I didn't believe it could be that much larger than the 3090 FE, but it sure is. Never thought an O11D XL case could look so tiny. This thing barely fits.


----------



## Falkentyne

yzonker said:


> A little different config. Closer to 600w. This card seems to be the same as many/almost all 30 series cards. Depending on what you run, internal limits can be hit and it will limit below 600w. I wish some other people would do this test for comparison. Curious if any of the other cards are better. Reviews all led us to believe the FE is better for one.
> 
> View attachment 2576177



Test Path of Exile with GI Shadows=Ultra with VSync off and see if you hit internal rail limits there.


----------



## BigMack70

Any idea why MSI Afterburner would show 106% as my maximum power target instead of the expected 133%?


----------



## fever3O8

Anyone else have issues with no video output in the BIOS? I can launch into windows just fine.


----------



## cletus-cassidy

Falkentyne said:


> Test Path of Exile with GI Shadows=Ultra with VSync off and see if you hit internal rail limits there.


Welcome!


----------



## yzonker

Jay rediscovered that 2GB GDDR6X chips don't like to be cold. Lol.


----------



## zhrooms

jim2point0 said:


> Sadly for me it's about picking one that will fit my case, and I have some unique space restrictions since it's a vertical mount only for GPUs. I'm not buying a new case. So this is one of those rare scenarios where card dimensions also matters to me. So the FE is pretty much the only model I can use other than the water cooled versions, but I'd simply prefer the FE. But the FE seems damn near impossible to get.


Yeah, I try to include both "card" length (cooler) and cooler height (slots), but the manufacturers are incompetent, so the advertised slot dimensions/requirements were wrong half the time and I had to eyeball it and guess on many cards.

Here is the cheat sheet I made


Spoiler



~40mm 2.00 slot 
~44mm 2.20 slot 
~45mm 2.25 slot 
~46mm 2.30 slot 
~47mm 2.35 slot 
~48mm 2.40 slot 
~49mm 2.45 slot 
~50mm 2.50 slot 
~51mm 2.55 slot 
~52mm 2.60 slot 
~53mm 2.65 slot 
~54mm 2.70 slot 
~55mm 2.75 slot 
~56mm 2.80 slot 
~57mm 2.85 slot 
~58mm 2.90 slot 
~59mm 2.95 slot 
~60mm 3.00 slot 
~61mm 3.05 slot 
~62mm 3.10 slot 
~63mm 3.15 slot 
~64mm 3.20 slot 
~65mm 3.25 slot 
~66mm 3.30 slot 
~67mm 3.35 slot 
~68mm 3.40 slot 
~69mm 3.45 slot 
~70mm 3.50 slot
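The whole cheat sheet reduces to one formula: slot count is cooler height in mm divided by the ~20mm one expansion slot occupies. A minimal sketch, assuming that 20mm-per-slot convention:

```python
# Convert cooler height in mm to slot width, using the ~20mm-per-slot
# convention behind the cheat sheet above.
def mm_to_slots(height_mm: float) -> float:
    SLOT_MM = 20.0  # one expansion slot is roughly 20mm tall
    return round(height_mm / SLOT_MM, 2)

if __name__ == "__main__":
    for mm in (40, 55, 61, 70):
        print(f"~{mm}mm -> {mm_to_slots(mm)} slot")  # 2.0, 2.75, 3.05, 3.5
```

Handy when a manufacturer lists only the cooler height in mm and you need to sanity-check their advertised slot requirement.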


I guess I'll start working on the list today, someone linked this list which is a good place to start.


----------



## changboy

I can still buy an MSI 4090 Gaming Trio. Would you buy that one?
Coz there's no waterblock for that one, grrr.


----------



## mirkendargen

changboy said:


> I still can buy a msi 4090 gaming trio, will you buy that one ?
> Coz no waterblock for that one grrr.


It's the same PCB as the Suprim, someone will make one.


----------



## changboy

It's not the Gaming X Trio, just the Gaming Trio; I don't know if the PCB is the same?
Just 3x 8-pin on the box.


----------



## LuckyImperial

BigMack70 said:


> Any idea why MSI Afterburner would show 106% as my maximum power target instead of the expected 133%?


Well, only a few select cards have BIOSes that allow the full 133% (600W) power setting.









MSI GeForce RTX 4090 Suprim Liquid X Review


With their GeForce 40 Series, MSI is introducing a liquid cooling solution that's pre-filled and maintenance free. While other cards take up three or four slots in your system, the Suprim Liquid X is only dual-slot. Our review confirms: noise levels are fantastic, even the pump is inaudible in idle.




www.techpowerup.com





Do you have a Colorful card? That one is limited to 7%.

Ironically, the MSI Suprim seems to have the beefiest VRM design, but the BIOS limits power setpoints to +10% (530W).

Edit: However, people aren't finding a lot of performance at higher power settings. Heck, some people are choosing a 70% power limit because it has only a minor impact on performance (~5% less). Also, people are hopeful new BIOSes may be released with higher power limits for some cards.
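For reference, those slider percentages are just multipliers on the card's base TDP; the base figures below are my assumptions for illustration (450W FE-style base, 480W Suprim base), not vendor specs:

```python
# Power-limit slider percentage -> watts, given a card's base TDP.
# The base TDP values in the demo are assumptions, not vendor specs.
def power_watts(base_tdp_w: float, slider_pct: float) -> float:
    return base_tdp_w * slider_pct / 100.0

if __name__ == "__main__":
    print(power_watts(450, 106))  # a 106%-capped card on a 450W base -> 477.0
    print(power_watts(450, 133))  # 133% of 450W -> 598.5 (~the 600W figure)
    print(power_watts(480, 110))  # +10% on a 480W base -> 528.0 (~530W)
```

So a "106% max" card and a "133% max" card differ by well over 100W of headroom, even if neither gains much performance from it.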


----------



## mirkendargen

changboy said:


> Its not the gaming x trio, just the gaming trio, i dont know if the pcb is the same ?
> Just 3x 8 pins the box.


I think the Gaming Trio/Gaming X Trio are the same card, just with or without a factory OC BIOS. Like TUF/TUF OC.


----------



## yzonker

Falkentyne said:


> Test Path of Exile with GI Shadows=Ultra with VSync off and see if you hit internal rail limits there.


Yup, at least the PL is high enough that it doesn't limit very much. Obviously fine for gaming.


----------



## GAN77

Colorful iGame RTX 4090 Neptune - 630 W





七彩虹iGame RTX 4090 Neptune OC显卡评测：翻江倒海的浪里白条 - 超能网


超能网（Expreview）专注于为主流科技产品提供全新视角的资讯，专注于100%高价值原创内容的创造，专注于最真实的体验式报道。




expreview.com


----------



## BigMack70

LuckyImperial said:


> Well, only a few select cards have BIOS' that allow full 133% (600W) power settings.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI GeForce RTX 4090 Suprim Liquid X Review
> 
> 
> With their GeForce 40 Series, MSI is introducing a liquid cooling solution that's pre-filled and maintenance free. While other cards take up three or four slots in your system, the Suprim Liquid X is only dual-slot. Our review confirms: noise levels are fantastic, even the pump is inaudible in idle.
> 
> 
> 
> 
> www.techpowerup.com
> 
> 
> 
> 
> 
> Do you have a Colorful card? That is limited to 7%.
> 
> Ironically, the MSI Suprim seems to have the beefiest VRM design but BIOS is limiting power setpionts to +10% (530W).


That must be it. I have a Gigabyte Windforce card. I don't really care about a higher power limit given the wealth of evidence suggesting that the performance gains above 450W are negligible. Just wanted to make sure I didn't have something set up incorrectly. 

What kinds of memory clocks are people getting? Initial tests on my card look like it's going to perform at about 2.9GHz core clock with a +1500 memory offset


----------



## MIST3RST33Z3

When I max out fan speed on the GPU and case fans, and turn on my A/C, I can run at 3110, but it tapers down to 3090 as the temperatures increase. I think there is more in it once I get a good waterblock for it. This is with an Asus TUF. Definitely am voltage limited at this point. Will try some more runs later this evening.


----------



## kx11

GAN77 said:


> Colorful iGame RTX 4090 Neptune - 630 W
> 
> 
> 
> 
> 
> 七彩虹iGame RTX 4090 Neptune OC显卡评测：翻江倒海的浪里白条 - 超能网
> 
> 
> 超能网（Expreview）专注于为主流科技产品提供全新视角的资讯，专注于100%高价值原创内容的创造，专注于最真实的体验式报道。
> 
> 
> 
> 
> expreview.com



Beautiful card but this is scary


----------



## Falkentyne

New game for you to mess around with making your 4090's do some work.









A Plague Tale: Requiem on Steam


Embark on a heartrending journey into a brutal, breathtaking world, and discover the cost of saving those you love in a desperate struggle for survival. Strike from the shadows or unleash hell with a variety of weapons, tools and unearthly powers.




store.steampowered.com


----------



## Rbk_3

Damn you guys and your memory OCs. I start artifacting at around +1000.


----------



## tubs2x4

Rbk_3 said:


> Damn you guys and your memory OCs. I start artifacting around over +1000.


For synthetic benchmarks you will see a gain, but in games, going to +1000 on my 3070 was worth like 1 fps, and that could have been margin of error. +500 on mem does plenty.


----------



## Daemon_xd

GAN77 said:


> Colorful iGame RTX 4090 Neptune - 630 W
> 
> 
> 
> 
> 
> 七彩虹iGame RTX 4090 Neptune OC显卡评测：翻江倒海的浪里白条 - 超能网
> 
> 
> 超能网（Expreview）专注于为主流科技产品提供全新视角的资讯，专注于100%高价值原创内容的创造，专注于最真实的体验式报道。
> 
> 
> 
> 
> expreview.com


It seems this card has the highest PL we have seen to date, and she has a 360mm AIO cooler, which seems nice. Will keep an eye on her.


----------



## J7SC

...I think I'll have to wait for the water-block for the G-GamingOC before more benching as it drops bins compared to shorter runs. Check the temps (incl. Hotspot) and PL-W max below, per red 'pyramid arrows'. I feel guilty about that Hotspot - thinking about 76.3 billion transistors @ 4 nm running this hot makes me queasy 🥴 ...on a brighter note, Cyberpunk 2077 is a blast even at stock GPU settings !


----------



## Dragonsyph

8472 said:


> He got access to a 13900k and ran time spy and firestrike.


Bro, that whole video is bull. Look at the game settings for the 13900K in the video vs the 12900K; the 13900K has higher settings. And for Fire Strike, how can a 13900K with 8 more cores and higher clocks only get 1% over a 12900K in a synthetic benchmark that scales? Yeah, I call BS.


----------



## GAN77

Daemon_xd said:


> It seems this card has the highest PL we have seen to date and she has 360 AIO cooler seems nice. Will keep an eye on her


The Neptune board is similar to the Colorful GeForce RTX 4090 Vulcan OC-V, which has 550.0 W


----------



## yt93900

Dragonsyph said:


> Bro that whole video is bull. Look at game settings for 13900k in video vs 12900k. 13900k has higher settings. And for firestrike, how can a CPU 13900k with 8 more cores and higher clocks only get 1% over 12900k in a synthetic benchmark that scales? Ya i call bS.






This bloke has a 13900K ES chip; if the IPC is correct then the gains will probably be marginal.


----------



## Mad Pistol

4090 gets here tomorrow. Less than a week after launch!!!

Don't think I've ever been this early for a new GPU launch. Done it a couple of times with a phone, but never with a bleeding-edge GPU.


----------



## morph.

You can only take ES chips with a grain of salt though... But yeah, I'll be sad if it's the same...


----------



## morph.

Dragonsyph said:


> Bro that whole video is bull. Look at game settings for 13900k in video vs 12900k. 13900k has higher settings. And for firestrike, how can a CPU 13900k with 8 more cores and higher clocks only get 1% over 12900k in a synthetic benchmark that scales? Ya i call bS.


he said in the video he disabled e-cores


----------



## yt93900

Mad Pistol said:


> 4090 gets here tomorrow. Less than a week after launch!!!
> 
> Don't think I've ever been this early for a new GPU launch. Done it a couple of times with a phone, but never with a bleeding-edge GPU.


Bet you'll like it, there aren't really any bad models this gen so far and the performance is amazing.


----------



## Dragonsyph

yt93900 said:


> This bloke has a 13900K ES chip, if the IPC is correct then the gains will probably marginal.


That may be so; I'm only pointing out how that guy skews results in a lot of videos by not using the same settings. His face gets on my nerves lol.


----------



## coelacanth

Delete


----------



## MikeGR7

BigMack70 said:


> Any idea why MSI Afterburner would show 106% as my maximum power target instead of the expected 133%?


You bought the wrong card 😁

Jokes aside, tell us which model.

Edit: Just saw you have Windforce.


----------



## Dragonsyph

coelacanth said:


> He said he disabled the e-cores.


Then why in the video is he telling people it's a garbage CPU and it's not faster than a 12900K when both are fully tuned? He went on to talk about how he's a tuning god. lol /facepalm


----------



## BigMack70

MikeGR7 said:


> You bought the wrong card 😁
> 
> Jokes aside, tell us which model.


Gigabyte Windforce. 

Pretty happy with it so far. 2900 core / +1500 mem looking stable. Card is very quiet, and I can't detect any coil whine.


----------



## bottjeremy

BigMack70 said:


> That must be it. I have a Gigabyte Windforce card. I don't really care about a higher power limit given the wealth of evidence suggesting that the performance gains above 450W are negligible. Just wanted to make sure I didn't have something set up incorrectly.
> 
> What kinds of memory clocks are people getting? Initial tests on my card look like it's going to perform at about 2.9GHz core clock with a +1500 memory offset


This is what my Windforce is attaining: 3045mhz core clock and 24000mhz memory (depending on the game).
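Worth noting for anyone comparing numbers: on GDDR6X an Afterburner memory offset appears to count double toward the effective data rate, so +1500 on the 4090's 21000mhz (21 Gbps) base lands at the 24000mhz figure quoted here. A rough sketch; the 2x mapping is my inference from the numbers in this thread, not a documented spec:

```python
# Effective GDDR6X memory clock from an MSI Afterburner offset.
# Assumptions: the AB offset counts twice toward the effective (DDR) rate,
# and the 4090's stock effective clock is 21000 MHz (21 Gbps).
STOCK_EFFECTIVE_MHZ = 21000

def effective_mem_clock(ab_offset_mhz: int) -> int:
    return STOCK_EFFECTIVE_MHZ + 2 * ab_offset_mhz

if __name__ == "__main__":
    print(effective_mem_clock(1500))  # +1500 in AB -> 24000 MHz
    print(effective_mem_clock(1000))  # +1000 in AB -> 23000 MHz
```

That would also explain why a "+1500" card and a "24000mhz" card are reporting the same overclock.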


----------



## Arizor

Anyone tried GPU tweak 3 on their ASUS? 

It's pretty nice, offers a solid VF tuner above 3ghz, but the memory clocks are absolutely bizarre. It "starts" (for the TUF at least) saying it's at 24000mhz baseline, which it isn't (in-game shows 21000), and then any overclocking just doesn't seem to take effect.


----------



## MikeGR7

Dragonsyph said:


> Then why in the video is he telling people it's a garbage cpu and it not faster than a 12900k when both are full tuned? He went on to talk about how hes a tuning god. lol /facepalm


Chill man, he just said that the gains are minimal, around 4-5%, and it's true.
You need to pay attention to the test conditions: his 12900KS runs 5.3 all-core and the 13900K was at 5.5.

Many games stop scaling with more Hz, and some CPU architectures hit a point of diminishing returns past a certain clock.

He did say that if his 12900K/KS (we all know it's the same) was @stock then the 13900K would show a much bigger percentage improvement.

As for his tuning abilities, good or not, he applies the same to all models, so it's comparable.

I have never caught him using sneaky settings in his comparisons; if you have proof then point me to it.

I have no idea about the Asian guy, I am talking about Frame Chasers.

Not liking someone's face is irrelevant.


----------



## BigMack70

bottjeremy said:


> This is what my Windforce is attaining. 3045mhz core clock and 24000mhz memory (depending on game)
> View attachment 2576565


Mine is about the same, except you got +20MHz on the core over my card - I think anything above a +200 core offset is unstable for me.

I'm impressed this card has dual BIOS as well - not a feature I usually expect on the base SKU. May eventually play around with flashing an OC bios.


----------



## bottjeremy

BigMack70 said:


> Mine is about the same except you got +20MHz on the core over my card - I think anything above +200 core offset is unstable for me.
> 
> I'm impressed this card has dual BIOS as well - not a feature I usually expect on the base SKU. May eventually play around with flashing an OC bios.


Card seems to have plenty of headroom as temps are so low. If they allow any extra voltage, that will do the trick.


----------



## MikeGR7

GAN77 said:


> Плата нептуна аналогична Colorful GeForce RTX 4090 Vulcan OC-V , у которого 550.0 W
> Neptune board is similar to Colorful GeForce RTX 4090 Vulcan OC-V which has 550.0 W


Bro, just read the review.

It may be the same PCB, but this one has a 630W power limit.
They literally show the screenshot from GPU-Z.


----------



## jim2point0

Seems that with a small voltage increase, the power draw shoots up to 600w easily. So maybe that's why it's locked where it is?


----------



## J7SC

BigMack70 said:


> Mine is about the same except you got +20MHz on the core over my card - I think anything above +200 core offset is unstable for me.
> 
> I'm impressed this card has dual BIOS as well - not a feature I usually expect on the base SKU. May eventually play around with flashing an OC bios.


...Gigabyte seems to have bet early on various 4090 models, perhaps because of what AMD RDNA3 might bring and/or the economy slowing. My Gaming OC was priced at US$1618, so near entry-level, apart from being the only card available locally. I wanted at least one card that can do DLSS3 and get close to my monitor's 4K 120. Once past 450W or so, it's all down to cooling, no matter the brand or model.

BTW, does your Windforce warranty card say 'Gigabyte' or 'Aorus' ?


----------



## Carillo

J7SC said:


> ...3 GHz is nothing to sneeze at; what temps (incl. Hotspot, VRAM) during a full bore test run w/ max PL etc ?


Strange thing: the card runs fine at 2940mhz @1005mV, but maxes out at 2984mhz @1100mV. Memory seems strong, +1700; it can go higher but starts scaling negative. 52c core and 61c hotspot.









I scored 28 279 in Port Royal


Intel Core i5-12600K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com






Edit: My Windforce does 3105mhz core, but only has a 480w bios, so no point in PR. It's power limited at 1100mV. My Strix arrives tomorrow or Wednesday.


----------



## J7SC

Carillo said:


> Strange thing, the card runs fine 2940mhz @1005mV, but max 2984mhz @1100mV. Memory seems strong, +1700, can go higher but starts scaling negative. 52c core and 61c hot spot.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 279 in Port Royal
> 
> 
> Intel Core i5-12600K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


Nice ! My temps are 13 C higher :-(, though ambient is unusually warm here for this time of year.


----------



## Mad Pistol

Ugh, y'all are killing me. I can't wait to have the card tomorrow!


----------



## Carillo

J7SC said:


> Nice ! My temps are 13 C higher :-(, though ambient is unusually warm here for this time of year.


Yeah, it’s always cold in Norway. Even inside I’m wearing a coat 😂


----------



## BigMack70

J7SC said:


> BTW, does your Windforce warranty card say 'Gigabyte' or 'Aorus' ?


Everything just says "Gigabyte" - they didn't put any "Aorus" branding on this card.

I was initially a little disappointed I wasn't fast enough to snag the Aorus Waterforce card, and I was worried these would be loud, but this card is substantially quieter than my 3090 FE was.

I appreciate the GPU support mount they included... this is now the first time I've ever used a level when installing a new GPU 🤣


----------



## J7SC

Carillo said:


> Yeah, it’s always cold in Norway. Even inside I’m wearing a coat 😂


W.Canada isn't exactly the Sahara, either, though the last few years are getting weirder and weirder. There are still forest fires raging south of here (Washington State) and West of here (Vancouver Island), and benching an air-cooled card at ~ 24 + C isn't fun. Neither is breathing the outside air because of the fires...
---
In other news, the Vancouver store where several folks apart from myself got their 4090 is now sold out (ditto for their main competitors).



BigMack70 said:


> Everything just says "Gigabyte" - they didn't put any "Aorus" branding with this card.
> 
> I was initially a little disappointed I wasn't fast enough to snag the Aorus Waterforce card, because I was worried these would be loud, but this card is substantially more silent than my 3090 FE card was.
> 
> I appreciate the GPU support mount they included... this is now the first time I've ever used a level when installing a new GPU 🤣


Thanks. I asked because my Gaming OC actually has an Aorus(-only) warranty card. At the end of the day, a custom water-block is the way to go for most models, IMO, unless it is 'only' for gaming and the occasional benching.


----------



## yzonker

J7SC said:


> Nice ! My temps are 13 C higher :-(, though ambient is unusually warm here for this time of year.


This is why you need a chiller. It works as an a/c unit too if you leave the rad fans on in your case.


----------



## KedarWolf

Mad Pistol said:


> Ugh, y'all are killing me. I can't wait to have the card tomorrow!


I have to wait for a month for my Strix OC.


----------



## DokoBG

My 4090 coming on Wednesday - same day the Uncharted bundle for PC gets released - so stoked !!!!


----------



## xcx xcxvgyt

Palit GameRock (non-OC) with 450W and +0W of power-limit headroom

I scored 28 249 in Port Royal 

Who made a card with zero power limit adjustment in 2022?? 

Thanks Palit.


----------



## AngryLobster

How are you guys cool with dumping 500W+ of heat into your room? Gaming with this thing OC'd balls to the wall raises my room temp significantly. I've gone down to 2700MHz at 0.925V for stock performance but only 360W draw in the worst case, and usually quite a bit lower than that.

140W+ for 2.5 FPS and a sauna? No thanks.


----------



## morph.

AngryLobster said:


> How are you guys cool with dumping 500w+ heat into your room? Gaming with this thing OC'd balls to the walls raises my room temp significantly. I've gone down to 2700mhz at 0.925v for stock performance but only 360w draw in worst case and usually quite a bit lower than that.
> 
> 140w+ for 2.5FPS and sauna, no thanks.


air con?


----------



## morph.

J7SC said:


> Nice ! My temps are 13 C higher :-(, though ambient is unusually warm here for this time of year.


Hrmm, I can't seem to push my memory that high or my core clock much higher without the PR session crashing. Getting jealous, not being able to push past 28k in PR...


----------



## Mad Pistol

KedarWolf said:


> I have to wait for a month for my Strix OC.


Damn dude. I feel that one.


----------



## yt93900

xcx xcxvgyt said:


> Palit Gamerock(non oc) with 450w+0w
> 
> I scored 28 249 in Port Royal
> 
> Who made a card with zero power limit adjustment in 2022??
> 
> Thanks Palit.


Don't you dare complain, it performs better than a Waterforce Xtreme OC'd with a ~510W PL. How yours does it, no idea.










----------



## BTK

I was able to order a Zotac Trinity 4090, as that's all I could get my hands on. Is there anything I should know? Is it an alright one?

I do like the 5 year warranty


----------



## Arizor

Port Royal seems to run a bit weird on these cards, or perhaps it's the drivers. It's the only test that crashes on me at all.

I can run every other test, I can game for hours, but Port Royal consistently crashes.


----------



## dr/owned

AngryLobster said:


> How are you guys cool with dumping 500w+ heat into your room? Gaming with this thing OC'd balls to the walls raises my room temp significantly.


HVAC even without air conditioning should be able to handle a single GPU no problem. If 140W is make-or-break you've got other issues to address with your location more than it being a fault of the equipment.

This is also where remote watercooling can help if you care enough about being an "enthusiast".


----------



## Xavier233

Hi all, for those with the Gigabyte Gaming OC, I need your help! 

What is the minimum fan-curve setting (in RPM or percentage) at which the fans run at a steady speed?

I set mine to 45% and right now the fans continuously drop to zero, ramp back up to 45%, then go back to zero, and so on.


----------



## yt93900

Could it be that it's fighting its own fan controls? I can force my WFX to 100% fan speed in MSI AB and yet under load it will start spinning down, and up, and down.

New HS, still very heavily power limited.












Crazy how it pretty much doubles the 3090's score.


----------



## BigMack70

Well dang... I'm going to have to think more seriously about a CPU upgrade than I thought. 

Comparing data to the 12900K + 4090 results in this video:


Spoiler











*His results are, with a tuned 12900K and stock 4090, the following at 4k:*
Shadow of the Tomb Raider: 187 FPS
Horizon Zero Dawn: 160 FPS
Cyberpunk: 82 FPS
Forza Horizon 5: 171 FPS

*My results, with a 5.2 GHz 9900KS and tuned DDR4-4000 memory, are the following at 4k at identical settings to his video (overclocked 4090 results in parenthesis):*
SOTR: 178 FPS (190 OC)
HZD: 139 FPS (144 OC)
CP2077: 79 FPS (84 OC)
FH5: 124 FPS (128 OC)

So yeah, the 9900K bottlenecks some games by a large margin. The uber demanding RT titles won't matter; they're GPU bound no matter what. And my 4k screen is 120Hz so no problems yet. But I also have the rig hooked up to a 5120x1440 240Hz screen and that will definitely be a lot more CPU limited than I expected with this card.

Hmmm... guess I'll have to put some thought into this, when I expected it to be an easy "wait a few more years before upgrading" situation.

The 4090 is crazy fast.
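Just to quantify the gap, here's a quick sketch in Python using the numbers as quoted above (the 12900K figures are from the video; mine are from my own stock-4090 runs):

```python
# Percent deficit of the 9900KS results vs the quoted 12900K results,
# both with a stock 4090 at 4K. Numbers copied from the comparison above.
results_12900k = {"SOTR": 187, "HZD": 160, "CP2077": 82, "FH5": 171}
results_9900ks = {"SOTR": 178, "HZD": 139, "CP2077": 79, "FH5": 124}

for game, fast in results_12900k.items():
    slow = results_9900ks[game]
    deficit = (fast - slow) / fast * 100
    print(f"{game}: {deficit:.0f}% behind")
```

HZD (~13%) and FH5 (~27%) are the two where the old CPU really shows its age; the RT-heavy titles barely move.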


----------



## Xavier233

BigMack70 said:


> Well dang... I'm going to have to think more seriously about a CPU upgrade than I thought.
> 
> Comparing data to the 12900K + 4090 results in this video:
> 
> 
> Spoiler
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *His results are, with a tuned 12900K and stock 4090, the following at 4k:*
> Shadow of the Tomb Raider: 187 FPS
> Horizon Zero Dawn: 160 FPS
> Cyberpunk: 82 FPS
> Forza Horizon 5: 171 FPS
> 
> *My results, with a 5.2 GHz 9900KS and tuned DDR4-4000 memory, are the following at 4k at identical settings to his video (overclocked 4090 results in parenthesis):*
> SOTR: 178 FPS (190 OC)
> HZD: 139 FPS (144 OC)
> CP2077: 79 FPS (84 OC)
> FH5: 124 FPS (128 OC)
> 
> So yeah, the 9900K bottlenecks some games by a large margin. The uber demanding RT titles won't matter; they're GPU bound no matter what. And my 4k screen is 120Hz so no problems yet. But I also have the rig hooked up to a 5120x1440 240Hz screen and that will definitely be a lot more CPU limited than I expected with this card.
> 
> Hmmm... guess I'll have to put some thought into this, when I expected it to be an easy "wait a few more years before upgrading" situation.
> 
> The 4090 is crazy fast.


Well, anything above your screen refresh won't matter. Then, for anything below your refresh rate, like Cyberpunk 2077, I don't think you will ever notice a difference between 79 and 82 FPS. Looks like for most games it won't matter much.


----------



## BigMack70

Xavier233 said:


> Well anything above your screen refresh wont matter. Then, for anything below your refresh rate, like CyberPunk 2077, I dont think u will ever notice a difference between 79 and 82 FPS. Looks like for most games it wont matter much


Yeah, the only issue is my secondary screen (a Neo G9)... When I play on that, a new CPU would help.


----------



## Arizor

BigMack70 said:


> Well dang... I'm going to have to think more seriously about a CPU upgrade than I thought.
> 
> Comparing data to the 12900K + 4090 results in this video:
> 
> 
> Spoiler
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *His results are, with a tuned 12900K and stock 4090, the following at 4k:*
> Shadow of the Tomb Raider: 187 FPS
> Horizon Zero Dawn: 160 FPS
> Cyberpunk: 82 FPS
> Forza Horizon 5: 171 FPS
> 
> *My results, with a 5.2 GHz 9900KS and tuned DDR4-4000 memory, are the following at 4k at identical settings to his video (overclocked 4090 results in parenthesis):*
> SOTR: 178 FPS (190 OC)
> HZD: 139 FPS (144 OC)
> CP2077: 79 FPS (84 OC)
> FH5: 124 FPS (128 OC)
> 
> So yeah, the 9900K bottlenecks some games by a large margin. The uber demanding RT titles won't matter; they're GPU bound no matter what. And my 4k screen is 120Hz so no problems yet. But I also have the rig hooked up to a 5120x1440 240Hz screen and that will definitely be a lot more CPU limited than I expected with this card.
> 
> Hmmm... guess I'll have to put some thought into this, when I expected it to be an easy "wait a few more years before upgrading" situation.
> 
> The 4090 is crazy fast.


I wouldn't worry mate, also I think you watched the wrong bit of the vid for some of them - at 4K, his result for Forza is 138fps, so really not too far away from you. I get 133fps on my Ryzen 5900X / 4090, so you're really within the ballpark.


----------



## dr/owned

I'm not sure I understand where this "am I cpu bound" thing is coming from. Are you less than 200fps and on a platform less than 4 years old? -> You're not cpu bound.

Unless I'm missing where a new gpu somehow magically requires more cpu to feed it frames at the same fps as previous generations.

Are people just misinterpreting "really powerful gpu being under utilized leads to really high framerates"?


----------



## Xavier233

What's the max temp this GPU is rated for before it starts throttling?


----------



## Mad Pistol

dr/owned said:


> I'm not sure I understand where this "am I cpu bound" thing is coming from. Are you less than 200fps and on a platform less than 4 years old? -> You're not cpu bound.
> 
> Unless I'm missing where a new gpu somehow magically requires more cpu to feed it frames at the same fps as previous generations.
> 
> Are people just misinterpreting "really powerful gpu being under utilized leads to really high framerates"?


The problem with PC gaming is that you always want to be GPU limited. If you're not, you're going to experience undesired hitches/glitches. Both CPU and RAM limitations lead to stuttering, while a GPU limitation leads to smooth gameplay (as long as the framerate is high enough and you have enough VRAM).


----------



## bmagnien

dr/owned said:


> I'm not sure I understand where this "am I cpu bound" thing is coming from. Are you less than 200fps and on a platform less than 4 years old? -> You're not cpu bound.
> 
> Unless I'm missing where a new gpu somehow magically requires more cpu to feed it frames at the same fps as previous generations.
> 
> Are people just misinterpreting "really powerful gpu being under utilized leads to really high framerates"?


The new Spider-Man at 4K with all settings maxed, including RT and crowd size, is CPU-limited at around 113 FPS on the 4090, even with the fastest available CPUs.


----------



## alitayyab

According to this table (linked from the LTT forums), the Zotac AIRO has the better power delivery (1320W core, 220W mem) in terms of total power, compared to the TUF (1260W/200W), and is way better than the MSI (Trio X; 900W/200W) and Gigabyte (Gaming OC; 1000W/200W).

The AOS DrMOS stages on the Zotac are rated for 55A continuous and up to 80A peak, which puts them in the same ballpark as the MPS stages used on the FE/Asus cards (which also go up to 80A).

The only letdown on the Zotac is the power filtering.






RTX 4090 VRM meta-analysis and FE/AIB comparison


I've been watching a lot of content on the 4090s and with Buildzoid coming out with his two board analysis videos I wanted a quick reference to all 4090 cards in terms of VRM arrangements. It's interesting seeing the wild variance from the Halo cards (Suprim, STRIX, AIRO) and the OC cards (Trio, ...




linustechtips.com





TPU also showed pretty consistent power usage on the Zotac.
I already have an AIRO, with the return window closing fast. Confused whether to return it and grab a TUF or keep the Zotac.
The TUF is really the only alternative, as it's the only other card available here locally besides the Zotac and Palit.

All are priced about the same, with the TUF commanding a bit of a premium.
Edit: I only play games and don't do any sort of competitive overclocking. But I would like the card not to blow up around the time the warranty runs out.
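For what it's worth, those "core power delivery" watts in the table appear to just be stage count × per-stage continuous rating × ~1V core. A rough sketch (the stage counts below are my assumption, chosen purely to reproduce the table's numbers):

```python
# Back-of-envelope VRM capacity: stages * continuous amps per stage * ~1 V
# core voltage. Stage counts here are assumed to match the table's figures.
def vrm_capacity_w(stages, amps_per_stage, vcore=1.0):
    return stages * amps_per_stage * vcore

zotac_core = vrm_capacity_w(24, 55)  # 1320 W core, as in the table
zotac_mem = vrm_capacity_w(4, 55)    # 220 W mem
```

It's a theoretical ceiling at continuous ratings, not what the card will ever actually draw, but it explains why the numbers differ so much between boards.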


----------



## Thebc2

yzonker said:


> Mine was only 2805. Doh.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> [Official] NVIDIA RTX 4090 Owner's Club
> 
> 
> Is anyone applying an undervolt to get the best power performance. I've managed 1850/1000mv with +1000 on ram.
> 
> 
> 
> 
> www.overclock.net


Is this how you checked it? Under the VF tuner? 













----------



## J7SC

BigMack70 said:


> Well dang...* I'm going to have to think more seriously about a CPU upgrade* than I thought.
> 
> Comparing data to the 12900K + 4090 results in this video: (...)
> 
> So yeah, the 9900K bottlenecks some games by a large margin. The uber demanding RT titles won't matter; they're GPU bound no matter what. And my 4k screen is 120Hz so no problems yet. But I also have the rig hooked up to a 5120x1440 240Hz screen and that will definitely be a lot more CPU limited than I expected with this card.
> 
> Hmmm... guess I'll have to put some thought into this, *when I expected it to be an easy "wait a few more years before upgrading" situation.*
> 
> The 4090 is crazy fast.


...that's the slippery slope I know so well ! My last three new systems started 'just' with the latest & greatest GPU plugged into an existing 'newish' system which otherwise worked perfectly well - but then the thoughts of _'it would run even faster with a new mobo / CPU / RAM' _ crept in .

I game on a 48'' 4K 120, and while CP '77 is not CPU limited, my other fav (FlightSim 2020) is, even w/ a well-running 5950X, as that sim is badly optimized...hoping to wait until Intel Meteor Lake / AM5 refresh, but as temps are forecast to finally drop here (much, much needed rain forecast for the weekend), the system update temptations will rise...first things first though: I need to integrate this air-cooled 4090 into my existing loop once the right kind of water-block is available.

With a few degrees colder this evening, did some more 3DM; 5950X near-maxed, looks like (ergo ^)


----------



## morph.

J7SC said:


> ...that's the slippery slope I know so well ! My last three new systems started 'just' with the latest & greatest GPU plugged into an existing 'newish' system which otherwise worked perfectly well - but then the thoughts of _'it would run even faster with a new mobo / CPU / RAM' _ crept in .
> 
> I game on a 48'' 4K 120, and while CP '77 is not CPU limited, my other fav (FlightSim 2020) is, even w/ a well-running 5950X, as that sim is badly optimized...hoping to wait until Intel Meteor Lake / AM5 refresh, but as temps are forecast to finally drop here (much, much needed rain forecast for the weekend), the system update temptations will rise...first things first though: I need to integrate this air-cooled 4090 into my existing loop once the right kind of water-block is available.
> 
> With a few degrees colder this evening, did some more 3DM; 5950X near-maxed, looks like (ergo ^)
> View attachment 2576606


Some big numbers... what ram specs are you running exactly. Also what's your normal TS score?


----------



## Azazil1190

J7SC said:


> ...that's the slippery slope I know so well ! My last three new systems started 'just' with the latest & greatest GPU plugged into an existing 'newish' system which otherwise worked perfectly well - but then the thoughts of _'it would run even faster with a new mobo / CPU / RAM' _ crept in .
> 
> I game on a 48'' 4K 120, and while CP '77 is not CPU limited, my other fav (FlightSim 2020) is, even w/ a well-running 5950X, as that sim is badly optimized...hoping to wait until Intel Meteor Lake / AM5 refresh, but as temps are forecast to finally drop here (much, much needed rain forecast for the weekend), the system update temptations will rise...first things first though: I need to integrate this air-cooled 4090 into my existing loop once the right kind of water-block is available.
> 
> With a few degrees colder this evening, did some more 3DM; 5950X near-maxed, looks like (ergo ^)
> View attachment 2576606


Super fast!
Too fast too furious


----------



## J7SC

Azazil1190 said:


> Super fast!
> Too fast too furious





morph. said:


> Some big numbers... what ram specs are you running exactly. Also what's your normal TS score?


Thanks gents ! 

...on the RAM question: IF/1900 DDR4 3800 CL14 14 14 ...haven't tried regular TS yet (my 6900XT might still beat the 4090 at that resolution )


----------



## Nizzen

Xavier233 said:


> Well anything above your screen refresh wont matter. Then, for anything below your refresh rate, like CyberPunk 2077, I dont think u will ever notice a difference between 79 and 82 FPS. Looks like for most games it wont matter much


Any fps above the refresh rate results in lower input lag and smoother gameplay in most games. More fps is always better 
For people who play at 4K 60Hz with vsync on in single-player games, it won't matter.


----------



## Emmanuel

J7SC said:


> ...that's the slippery slope I know so well ! My last three new systems started 'just' with the latest & greatest GPU plugged into an existing 'newish' system which otherwise worked perfectly well - but then the thoughts of _'it would run even faster with a new mobo / CPU / RAM' _ crept in .
> 
> I game on a 48'' 4K 120, and while CP '77 is not CPU limited, my other fav (FlightSim 2020) is, even w/ a well-running 5950X, as that sim is badly optimized...hoping to wait until Intel Meteor Lake / AM5 refresh, but as temps are forecast to finally drop here (much, much needed rain forecast for the weekend), the system update temptations will rise...first things first though: I need to integrate this air-cooled 4090 into my existing loop once the right kind of water-block is available.
> 
> With a few degrees colder this evening, did some more 3DM; 5950X near-maxed, looks like (ergo ^)
> View attachment 2576606


I am trying to stay off that slippery slope as I'll be upgrading to a 4090 whenever my Tuf preorder ships. Same here, MSFS 2020 is giving me some dumb ideas, like replace my perfectly tuned and stable system with a new Z790, 13900KS (when that comes out) and the fastest DDR5. But I feel like it's still a little early for DDR5 (latency wise), and the 13900KS won't be out for a while. I know the best course of action is to wait and let the new technology mature a bit.


----------



## Azazil1190

J7SC said:


> Thanks gents !
> 
> ...on the RAM question: IF/1900 DDR4 3800 CL14 14 14 ...haven't tried regular TS yet (my 6900XT might still beat the 4090 at that resolution )


Run a regular TS to see. I don't think so, but you can try.


----------



## ts2fe

I'm so torn on what card brand to get, because everyone keeps talking good and bad about each, and then people talk about the best power design and coil whine... Are there any solid recommendations for where I'm least likely to regret it?


----------



## Alemancio

ts2fe said:


> I'm so torn on what card brand to get because everyone keeps talking good & bad about each and then people talking about best power design and coil whine.. Is there any solid recommendations people have to where i'm the least likely to regret it?


No official list yet but this is what I've gathered:

1. Unless you're doing LN2, you do not need to look at Power Stages at all.
2. Nvidia's FE Card seems to get better Core and Mem binning.
3. Asus isn't the king anymore (as it was in the 3000 series), neither in cooling nor in build quality.
4. MSI and, overall, Gigabyte have picked up the slack and upped the quality.
5. For OC'ing, you need to win the silicon lottery.
6. Avoid Palit

I'm getting any brand from the following: MSI, Asus, Gigabyte, Nvidia


----------



## dr/owned

inedenimadam said:


> Gaming X Trio pcb shots. I was curious, so I took it apart. Thought you guys might like to see it too.
> 
> Edit to add: one of the easiest disassembly jobs I’ve done on a GPU in a while. One type of screw for entire backplate, screws on spring mount are captured, and then one more type for I/O cover.
> Might do a shunt mod later if bios issues prevent flashing


Did you get a tighter shot of the VRM component being used? Dude on LTT is quoting an ON Semi 55A DrMOS whereas the Suprim uses MCP. It would seem weird that they would maintain two supply lines for a card that shares a PCB but who knows. The LTT source is citing "CoolPC" and I can't find that info on their page for the card.

EDIT: nevermind confirmed


----------



## Nizzen

Alemancio said:


> No official list yet but this is what I've gathered:
> 
> 
> 3. Asus isnt the king (as for the 3090), not in cooling nor in build quality.


Haven't looked at many reviews, but I have the 4090 TUF OC. What card has better air cooling? The TUF cooling is very impressive, so if something is better, it must be pretty epic.


----------



## Alemancio

Nizzen said:


> Haven't looked at many reviews, but I have 4090 Tuf OC. What card has better aircooling cooling? Tuf cooling is very impressing, so someone is better, it must be pretty epic


I've seen several reviews; for cooling alone, these seem to be the tiers:

A+ Tier: Suprim Liquid (not worth the extra $$$ IMHO)
A Tier: Strix, TUF, Suprim, Gigabyte Gaming OC
B Tier: Nvidia FE (it's not bad at all, but the AIBs are slightly better)


----------



## EarlZ

Any feedback on the Gigabyte Aorus Master 4090? My retailer is going to get their stock this week, and it's strange that I have not seen a single review or teardown of this exact model.


----------



## Alemancio

EarlZ said:


> Any feedback on the Gigabyte Aorus Master 4090? My retailer is going to get their stocks this week and its strange that I have not seen 1 review/teardown information about this exact model.


I haven't seen anything about it; it seems to have been low on stock, thus no reviews?


----------



## LukeOverHere

ts2fe said:


> I'm so torn on what card brand to get because everyone keeps talking good & bad about each and then people talking about best power design and coil whine.. Is there any solid recommendations people have to where i'm the least likely to regret it?


To be honest, I can't really see much of a difference at the moment. I went with a Gainward Phantom as it was $2799 against an Asus Strix at $3799 lol... Am I really going to gain $1000 extra worth of performance with the Strix? Hell no... Every other card in Australia is $3300 upwards, so Gainward is really the only affordable card in Australia (at the moment), and all I see is great feedback. If you're going to buy a card just to run an OC curve, or simply push the power slider up, add a custom fan curve, and slap a few MHz on the core clock and memory, I don't think you will regret any brand at the moment. Especially once we start seeing BIOSes available, you can push it a bit further and still make great improvements. It might not be in the A-Tier, but are you going to be that sad about missing out on a few MHz at most and a 2 FPS difference?


----------



## Nd4spdvn

I am getting some good results here with the Suprim X and my little 5800X3D, though my card is not a good RAM clocker vs some crazy good numbers from FEs and some other AIBs. Holding the no. 4 spot in the HoF currently.








I scored 29 528 in Port Royal


AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11






----------



## StreaMRoLLeR

Xavier233 said:


> Whats the max temp this GPU is rated for b4 it starts throttling?


At 57°C it drops 1 bin.


----------



## Alberto_It

Nd4spdvn said:


> I am getting some good results here with the Suprim X and my little 5800X3D. Though my card is not a good RAM clocker vs some crazy good numbers from FEs and some other AIBs. Holding no4 spot in HoF currently.
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 528 in Port Royal
> 
> 
> AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


Can you explain your Afterburner settings to me? My Suprim X has the following settings:

Temp and Power Limit to max, +1500 RAM, +200 core, and voltage 100%, but my score is 28294.


----------



## GAN77

Streamroller said:


> at 57C 1 bin drops


What is the frequency drop step?


----------



## zware62

BigMack70 said:


> Mine is about the same except you got +20MHz on the core over my card - I think anything above +200 core offset is unstable for me.
> 
> I'm impressed this card has dual BIOS as well - not a feature I usually expect on the base SKU. May eventually play around with flashing an OC bios.


Is your card recognized by nvflash (v *5.763.0*)? I have the same card but it doesn't see it, so I can't dump/load any vbios!


----------



## renejr902

BigMack70 said:


> Well dang... I'm going to have to think more seriously about a CPU upgrade than I thought.
> 
> Comparing data to the 12900K + 4090 results in this video:
> 
> 
> Spoiler
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *His results are, with a tuned 12900K and stock 4090, the following at 4k:*
> Shadow of the Tomb Raider: 187 FPS
> Horizon Zero Dawn: 160 FPS
> Cyberpunk: 82 FPS
> Forza Horizon 5: 171 FPS
> 
> *My results, with a 5.2 GHz 9900KS and tuned DDR4-4000 memory, are the following at 4k at identical settings to his video (overclocked 4090 results in parenthesis):*
> SOTR: 178 FPS (190 OC)
> HZD: 139 FPS (144 OC)
> CP2077: 79 FPS (84 OC)
> FH5: 124 FPS (128 OC)
> 
> So yeah, the 9900K bottlenecks some games by a large margin. The uber demanding RT titles won't matter; they're GPU bound no matter what. And my 4k screen is 120Hz so no problems yet. But I also have the rig hooked up to a 5120x1440 240Hz screen and that will definitely be a lot more CPU limited than I expected with this card.
> 
> Hmmm... guess I'll have to put some thought into this, when I expected it to be an easy "wait a few more years before upgrading" situation.
> 
> The 4090 is crazy fast.


I will beat you lol. I'm buying an Asus 4090 Strix for an Intel i7 4790K CPU on an MSI Gaming 5 Z97 motherboard, but my 4790K is overclocked. There's not too much bottleneck with my EVGA 3090 FTW3 Ultra, and some benchmarks showed a 4770K with a 3090 was not too bad, but with a 4090 it will be funny. I've really tried to find and buy a 4090 Strix in a shop like Canada Computers or online, but I haven't succeeded yet.

I will upgrade my motherboard and CPU in January.

I play on a Samsung QN90A TV, 50" 4K 120Hz, G-Sync compatible. I only play in 4K, and 120Hz is the maximum, so any higher fps won't matter much.

By the way, I chose the Asus Strix because I want the best video card build quality possible. I want to keep it for 5-6 years and don't want it to die too early. I suppose the Asus Strix is the best, or one of the best, in build quality? Am I right?

(Yes, I will overclock it, but I won't play with voltage this time. I don't want it to die too early; I really want to keep it for a long time.)

Last thing: do you think Nvidia could release a 4090 Ti in January or before? Is that possible if AMD releases some strong GPUs on November 3rd? I could have waited until January and upgraded everything at the same time. Worst case, maybe wait until March or April to upgrade, but not later. Is a 4090 Ti release in April or before still possible? Thanks for the answers, guys.


----------



## Nd4spdvn

Alberto_It said:


> Can you explain me your Afterbuner settings? My Suprim X have the following settings
> 
> Temp and Power Limit to max, +1500 ram, +200 core and voltage 100% but my score Is 28294


Sure, mine for the run were: Temp and Power Limit to max, +1024 ram, +165 core and voltage 100%, fans blasting at 100%


----------



## Carillo

J7SC said:


> ...that's the slippery slope I know so well ! My last three new systems started 'just' with the latest & greatest GPU plugged into an existing 'newish' system which otherwise worked perfectly well - but then the thoughts of _'it would run even faster with a new mobo / CPU / RAM' _ crept in .
> 
> I game on a 48'' 4K 120, and while CP '77 is not CPU limited, my other fav (FlightSim 2020) is, even w/ a well-running 5950X, as that sim is badly optimized...hoping to wait until Intel Meteor Lake / AM5 refresh, but as temps are forecast to finally drop here (much, much needed rain forecast for the weekend), the system update temptations will rise...first things first though: I need to integrate this air-cooled 4090 into my existing loop once the right kind of water-block is available.
> 
> With a few degrees colder this evening, did some more 3DM; 5950X near-maxed, looks like (ergo ^)
> View attachment 2576606


This is what I wake up to? I hoped this day would be Port Royal-free 😂 Nice score!


----------



## Carillo

Nd4spdvn said:


> I am getting some good results here with the Suprim X and my little 5800X3D. Though my card is not a good RAM clocker vs some crazy good numbers from FEs and some other AIBs. Holding no4 spot in HoF currently.
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 528 in Port Royal
> 
> 
> AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


Great score. I'm 100% sure my 12600K is holding back my 4090


----------



## RaMsiTo

Nd4spdvn said:


> I am getting some good results here with the Suprim X and my little 5800X3D. Though my card is not a good RAM clocker vs some crazy good numbers from FEs and some other AIBs. Holding no4 spot in HoF currently.
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 528 in Port Royal
> 
> 
> AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


I'm next to you: better frequencies but an old 9900K 🫡. Try Time Spy Extreme.









I scored 29 509 in Port Royal


Intel Core i9-9900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11






----------



## Nd4spdvn

RaMsiTo said:


> I'm next to you, better frequencies but an old 9900k, 🫡, try time spy extreme .


I know, was eyeing your result for a few days now, felt kinda good to beat your score... sorry.


----------



## RaMsiTo

Nd4spdvn said:


> I know, was eyeing your result for a few days now, felt kinda good to beat your score... sorry.


no problem, the winter is coming 😂.


----------



## Zogge

Palit GameRock OC stock: 1.1V = 2878MHz


----------



## yt93900

Shouldn't that be 1.05V at stock?


----------



## Nico67

GAN77 said:


> What is the frequency drop step?


15MHz per bin; no idea where the drops start from though, probably in the 20°C range. Wait for waterblocks and chillers, or LN2 streams, to see the temp points. Probably 60-75MHz+ to gain if you can drop your temps by enough.
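A toy model of that binning behavior, purely illustrative (the 15MHz step matches what people are reporting, but the threshold temps below are made-up placeholders, since nobody has mapped the real breakpoints yet):

```python
# Toy GPU-Boost-style thermal binning: one 15 MHz bin lost per temperature
# threshold crossed. Threshold start and spacing here are placeholder guesses.
BIN_MHZ = 15

def boosted_clock(max_clock_mhz, temp_c, first_threshold_c=20, step_c=5):
    if temp_c < first_threshold_c:
        return max_clock_mhz
    bins_dropped = (temp_c - first_threshold_c) // step_c + 1
    return max_clock_mhz - BIN_MHZ * bins_dropped
```

Under those assumed 5°C steps, holding the core 20-25°C colder claws back 4-5 bins, i.e. the 60-75MHz mentioned above.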


----------



## Zogge

I just checked the curve in Afterburner at stock settings. 1.05V is 2805MHz.


----------



## Nd4spdvn

Zogge said:


> Palit Gamerock OC stock 1.1V = 2878MHz


I was getting 2970MHz at stock 1.05V with the Suprim X.


----------



## Xavier233

Anyone know the max rated temp for this GPU before it throttles?


----------



## jl434

J7SC said:


> Thanks. I asked because my Gaming OC actually has an Aorus (-only) warrant card. At the end of the day, doing a custom water-block is the way to go for most models, IMO, unless it is 'only' for gaming and the occasional benching.


My Gaming OC also came with the Aorus warranty card, and it can be registered for a 4-year warranty via GIGABYTE ∣ AORUS Product Registration.


----------



## sblantipodi

All of our boot loop problems are solved; more info here:








RTX4090 and boot loop


Yeah I had a feeling it was power related as well, as disabling PCIe power saving in my Asrock bios worked for me as I reported yesterday. I no longer have issues. So yes. I set prefer maximum performance in Nvidia control panel and it’s working fine. Re-bar was causing CTD. Honestly can’t...




www.overclock.net





I have another small issue.

I have an MSI Suprim X, if I use the normal profile the card uses a power limit of 100% and can be set up to 115% like it should be.

If I use the gaming profile the card uses a power limit of 93% and can be set up to 108%...

Any idea?


----------



## Nd4spdvn

sblantipodi said:


> I have an MSI Suprim X, if I use the normal profile the card uses a power limit of 100% and can be set up to 115% like it should be.
> 
> If I use the gaming profile the card uses a power limit of 93% and can be set up to 108%...


Same thing with my Suprim X too, so I guess this is by design... We do need a 600W BIOS on the 4090 with the most overbuilt power delivery out there...


----------



## Nd4spdvn

And I also have an issue, related to G-Sync on my LG CX OLED. It looks like I only get VRR mode when I engage G-Sync (fullscreen only, or fullscreen + windowed), whereas on the 3080 Ti it worked perfectly fine in G-Sync mode with the TV reporting G-Sync. Am I the only one with this? Thinking that something is fishy with the latest driver; I'm also seeing some inconsistencies with HDR which I still need to look into.


----------



## sblantipodi

Nd4spdvn said:


> Same thing with my Suprim X too so I guess this is by design... We do need a 600W BIOS on the most power delivery overbuilt 4090 out there....


But why does the gaming BIOS only let me set a 108% power limit?
Isn't the gaming BIOS supposed to let me raise the power limit higher than the standard BIOS?


----------



## BigMack70

J7SC said:


> ...that's the slippery slope I know so well ! My last three new systems started 'just' with the latest & greatest GPU plugged into an existing 'newish' system which otherwise worked perfectly well - but then the thoughts of _'it would run even faster with a new mobo / CPU / RAM' _ crept in .


Yup, I am familiar with it as well... this 9900KS was purchased because my 5930K wasn't really up to the task in some games when the 2080 Ti launched.

I've gotta be honest, though: I am not very impressed by either AMD's or Intel's current DDR5 offerings. The 12900K would be an easy upgrade if all I needed were a new CPU, but I'm not yet convinced that it's worth a full platform swap. And I'm not hearing good things about the 13900K.

Maybe I'll just keep an eye on the used market and see if I can pick something up at a discount until AMD/Intel get their act together and release a new CPU that's actually better than what we've had for years, without just cranking power and thermals through the roof.

My 9900KS looks good for 120-144 fps gaming. What I need is a CPU that can drive 240 fps gaming, and neither AMD nor Intel has made that CPU yet.


----------



## BluePaint

sblantipodi said:


> But why does the gaming BIOS only let me set a 108% power limit?
> Isn't the gaming BIOS supposed to let me raise the power limit higher than the standard BIOS?


Because the base power target (100%) is higher with the gaming BIOS. The max power target is the same for both.
You can see that when you compare the BIOS info on TechPowerUp's site:
Silent: 450W base +16% ≈ 520W max (MSI RTX 4090 Silent VBIOS)
Gaming: 480W base +8% ≈ 520W max (MSI RTX 4090 Gaming VBIOS)
The idea is that both power targets can be raised manually to the same level, but out of the box the silent BIOS has the lower one.
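The arithmetic behind those percentages can be sketched in a few lines (a minimal illustration; the 450W/480W bases and offsets are the TechPowerUp figures quoted above, and the listed maximums are rounded):

```python
def max_power_watts(base_w: float, max_offset_pct: float) -> float:
    """Absolute power target when the limit slider is maxed out."""
    return base_w * (1 + max_offset_pct / 100)

# MSI RTX 4090 Suprim X, per the TechPowerUp BIOS entries quoted above
silent_max = max_power_watts(450, 16)  # 522.0 W, listed as ~520 W
gaming_max = max_power_watts(480, 8)   # 518.4 W, listed as ~520 W
```

So both sliders top out at roughly the same absolute wattage; the two BIOSes just start from different 100% baselines.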


----------



## Azulath




----------



## WayWayUp

no restocks yet at retail?


----------



## BigMack70

WayWayUp said:


> no restocks yet at retail?


I think there have been some, but you will almost certainly need a stock tracker app or discord open to notify you when stock drops so you can get it before it sells out.


----------



## jim2point0

AngryLobster said:


> How are you guys cool with dumping 500w+ heat into your room? Gaming with this thing OC'd balls to the walls raises my room temp significantly. I've gone down to 2700mhz at 0.925v for stock performance but only 360w draw in worst case and usually quite a bit lower than that.
> 
> 140w+ for 2.5FPS and sauna, no thanks.


It's starting to get cold here. I work from home, and sometimes I use a space heater in my room to warm up so I don't have to waste oil heating the entire house. That space heater is 1500w. So a 500w video card is nothing 




BigMack70 said:


> I think there have been some, but you will almost certainly need a stock tracker app or discord open to notify you when stock drops so you can get it before it sells out.


Do you have some recommendations?


----------



## ZealotKi11er

How are you guys getting such high PR scores? I can't even get 27.3K. I've never done PR benchmarking before, especially with an Nvidia GPU.


----------



## Carillo

AngryLobster said:


> How are you guys cool with dumping 500w+ heat into your room? Gaming with this thing OC'd balls to the walls raises my room temp significantly. I've gone down to 2700mhz at 0.925v for stock performance but only 360w draw in worst case and usually quite a bit lower than that.
> 
> 140w+ for 2.5FPS and sauna, no thanks.


My TUF is 600w, and I’m actually still cold. So I ordered the Elmor EVC and some shunts 🤷‍♂️


----------



## BigMack70

ZealotKi11er said:


> How are you guys getting such high PR scores? I can't even get 27.3K. I've never done PR benchmarking before, especially with an Nvidia GPU.


27.3k is my max PR score. Runs average 27k-27.3k for me. That's with +1500 memory offset and an average clock speed through the run of just over 2900 MHz (+200 core offset).


----------



## Nizzen

4090 tuf oc is pretty good


----------



## lordkahless

Apparently, for people wanting to get a 4090 FE through the GFE Priority Access program, there is a bug in the Nvidia account preferences that keeps reverting your location from whatever country you are in to blank. I checked my account and have had to change it back to USA several times; only hours later it reverts to blank. No idea if that is preventing me from receiving a link. I saw somebody point this out in another forum. Either way, it's pretty annoying.


----------



## stefxyz

Is there an aftermarket 90-degree adapter for the 12V connector? No way I can close my Lian Li O11 XXL with the Gigabyte Gaming OC. The side panel has no space, and this won't improve with a block…


----------



## Dragonsyph

MikeGR7 said:


> Chill man, he just said that the gains are minimal around 4-5% and it is true.
> You need to pay attention to the test conditions, his 12900KS runs all core 5.3 and the 13900K was 5.5.
> 
> Many games stop scaling with more Hz and even some cpu architectures have a point of diminishing returns after some Hz.
> 
> He did say that if his 12900K/KS (we all know it's the same) was @stock then the 13900K whould have a lot bigger percentage improvement.
> 
> As for his tuning abilities, good or not he aplies the same on all models so it's comparable.
> 
> I have never caught him do sneaky settings in his comparisons, if you have proof then point me to it.
> 
> I have no idea about the asian guy, i am talking about Frame Chasers.
> 
> To not like someones face is irrelevant.


Just watch the video you guys posted; around 10:30 or so he shows the game settings for the 13900K and then the 12900K. They are different settings, higher on the 13900K. That's all I'm saying, lol. The CPU might be slower or faster; I'm not questioning that. Just pointing out the different test conditions. Have a great day.


----------



## changboy

CableMod Makes NVIDIA GeForce RTX 4090 Installation Easier With Its 90-Degree Angled 16-Pin Connector


CableMod has heard consumers and is going to offer a 90-degree angled 16-pin connector to make installing the NVIDIA GeForce RTX 4090 easier.




wccftech.com


----------



## WayWayUp

Nizzen said:


> 4090 tuf oc is pretty good


I see it was a custom run, so that score is irrelevant without context.


----------



## RaMsiTo

Nizzen said:


> good


mod? Ln2?


----------



## Nizzen

WayWayUp said:


> I see it was a custom run, so that score is irrelevant without context.


Normal run, but maybe too fast without internet 

Modded yes


----------



## Dragonsyph

Anyone have a chart of VRM max temps for each card at stock? Or all the max hot spots?


----------



## BigMack70

Hmmm... my 9900k can't do 120fps in Elden Ring... I might have to cave and get a 12900k or 13900k. 💸💸


----------



## Rbk_3

Can the CPU affect PR scores at all? I can only manage about 27k with 3000 MHz and +1000 mem (can't go higher on memory without artifacting).

Have a temporary 12400 till I can get my 13900K.


----------



## bottjeremy

So... Rise of the Tomb Raider at max settings is an ultimate stress test for my 4090. Try it with all settings maxed (you have to set some of them manually), SSAA 4x, and no DLSS, and let me know what your findings are. It hits my GPU as hard as Cyberpunk. I think if my card could do over 500W, the game would use it. I had a 165 Hz cap set during the video.


----------



## Xavier233

Hi, for those with the Gigabyte Gaming OC: what is the minimum fan speed (in RPM or percentage) at which you can run the fans steadily (without them dropping to zero RPM)?

When I set mine to 45% (around 850 RPM), the fans continuously drop to zero, then ramp up to 45%, then back to zero, in a loop. Is that normal?


----------



## Hawk777th

Looking to grab a 4090 ASAP. I have always gone with FE cards and Titans. Should I just go with the FE, or is the Asus or MSI a better card this time around? I have heard the FE is binned?


----------



## bottjeremy

Hawk777th said:


> Looking to grab a 4090 Asap. I have always gone with FE cards on my Titans? Should I just go with the FE or is the Asus or MSi a better card this time around? I have heard the FE binned?


They are all good. You never know which bin you are going to get, TBH. It's a lottery. Get the card that looks good in your case for a price you are willing to pay.


----------



## BigMack70

Hawk777th said:


> Looking to grab a 4090 Asap. I have always gone with FE cards on my Titans? Should I just go with the FE or is the Asus or MSi a better card this time around? I have heard the FE binned?


I don't think the FE is binned. If you're going to be on air, I don't think it really matters which model you get; all the coolers are over-designed so they're all going to be cool & quiet. I'd say get the cheapest 4090 you can find in stock. Seems like pretty much all models are going to do 2900-3050 MHz for +4-7% performance.


----------



## Glottis

Joining the club with a TUF OC. Using the official Corsair 16-pin cable. Perfect fit inside the Fractal Torrent, plenty of space left on all sides, and the built-in GPU support bracket is awesome. One of these days I'll get the black version of the D15... but for now, time to enjoy this thing.


----------



## dante`afk

I just picked up my FE from Best Buy Union NJ; they have one more in stock if anyone lives nearby.


----------



## Xavier233

dante`afk said:


> I just picked up my FE from Best Buy Union NJ; they have one more in stock if anyone lives nearby.


We have the same mobo and CPU. Are you running your RAM kit stock or OCed to 6400/CL30, please? (What are the original kit specs?)


----------



## Rbk_3

What are you guys using for support brackets? The Gigabyte Gaming OC bracket doesn't work with my MSI Z690 unfortunately.


----------



## Xavier233

Rbk_3 said:


> What are you guys using for support brackets? The Gigabyte Gaming OC bracket doesn't work with my MSI Z690 unfortunately.


I had one from my previous GPU, re-used it on the 4090 as well: https://www.amazon.ca/Graphics-Supp...gifQ==&sprefix=GPU+support+bra,aps,102&sr=8-9


----------



## bmagnien

dante`afk said:


> I just picked up my FE from Best Buy Union NJ, they have one more in stock of anyone lives nearby


Sent you a pm


----------



## ThinbinJim

RaMsiTo said:


> Curve by default 2940 mhz.
> View attachment 2576138


Really strong bin on that chip. My Strix OC is 90 MHz slower than yours at 1100 mV.


----------



## Benjit

Rbk_3 said:


> What are you guys using for support brackets? The Gigabyte Gaming OC bracket doesn't work with my MSI Z690 unfortunately.


For £7 I got this; it just fits without having to extend it, and it lifts the GPU by 2-3 mm. It's magnetic on the lower part, so it holds onto the metal, with a rubber section on the top part.









There wasn't much sag anyway but thought better to be safe.

Graphics Card GPU Brace Support, Black Universal Video Card Holder Bracket for Various Computers https://amzn.eu/d/ceyllER


----------



## MikeGR7

Dragonsyph said:


> Just watch the video you guys posted, around like 10:30 or so he shows game settings for 13900k and then 12900k. They are different settings. Higher on the 13900k. Thats all im saying LOL. The cpu might be slower or faster, im not question that. Just pointing out different test results. Have a great day.


Yes, I saw it now. Nothing major; I don't think he did it on purpose, but still, thank you for pointing it out. I'll keep a close eye on his comparisons from now on.


----------



## kx11




----------



## mirkendargen

Nizzen said:


> 4090 tuf oc is pretty good
> View attachment 2576654


Is this like...volt mod+shunt mod+open case outside in Norway at night? Lol.


----------



## J7SC

Carillo said:


> This is what i wake up to ? I hoped this day would be Port Royal-Free 😂 Nice score!


...just a few degrees lower ambient, same settings. But at 550W++ my card is running 62C (hotspot 88C+) while other folks are in the low 40s and 50s for GPU temp - not a secret in benching; the answer really is a water-block + true fall/winter temps.


----------



## J7SC

...sorry for the sequential posts; starting back several pages; lots of catching up to do. 



Nizzen said:


> Normal run, but maybe too fast without internet
> 
> Modded yes
> View attachment 2576658


...any info on temps (general, Hotspot, VRAM) ?


----------



## Nizzen

J7SC said:


> ...sorry for the sequential posts; starting back several pages; lots of catching up to do.
> 
> 
> 
> ...any info on temps (general, Hotspot, VRAM) ?


This friday LOL


----------



## kx11




----------



## sblantipodi

my Suprim X is not that bad.


----------



## Xavier233

Anyone know the highest safe daily temps this GPU can run at? At what temp will it start being a problem or throttling?


----------



## Benni231990

So can anybody confirm a successful flash? Or can't we flash at this point, and must wait for an nvflash update?


----------



## dante`afk

Xavier233 said:


> We have the same mobo and CPU, are you running your ram kit stock or OCed to 6400/CL30 please (what is the original kit specs)?


My sig is not updated; I'm on an MSI Unify now, and the RAM runs at stock settings, 6400 (TeamGroup).


Pretty happy with my FE; it does +250/+1500 on air.


----------



## GTANY

+250 on air = what real frequency does that work out to?


----------



## bmagnien

Anyone seen a Gigabyte OC block, other than the upcoming EKWB option?


----------



## Baasha

dante`afk said:


> my sig is not updated, i'm on a msi unify now and ram runs on stock settings 6400 (teamgroup).
> 
> 
> pretty happy with my FE, does 250/1500 on air.
> 
> View attachment 2576697


Nice man. Just curious - did you "reserve" your 4090 FE through the GFE link or did you just walk into BB and purchase it?


----------



## Mad Pistol

These things need to go on a diet. Pictures do them no justice... they are HUGE!!!

(3080 FE on left)


----------



## yzonker

Looks like MSI is trying to cash in on early sales after all. 






MSI GeForce RTX 4090 SUPRIM LIQUID X 24G - MSI-US Official Store


GeForce RTX 4090 SUPRIM LIQUID X 24G




us-store.msi.com


----------



## mirkendargen

Mad Pistol said:


> These things need to go on a diet. Pictures do them no justice... they are HUGE!!!
> 
> (3080 FE on left)
> 
> View attachment 2576706


It is indeed huge, but honestly lighter than I was expecting. Significantly lighter than my blocked 3090 is.


----------



## Mad Pistol

In the 3DMark ray tracing feature test, the 4090's score is nearly 3x the 3080's. Nvidia gave the 4090 some serious RT performance.


----------



## Jimshown LMHF

Hello everyone.
I've read a little of this thread; it's not easy being French and bad at English.
I came here because I can't understand the logic of the Port Royal scores.
I tested several 4090s: Gigabyte Gaming OC, PNY XLR8, Asus TUF, TUF OC, MSI Suprim X. And my observation is that there is a problem.
I tested a lot of things: forced ReBAR, several OSes. But no, I never really exceed 28k2, even with a bench OS, a Z690 UX with various BIOSes, a fully OCed 12900KS + DDR5; I even tried DDR5 A-die, and nothing worked.
I see people with lower core and memory clocks approaching or exceeding 29k.
Does anyone have the explanation? I can't find the solution. Thanks for reading.


----------



## dante`afk

Baasha said:


> Nice man. Just curious - did you "reserve" your 4090 FE through the GFE link or did you just walk into BB and purchase it?


Yes, I had it reserved; a day after the news came out I got the link in my GFE.

I never used GFE daily, only occasionally when I was testing some things. Once the news came out I updated it and let it run for a day, and got the link. However, I did not get an email; I just went to Account > Redeem and saw it.


----------



## mirkendargen

Jimshown LMHF said:


> Hello everyone.
> I read a little all this thread, not easy being French and bad in English.
> I came here because I can't understand the logic of the Port Royal scores.
> I tested several 4090s: Gigabyte Gaming OC, PNY XLR8, Asus TUF, TUF OC, MSI Suprim X. And my observation is that there is a problem.
> I tested a lot of things, forced rebar, several OS, But no, I never really exceed 28k2, with bench OS, Z690 UX with some bios, full Oc 12900ks + ddr5, I even tried the ddr5 A-die, nothing done there.
> I see people with low core and memory clocks but approaching or exceeding 29k.
> Does anyone have the explanation here? I can't find the solution. Thanks for reading me.


Only 35 people in the world are scoring better than 28k2, and at least some of them are using LN2/exotic cooling/volt modding/shunt modding. I don't think you're doing anything wrong; a few people are lucky and have slightly better cards, and a few are modding cards and using exotic cooling.


----------



## yzonker

Jimshown LMHF said:


> Hello everyone.
> I read a little all this thread, not easy being French and bad in English.
> I came here because I can't understand the logic of the Port Royal scores.
> I tested several 4090s: Gigabyte Gaming OC, PNY XLR8, Asus TUF, TUF OC, MSI Suprim X. And my observation is that there is a problem.
> I tested a lot of things, forced rebar, several OS, But no, I never really exceed 28k2, with bench OS, Z690 UX with some bios, full Oc 12900ks + ddr5, I even tried the ddr5 A-die, nothing done there.
> I see people with low core and memory clocks but approaching or exceeding 29k.
> Does anyone have the explanation here? I can't find the solution. Thanks for reading me.


What was your average temp for those runs?


----------



## Jimshown LMHF

yzonker said:


> What was your average temp for those runs?


Around 55-60c Gpu. Max 57 for my last run.


----------



## Jimshown LMHF

mirkendargen said:


> Only 35 people in the world are getting better than 28k2, and at least some of them are using ln2/exotic cooling/volt modding/shunt modding. I don't think you're doing anything wrong, just a few people are lucky and have slightly better cards and a few people are modding cards and using exotic cooling.


I'm trying to understand the logic when I see these two high scores with low clocks (no mod, no LN2...):








I scored 29 662 in Port Royal


Intel Core i9-12900KS Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10}




www.3dmark.com












I scored 28 249 in Port Royal


AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com


----------



## Xavier233

For reference and for anyone reading this: the max rated temp for the GPU is 90C


----------



## yzonker

Jimshown LMHF said:


> I'm trying to understand the logic when I see these high 2 scores with low clock (no mod, no ln2...) :
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 662 in Port Royal
> 
> 
> Intel Core i9-12900KS Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 249 in Port Royal
> 
> 
> AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


Well, one possibility is that effective clocks are relatively high for those runs due to the way they set the V/F curve (effective clock is not what 3DMark reports).

And temp matters a lot. You're somewhat warmer, though not enough to account for the difference.

And then full system tuning does have some impact on the scores, which can result in higher scores at lower GPU clocks. You're doing a lot of that already, but there are probably more tweaks.
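The reported-vs-effective gap is easy to quantify from a monitoring log. A minimal sketch (the MHz samples below are made up for illustration; in practice you'd average the effective-clock column from a HWiNFO log of the run):

```python
# Reported clock is the requested V/F point; the effective clock is what
# the silicon actually averages after clock gating and idle cycles.
reported_mhz = 3015
effective_samples = [2964, 2971, 2958, 2980, 2967, 2973]  # hypothetical log

avg_effective = sum(effective_samples) / len(effective_samples)
gap = reported_mhz - avg_effective
print(f"avg effective: {avg_effective:.0f} MHz ({gap:.0f} MHz below reported)")
```

Two cards reporting the same offset can differ by tens of MHz in what they actually run, which is enough to explain a few-hundred-point PR spread.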


----------



## xcx xcxvgyt

The second one is my score. My card is power-limited to 450W; that's why the core clock is that low. It's unable to boost higher in Port Royal.

I need to save as much power as possible to leave room for higher frame rates. Otherwise I end up at 3100 MHz core at 450W, but with a lower frame rate.

So, at +0 mV and the highest stable frequency, it settled around 2880/2850 MHz.

Here is the better one: I scored 28,916 in Port Royal.

I believe that when I can BIOS-mod and increase the power limit to 500W, my score will be much better. We will see.

Plus, I believe the CPU also matters, at least for 1 or 2 scenes.


----------



## Xavier233

I'm not sure why everyone is caught up on scores. This card is meant mostly for gaming, or similar graphics tasks. The rest is pure lottery this generation, so spending more does not equal better GPU performance or better scores. I bet there are many FEs that would score better than some Strix cards which cost $500 more after taxes. That makes chasing the highest score irrelevant, at least for this generation.


----------



## J7SC

bmagnien said:


> Anyone seen a Gigabyte OC block, other than the upcoming EKWB option?


...not yet...I keep checking for Phanteks (my 1st preference), Bykski or Alphacool, but so far haven't seen any of those list 4090 Gigabyte Gaming OC (and related models).


----------



## Mad Pistol

Oh my freakin god... Control 4K maxed without DLSS is smooth as butter!!! The 4090 is a certified beast!!!

Nvidia has finally cracked RT performance.


----------



## Tideman

Very surprised by the idle fan stop temp of my TUF. Doesn't go over 35C while my 3090 Ti FTW3 idled mid to high 40s.

Not that I'll use it anyway, I like it on lowest speed for idle.

From what little testing I've done so far, load temps don't exceed 60C at 75% fan speed and +180 (3015mhz).

As I mentioned already though, in another thread, coil whine is pretty bad.


----------



## LuckyImperial

yzonker said:


> Looks like MSI is trying to cash in on early sales after all.
> 
> 
> 
> 
> 
> 
> MSI GeForce RTX 4090 SUPRIM LIQUID X 24G - MSI-US Official Store
> 
> 
> GeForce RTX 4090 SUPRIM LIQUID X 24G
> 
> 
> 
> 
> us-store.msi.com


Dude, if this revised MSRP is passed to their distributors, I'm going to be livid. It's the only card that will work in my SFF build, and I tried so hard to get one at launch with no luck. The cherry on top has been a few people saying availability is widespread in the US... This markup proves that statement is pure b*****.


----------



## Carillo

Xavier233 said:


> Am not sure why is everyone caught up on scores. This card is meant for mostly gaming, or some similar graphic tasks. The rest is pure lottery for this generation, so spending more does not equal better GPU performance or better scores. I bet there are many FE that would score better than some Strix cards which are $500 more after taxes. Which makes obtaining the highest score irrelevant, at least for this generation.


You know this forum is called OCN? Of course scores matter. People spend hours tuning both the OS and hardware to get those extra points. Yes, pointless for gaming, but we all have competitiveness inside us when new hardware is released. And of course it's frustrating to see people with lower clocks beat your scores without understanding why. But for those who wonder: compare the reported clock and the effective clock on your card, find the point where memory error correction starts, and tune down from there. Port Royal also relies heavily on memory tuning and CPU cache. My 12600K is holding back my 4090 HARD, both in 4K gaming and in PR, even at 5.2 GHz all-core with 7000 C32 memory.
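That "find where error correction starts" advice boils down to a sweep: GDDR6X keeps running past its stable offset but silently retries, so the score drops instead of the run crashing. A minimal sketch with made-up numbers (in practice each entry would come from an actual Port Royal run at that memory offset):

```python
# Hypothetical Port Royal scores per memory offset (MHz). Past the sweet
# spot the run still completes, but error correction eats frames.
scores_by_offset = {
    1200: 27850,
    1300: 27990,
    1400: 28070,  # best score
    1500: 27940,  # "stable", yet slower: error correction has kicked in
}

# Tune down to the offset with the best score, not the highest that passes.
best_offset = max(scores_by_offset, key=scores_by_offset.get)
print(best_offset)  # 1400
```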


----------



## Carillo

Tideman said:


> Very surprised by the idle fan stop temp of my TUF. Doesn't go over 35C while my 3090 Ti FTW3 idled mid to high 40s.
> 
> Not that I'll use it anyway, I like it on lowest speed for idle.
> 
> From what little testing I've done so far, load temps don't exceed 60C at 75% fan speed and +180 (3015mhz).
> 
> As I mentioned already though, in another thread, coil whine is pretty bad.


Same with my TUF. Just noticed it today while gaming; my fans have been spinning at 100% since I got the card. But seriously, the whine is bad.


----------



## ZealotKi11er

OCN is a terrible place for benchmarking. Everyone is trying to be secretive. Nothing like the German websites.


----------



## schoolofmonkey

So what's everyone's opinion of the Galax 4090 SG? I'm mostly curious about the thermals.


----------



## Tideman

Carillo said:


> Same with my TUF. Just noticed it today while gaming, my fans have been spinning on 100% since i got the card  But seriously , the whine is bad.


Yup worst I've heard but maybe I've just been lucky with it in the past. Thankfully I use headphones for gaming!


----------



## J7SC

Carillo said:


> You know this forum is called OCN? Of course scores matter. People spend hours tuning both the OS and hardware to get those extra points. Yes, pointless for gaming, but we all have competitiveness inside us when new hardware is released. And of course it's frustrating to see people with lower clocks beat your scores without understanding why. But for those who wonder: compare the reported clock and the effective clock on your card, find the point where memory error correction starts, and tune down from there. Port Royal also relies heavily on memory tuning and CPU cache. My 12600K is holding back my 4090 HARD, both in 4K gaming and in PR, even at 5.2 GHz all-core with 7000 C32 memory.


Well put!  ...We seem to go through this every single time a new GPU family is released, and it is par for the course.

I find it helpful to push a card to and slightly beyond its limits with 3DM and other benchies so I know where those limits are for my specific sample - and it is getting more complicated than in the 'old days', since the latest boost algorithms have more complex sensor inputs and limits to contend with.

Specifically, on my Gigabyte Gaming OC I now know the 'best efficiency range' on VRAM is somewhere between +1390 and +1440ish (also depending to some extent on the specific test). I can clock the VRAM up to +1500 w/o crashing, but clearly with some efficiency losses. On GPU MHz, I know that under lighter loads it is stable past 3100 MHz, but the real issue is heat once past 500W. My hotspot delta is as much as 30C within seconds of a heavier load, and my 3DM scores such as Port Royal show much higher temp numbers than many folks within the same scoring range. That is not an excuse, but simple preparation for water-cooling this card, as I do with all my CPUs and GPUs. The loop (1320x63 rads) is ready for it now; I just need the block. The only thing I don't know yet is how to deal with the heat concentration of 4 nm compared to 7 nm or 8 nm - stick with Gelid GC Extreme (the thick, sticky stuff I love), or even try liquid metal... the VRAM can look forward to thermal putty again...


----------



## Rei86

ZealotKi11er said:


> OCN is a terrible place for benchmarking. Everyone is trying to be secretive. Nothing like the German websites.


LOL, like all the Nvidia settings and other options and crap to optimize their runs?
Naive little me thought it was just OCing the card and running it; boy, was I wrong.


----------



## bmagnien

J7SC said:


> Well put !  ...We seem to go through this every single time when a new GPU family is released, and it is par for the course.
> 
> I find it helpful to push a card to and slightly beyond its limits with 3DM and other benchies so I know where those limits are for my specific sample - and it is getting more complicated than in the 'old days', since the latest boost algorithms have more complex sensor inputs and limits to content with.
> 
> Specifically on my Gigabyte Gaming OC, I now know the 'best efficiency range' on VRAM is somewhere between +1390 and +1440ish (also depending to some extent on the specific test). I can clock VRAM up to 1500 w/o crashing, but clearly have some efficiency losses then. On the GPU MHz, I know under lighter conditions it is stable to past 3100 MHz, but the real issue is heat once past 500W. My Hotspot delta is as much as 30 C within seconds of a higher load and my 3DM scores such as PortRoyal show much higher temp numbers than many folks within the same scoring range. That is not an excuse, but simple preparation for water-cooling this card, as I do with all my CPUs and GPUs. The loop (1320x63 rads) is ready for it now, just need the block. The only thing I don't know yet is how to deal with the heat concentration of 4 nm compared to 7 nm or 8 nm - stick with Gelid GC Extreme (the thick, sticky stuff I love), or even try liquid metal...VRAM can look forward to thermal putty again...


Not sure why folks are opposed to LM on GPU dies. It's been a no-brainer for me, and the ratio of improved performance vs. difficulty/risk is worth it. Just throw a few layers of Kapton tape down over the SMDs around the die and you're good to go.


----------



## dr/owned

bmagnien said:


> Not sure why folks are opposed to LM on GPU dies. It’s been a no brainer for me, and the ratio of improved performance vs difficulty/risk is worth it to me. Just throw a few layers of kapton tape down over the smds around the die and you’re good to go.


I’ve had LM on my 3090 since launch without issue or ever reapplying it.


----------



## Carillo

J7SC said:


> Well put !  ...We seem to go through this every single time when a new GPU family is released, and it is par for the course.
> 
> I find it helpful to push a card to and slightly beyond its limits with 3DM and other benchies so I know where those limits are for my specific sample - and it is getting more complicated than in the 'old days', since the latest boost algorithms have more complex sensor inputs and limits to content with.
> 
> Specifically on my Gigabyte Gaming OC, I now know the 'best efficiency range' on VRAM is somewhere between +1390 and +1440ish (also depending to some extent on the specific test). I can clock VRAM up to 1500 w/o crashing, but clearly have some efficiency losses then. On the GPU MHz, I know under lighter conditions it is stable to past 3100 MHz, but the real issue is heat once past 500W. My Hotspot delta is as much as 30 C within seconds of a higher load and my 3DM scores such as PortRoyal show much higher temp numbers than many folks within the same scoring range. That is not an excuse, but simple preparation for water-cooling this card, as I do with all my CPUs and GPUs. The loop (1320x63 rads) is ready for it now, just need the block. The only thing I don't know yet is how to deal with the heat concentration of 4 nm compared to 7 nm or 8 nm - stick with Gelid GC Extreme (the thick, sticky stuff I love), or even try liquid metal...VRAM can look forward to thermal putty again...


Yeah, your card is definitely better than my TUF. Sustaining those clocks with such high core temps and hotspot is far from what I can do. When you get your water block, I'm sure it's going to be high(er) up there. Getting my Strix tomorrow, final card. Crossing fingers, hoping it has a stronger core. It is what it is: lottery. Sometimes you win, sometimes you don't. Ordered a Bykski TUF/Strix block from formulamod.com because they had it in stock... I don't want to wait for the 5090 to arrive before EK is done fiddling with 4090 block production.


----------



## Xavier233

But my point is, you are playing the lottery at this point; you are not really overclocking. Again, an FE card can get a better score than a much more expensive Strix card. So what is the point, if this is the new reality? To each their own, of course, but adjusting to reality might not be a bad idea. If you want a better OC, change your card. And if it has coil whine, that's an even better reason to do so.


----------



## N19htmare666

Clearly the Colorful Neptune OC-V is the best card, with a 630W BIOS and a 360 AIO, but why are none of the tech tubers talking about it - no sponsorship?


----------



## Carillo

bmagnien said:


> Not sure why folks are opposed to LM on GPU dies. It’s been a no brainer for me, and the ratio of improved performance vs difficulty/risk is worth it to me. Just throw a few layers of kapton tape down over the smds around the die and you’re good to go.


I remember using LM WITHOUT kapton tape or any form of insulation on a GTX 1080. Worked fine for 2 days, then suddenly boom. First and hopefully last time my computer has literally exploded. I remember when I removed the cooler, all the VRMs and chokes had transformed to a liquid state 😂


----------



## mirkendargen

N19htmare666 said:


> Clearly the colorful Neptune OC-V is the best card with 630w bios and 360aio but why are none of the tech tubers taking about it - no sponsorship?


Colorful is only in Asia. The commonly known tech tubers are all NA/EU.


----------



## Carillo

Xavier233 said:


> But my point is, you are playing lottery at this point, you are not really overclocking. Again an FE card can get a better score than a much more expensive Strix card. So again, what is the point if this is the new reality? To each their own of course, but adjusting to the reality might not be a bad idea. If you want a better OC, change your card. And if it has coil whine, thats an even better reason to do so.


Still don't understand your point. You are saying that if you don't have the world's best chip and memory, just give up, don't even bother OCing it? Everyone should buy an FE because you watched JayzTwoCents compare the FE to the Strix and in his case the FE had better silicon? FYI, in Norway we can't even buy your precious FE card, because the leather jacket says so.


----------



## Xavier233

Carillo said:


> Still don’t understand your point. You are saying if you don’t have the worlds best chip and memory, just give up, don’t even bother OC it ? Everyone should buy an FE because you watched jayztwocent compare the FE to the Strix and in his case the FE had better silicone? FIY, in Norway we can’t even buy your precious FE card, because the leather jacket says so.


Overclocking today is not what it was 5 or 10 years ago. Back then you would get 15-20% from overclocking, and that felt great. It made you skip a generation or two. I used to do it and enjoy it!

Today, OCing a CPU or GPU is pretty much pointless. It already comes near maxed out from the factory. Those 4-5% gains are costing you 100-200 watts extra, and for what? That difference is not even noticeable in games anyway. The last 109 pages of this thread serve as proof. Then you get an FE card that is faster out of the box than a Strix costing $500 more.


----------



## Carillo

Xavier233 said:


> Overclocking is not today what it was 5 or 10 years ago. 5-10 years ago you would get 15-20% from overclocking, and that felt great. It made you skip a generation or two. I used to do and enjoy it!
> 
> Today, OCing a CPU or GPU is pretty much useless. It already comes near maxed out from the factory. Those 4-5% gains are costing you 100-200 watts extra, and for what? That difference is not even noticeable in games anyway. The last 109 pages of this thread serves as a proof. Then you get a card that is FE which is faster out of the box than a Strix costing $500 more.


So why did you join overclock.net in 2022?


----------



## KedarWolf

Carillo said:


> I remember using LM WITHOUT kapton tape or any form of insulation on a GTX 1080. Worked fine for 2 days, then suddenly boom. First and hopefully last time my computer have literally exploded. Remember when i removed the cooler, all the vrm and chokes had transformed to liquid state 😂


I never use LM, but only because my motherboard is horizontal and my video card is mounted vertically.

LM is great on the CPU though, but I don't think I'll bother when I get the memory for my 7950X; it's the last thing I need.

I get the memory Friday.


----------



## N19htmare666

mirkendargen said:


> Colorful is only in Asia. The commonly known tech tubers are all NA/EU.


Sounds like a good excuse for a holiday, or get a friend to send one over  Still, it seems odd that there is hardly any coverage of the Neptune.

Even the Waterforce isn't shown by the sponsored tech tubers. But I guess it does have custom fan headers and a low PL.


----------



## LukeOverHere

Hey guys, I only game at 4K on my 4K 120Hz LG. How much am I shooting myself in the foot running my 4090 with my Intel 8700K @ 4.9GHz and 32GB of 3600MHz memory? My plan was to only upgrade the GPU, but I keep seeing posts suggesting my 4K setup may be bottlenecked by the 8700K. Are there any facts I can work with to warrant the upgrade? The CPU is already stupid quick, but if I'm losing 20-30FPS then it's a consideration. *Pending arrival of my 4090


----------



## J7SC

Carillo said:


> I remember using LM WITHOUT kapton tape or any form of insulation on a GTX 1080. Worked fine for 2 days, then suddenly boom. First and hopefully last time my computer have literally exploded. Remember when i removed the cooler, all the vrm and chokes had transformed to liquid state 😂


I used LM on a dual-GPU AMD card back in the day when LM was not that well known. The card was mounted vertically, and that freaked me out a bit (gravity+liquids) - so I built up this extra protective layer of MX4 around the dies as a barrier - worked great with the w-block but cleaning it up later on when reverting it back to the stock cooler was an absolute messy nightmare...cost me half a bottle of isopropanol


----------



## Panchovix

LukeOverHere said:


> Hey guys, I only game at 4K on my 4K 120Hz LG, how much am i shooting myself in the foot running my 4090 with my Intel 8700k @4.9Ghz & 32GB of 3600MHz Memory? My plan was to only upgrade the GPU, but i keep seeing posts that would suggest my 4K setup may be bottlenecked with my 8700K? Is there any facts i can work with to warrant the upgrade? The CPU is already stupid quick, but if I’m losing 20-30FPS then its a consideration. *Pending arrival of my 4090


It is a bottleneck in some games at 4K; even the 9900K is sometimes lol

But it depends on the game, to be honest


----------



## ZealotKi11er

LukeOverHere said:


> Hey guys, I only game at 4K on my 4K 120Hz LG, how much am i shooting myself in the foot running my 4090 with my Intel 8700k @4.9Ghz & 32GB of 3600MHz Memory? My plan was to only upgrade the GPU, but i keep seeing posts that would suggest my 4K setup may be bottlenecked with my 8700K? Is there any facts i can work with to warrant the upgrade? The CPU is already stupid quick, but if I’m losing 20-30FPS then its a consideration. *Pending arrival of my 4090


It depends on the game. 8700K will be fine for AAA max setting and RT but not if you play esport at 4K.


----------



## Mad Pistol

LukeOverHere said:


> Hey guys, I only game at 4K on my 4K 120Hz LG, how much am i shooting myself in the foot running my 4090 with my Intel 8700k @4.9Ghz & 32GB of 3600MHz Memory? My plan was to only upgrade the GPU, but i keep seeing posts that would suggest my 4K setup may be bottlenecked with my 8700K? Is there any facts i can work with to warrant the upgrade? The CPU is already stupid quick, but if I’m losing 20-30FPS then its a consideration. *Pending arrival of my 4090


You can always get a faster CPU, but unfortunately for you, that means ripping out the entire CPU/Mobo/RAM config. You "should" be fine at 4K, but as the reviewers have said, the 4090 rips through 4K games like it's nothing. It's rather hilarious.


----------



## ZealotKi11er

Xavier233 said:


> Overclocking is not today what it was 5 or 10 years ago. 5-10 years ago you would get 15-20% from overclocking, and that felt great. It made you skip a generation or two. I used to do and enjoy it!
> 
> Today, OCing a CPU or GPU is pretty much useless. It already comes near maxed out from the factory. Those 4-5% gains are costing you 100-200 watts extra, and for what? That difference is not even noticeable in games anyway. The last 109 pages of this thread serves as a proof. Then you get a card that is FE which is faster out of the box than a Strix costing $500 more.


Idk about that. Overclocking on AMD is much different. Before any power or voltage increase, my 6900 XT gets about 23xx MHz core @ 283 W. Look what it can do with all the knobs turned up under water.


----------



## LukeOverHere

Mad Pistol said:


> You can always get a faster CPU, but unfortunately for you, that means ripping out the entire CPU/Mobo/RAM config. You "should" be fine at 4K, but as the reviewers have said, the 4090 rips through 4K games like it's nothing. It's rather hilarious.


That's the dilemma for sure haha… I think I'll try and pace myself: a fresh install of Windows when the new 4090 arrives, then see how it performs with my current 8700K for 4K gaming. If in time I find I have restricted the 4090, then it might be time for the 8700K to be replaced along with the mobo and RAM…


----------



## Arizor

Just some interesting stuff from using DLSS3 in Plague Tale: Requiem:

- DLSS3 (frame generation) is very impressive; it is hard, sometimes I'd argue impossible, to discern the generated frames from native rendering.
- In terms of quality, I would rate DLSS3 frame generation, turned on in isolation, as comparable to DLSS2 'Quality' mode.
- In terms of performance, DLSS3 is a _much bigger_ boost than DLSS2.
- For example, using DLSS2 Quality with DLSS3 turned off, I actually get very little performance boost in this title. In fact, I might say it *runs faster at native 4K with DLSS2 turned off*. I imagine this might be because DLSS2 Quality upscales from 1440p, thus introducing a bigger CPU bottleneck that limits the 4090. This will no doubt be different as we go lower down the 40-series stack.
- Using DLSS3 exclusively (i.e. DLSS2 upscaling turned off, only Frame Generation on) I get a _big boost_, almost doubling my frames over native 4K in some instances.
- The latency is not noticeable in this title at 120 frames with Frame Generation on. That starts from a 70-90 FPS native 4K render, so no doubt that makes a difference. I'll be interested to see how this changes across titles.


----------



## Joneszilla

Bought the TUF 4090, and the red light where the power cable connects stays on when I boot. I am assuming this means it's not getting power, but can someone please confirm? GPU fans don't turn on and DisplayPort isn't working. Swapped power cables multiple times and the light still stays on. thx


----------



## BigMack70

LukeOverHere said:


> Thats the dilemma for sure haha…. I think ill try and pace myself, a fresh install of windows when the new 4090 arrives, then see how it performs with my current 8700k for 4K gaming. If in time i find i have restricted the 4090, then it might be time for the 8700k to be replaced along with the mobo and ram….


I'm on a 9900KS @ 5.2 GHz and I thought for sure I would not run into bottlenecks, but some games are bottlenecked below 120fps at 4k on my CPU. So far, Elden Ring and Halo Infinite both cannot lock to 120fps at 4k on my rig (hovering at or just above 100fps is common), and A Plague Tale Requiem has significant drops down to 60fps when there's tons of rats and characters on the screen - not sure how faster CPUs are faring with this one, so maybe it's an optimization issue, but it's very noticeable. 

And then there's lots of games where either there is no bottleneck from the CPU, or it's so minimal that it makes no difference, or it happens at such ridiculous framerates that it won't matter unless you are on a 240 Hz screen.

I'd say get your 4090 and see how it fares in your games. I'm going to upgrade my CPU at some point in the not too distant future, but I don't feel compelled to go shopping immediately. Some time in 2023 most likely.


----------



## mirkendargen

Arizor said:


> Just some interesting stuff from using DLSS3 in Plague Tale: Requiem:
> 
> 
> DLSS3 (frame generation) is very impressive, it is hard, sometimes I'd argue impossible to discern the frame generation from the native rendering.
> In terms of quality, I would compare the DLSS3 frame generation, turned on in isolation, as comparable to DLSS2 'Quality' mode.
> In terms of performance, DLSS3 is a _much bigger_ boost to performance than using DLSS2.
> For example, using DLSS2 Quality, DLSS3 turned off, I actually get very little boost to performance in this title. In fact, I might actually say it *runs faster at native 4k with DLSS2 turned off*. I imagine this might be due to DLSS2 quality upscaling from 1440p, thus introducing a higher CPU bottleneck, limiting the 4090. This will no doubt be different as we go lower down the 4-series stack.
> Using DLSS3 exclusively (i.e. DLSS2 upscaling turned off, only Frame Generation on) I get a _big boost_, almost doubling my frames from native 4K in some instances.
> The latency is not noticeable in this title, at 120 frames with Frame Generation on. This starts from a 70-90 native 4k render, so no doubt that makes a difference. I'll be interested to see how this changes across titles.


Yeah I played with this last night too. The only area where I could see something noticeable is a kind of halo of distortion around heads from hair....but it's there somewhat even without frame generation. The wispy hair strands cause a weird effect. The peak in image quality is definitely DLAA (DLSS that renders at native res with no upscaling, applying the AI pass purely as anti-aliasing) + frame generation...which is "only" 100 FPS. All other options hit my 120 Hz refresh-rate cap.


----------



## DennyA

I mean, if y'all are looking for _excuses_ to upgrade your CPU, just load Microsoft Flight Sim with the settings on Ultra and fly over NYC. Frame rate on my 10900K without DLSS/frame doubling turned on is pretty much identical with my 3080 10GB -- about 42 fps at 5,120x1440. (Of course, turning on DLAA/frame generation gives me 75-ish, so still happy with the card!) Totally CPU-bound, so it'll be interesting to see what going to a 13900K is going to do for MSFS.


----------



## yzonker

J7SC said:


> Well put !  ...We seem to go through this every single time when a new GPU family is released, and it is par for the course.
> 
> I find it helpful to push a card to and slightly beyond its limits with 3DM and other benchies so I know where those limits are for my specific sample - and it is getting more complicated than in the 'old days', since the latest boost algorithms have more complex sensor inputs and limits to contend with.
> 
> Specifically on my Gigabyte Gaming OC, I now know the 'best efficiency range' on VRAM is somewhere between +1390 and +1440ish (also depending to some extent on the specific test). I can clock VRAM up to 1500 w/o crashing, but clearly have some efficiency losses then. On the GPU MHz, I know under lighter conditions it is stable to past 3100 MHz, but the real issue is heat once past 500W. My Hotspot delta is as much as 30 C within seconds of a higher load and my 3DM scores such as PortRoyal show much higher temp numbers than many folks within the same scoring range. That is not an excuse, but simple preparation for water-cooling this card, as I do with all my CPUs and GPUs. The loop (1320x63 rads) is ready for it now, just need the block. The only thing I don't know yet is how to deal with the heat concentration of 4 nm compared to 7 nm or 8 nm - stick with Gelid GC Extreme (the thick, sticky stuff I love), or even try liquid metal...VRAM can look forward to thermal putty again...


You know, I'm not sure using the best putty/pads is going to be of benefit. What I'm finding is that VRAM stability falls off really quickly as these new 2GB chips cool down.

When I was running benches last weekend, once I lowered the ambient temp quite a bit (using my chiller to blow cool air through the case), my previously game-stable +1800 would artifact and crash in 3DMark. I had to drop to +1700 to +1750. I retested +1800 later at normal ambient: completely stable again.

So I'm not sure what I'm going to use for pads for the mem...


----------



## J7SC

DennyA said:


> I mean, if y'all are looking for _excuses_ to upgrade your CPU, just load Microsoft Flight Sim with the settings on Ultra and fly over NYC. Frame rate on my 10900K without DLSS/frame doubling turned on is pretty much identical with my 3080 10GB -- about 42 fps at 5,120x1440. (Of course, turning on DLAA/frame generation gives me 75-ish, so still happy with the card!) Totally CPU-bound, so it'll be interesting to see what going to a 13900K is going to do for MSFS.


...I got NY on 4K ultra plus extra traffic and detail in FS 2020...in fair weather, in a snowstorm, and various other conditions...indeed a great way to test out your CPU and RAM settings. One day, they'll actually fix the DX12 beta, and also the code's need to hammer one thread particularly hard all the time...Only had a few minutes w/ the 4090 on FS2020 so far, so haven't tried DLAA / frame gen yet.



yzonker said:


> You know, I'm not sure using the best putty/pads is going to be of benefit. What I'm finding is the VRAM stability falls off really quickly as these new 2Gb chips cool down.
> 
> When I was running benches last weekend, once I lowered the ambient temp quite a bit (using my chiller blow cool air through the case) my previous game stable +1800 would artifact and crash In 3DMark. I had to drop to +1700 to 1750. I retested +1800 later at normal ambient, completely stable again.
> 
> So I'm not sure what I'm going to use for pads for the mem...


For my previous two GPUs, putty worked superbly well and continues to do so...Even beyond better temps, I just like the 'foolproof' exact surface contact from thermal putty's conformity when I already have to take the cooler off anyway (along with the factory thermal pads) to mount a water block.


----------



## Xavier233

If only I could efficiently duct some of that cold outside air directly into the PC case (with an on/off doorway). It would give me 5 months of free AC cooling inside the case.


----------



## bottjeremy

HWiNFO64 chart after 1.5 hours of playtime in Battlefield 2042 with max settings at 5K resolution, upscaled in drivers. So crisp looking...

Gigabyte Windforce overclocked to 3 GHz core and 24,000 MHz memory @ 1.1 V with the stock fan profile. ZERO lockups or issues.

I'm absolutely loving this 4090. I think Nvidia made something special here.


----------



## Xavier233

bottjeremy said:


> HWiNFO64 chart after 1.5 hours of playtime in Battlefield 2042 with max settings at 5k resolution upscaled in drivers. So crisp looking...
> 
> Gigabyte Windforce overclocked to 3ghz and 24000mhz ram @ 1.1 volts with stock fan profile. ZERO lockups or issues.
> 
> I'm absolutely loving this 4090. I think Nvidia made something special here.
> View attachment 2576737


What's more impressive is that even with a lower fan curve (quieter gaming), the GPU temps don't overshoot like crazy. Anything below 80C is completely safe IMO


----------



## RetroWave78

For anyone with a non-OC TUF pre-ordered, I have confirmation from an owner that it has 600 W on tap. The only difference is 40 MHz, as Tech Yes City points out below - 40 MHz that can be attained in MSI AB. If you paid $1800 for the OC version, consider yourself lucky given the scalped prices. The rest of us will be waiting for pre-order fulfillment, which could be weeks, a month, or more.


----------



## Mad Pistol

I played with overclocking just a bit. I was able to get just under 3 GHz on the core with a dirty OC... on a Gigabyte Windforce, which is arguably the WORST overclocking card currently available. Once I play with the voltage a bit, I can probably hit 3 GHz.

Also, I just played the full first mission of Crysis: Remastered at 4K MAX SETTINGS (no DLSS), and it averaged 60-70 FPS with minimal dips below 60... like... how?!?!?!?!?

Nvidia, what have you done? The 4090 is a certified beast.


----------



## BigMack70

bottjeremy said:


> I think Nvidia made something special here.


They finally cracked the code for 4k120 gaming. I've been trying to push for a great 4k large screen experience since 2015 with Titan X SLI, and this is the first graphics card that truly makes smooth 4k gaming feel like the expected norm.


----------



## Mad Pistol

BigMack70 said:


> They finally cracked the code for 4k120 gaming. I've been trying to push for a great 4k large screen experience since 2015 with Titan X SLI, and this is the first graphics card that truly makes smooth 4k gaming feel like the expected norm.


4K high refresh rate gaming is a thing now. The 4090 just made it real and tangible.


----------



## LukeOverHere

BigMack70 said:


> I'm on a 9900KS @ 5.2 GHz and I thought for sure I would not run into bottlenecks, but some games are bottlenecked below 120fps at 4k on my CPU. So far, Elden Ring and Halo Infinite both cannot lock to 120fps at 4k on my rig (hovering at or just above 100fps is common), and A Plague Tale Requiem has significant drops down to 60fps when there's tons of rats and characters on the screen - not sure how faster CPUs are faring with this one, so maybe it's an optimization issue, but it's very noticeable.
> 
> And then there's lots of games where either there is no bottleneck from the CPU, or it's so minimal that it makes no difference, or it happens at such ridiculous framerates that it won't matter unless you are on a 240 Hz screen.
> 
> I'd say get your 4090 and see how it fares in your games. I'm going to upgrade my CPU at some point in the not too distant future, but I don't feel compelled to go shopping immediately. Some time in 2023 most likely.


I agree, I'll test out my 8700K and see if I can get any more juice out of it, although I was pretty unlucky in the silicon lottery as mine gets super flaky at 5GHz, which is why I locked it at 4.9GHz. If my 4K games hammer along with the new 4090 then I'll be happy; the hard part is I keep seeing crazy FPS increases/advantages with the new 13th-gen Intels… Might push me over the edge 😂 I never thought the CPU mattered much in 4K gaming, as I was always told it was GPU bound… until the 4090 arrived and shattered my dreams haha


----------



## KedarWolf

I run a 5120x1440 screen at 240Hz. I'd need to drop it to 120Hz to use DSR.

But I think it looks crisper at 240Hz, which makes DSR unavailable, so I don't use it.


----------



## RetroWave78

bottjeremy said:


> HWiNFO64 chart after 1.5 hours of playtime in Battlefield 2042 with max settings at 5k resolution upscaled in drivers. So crisp looking...
> 
> Gigabyte Windforce overclocked to 3ghz and 24000mhz ram @ 1.1 volts with stock fan profile. ZERO lockups or issues.
> 
> I'm absolutely loving this 4090. I think Nvidia made something special here.
> View attachment 2576737


Memory junction doesn't look so hot; I wouldn't run it like that long term. We are talking about a 3-5% gain vs. un-overclocked or even undervolted. You're just wasting energy, and electromigration in the silicon accelerates with more voltage.

I could see it if it were like a 10-15% difference, but it's not; we are literally talking about 5% more performance requiring another 33%+ more power. That's why your memory junction is 80C. The 4090 FE's memory junction is 72C. Nvidia asked Micron for cooler-running memory this time around.

This overclocking nonsense is literal monkey business.

5% gain, 33% more power:

GeForce RTX 4090 Founders Edition review - Overclocking section, www.guru3d.com
Gamers Nexus saw a 4090 pull 666.6 W @ +33% PT.

450 W = 100%; 666.6 W for a 5% gain.

Do yourself a favor: have HWiNFO64 display power consumption, memory T-junction, clocks, and FPS. In MSI AB (under profiles), assign one profile as your OC, another as default 100% PT (or OC at 100% PT), and another with an undervolted freq curve, each on a hotkey. Once the cooler is heat-saturated, switch between them on the fly and take note of the power usage, memory temps, and FPS.

We are talking a 5% gain for well over 33% more power (666.6 W is actually about 48% over 450 W).

Like this is literally ******ed.
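The arithmetic above can be sanity-checked with a quick sketch (the 450 W / 666.6 W / 5% figures are taken from the post; the perf-per-watt comparison follows from them):

```python
# Rough perf-per-watt comparison for the numbers quoted above:
# stock 450 W vs. the 666.6 W Gamers Nexus measured at +33% PT,
# for a ~5% performance gain.
def perf_per_watt(relative_perf: float, watts: float) -> float:
    """Performance (relative to stock = 1.0) per watt."""
    return relative_perf / watts

stock = perf_per_watt(1.00, 450.0)   # stock power target
oc    = perf_per_watt(1.05, 666.6)   # maxed power target, +5% perf

extra_power_pct = (666.6 / 450.0 - 1) * 100   # how much more power the OC draws
efficiency_loss = (1 - oc / stock) * 100      # how much perf/W you give up

print(f"Extra power: {extra_power_pct:.0f}%")   # Extra power: 48%
print(f"Perf/W loss: {efficiency_loss:.0f}%")   # Perf/W loss: 29%
```

In other words, that last 5% costs roughly a 29% hit to efficiency, which is the whole argument for running the undervolt profile day-to-day.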


----------



## Equinox654

BigMack70 said:


> They finally cracked the code for 4k120 gaming. I've been trying to push for a great 4k large screen experience since 2015 with Titan X SLI, and this is the first graphics card that truly makes smooth 4k gaming feel like the expected norm.


Yeah. Have you taken a look at the fill rates? They are stupidly high.


----------



## RetroWave78

KedarWolf said:


> I run a 5120x1440 screen at 240Hz. I need to put it at 120Hz to use DSR.
> 
> But I think it looks crisper at 240Hz which makes DSR unavailable, so I don't use it.


Samsung Odyssey G9 Neo? That's what I have, LOVE this panel! G-Sync doesn't work at 120 Hz, only 240 Hz; tried it in multiple titles. There is SLI-esque stutter present at 120 Hz that is absent with the display set to 240 Hz, even though the game's frame rate itself is much lower, e.g. FH5 @ 90 FPS or Sekiro at 120 FPS. Don't know why this happens but I figured I would share it.


----------



## KedarWolf

RetroWave78 said:


> Samsung Odyssey G9 Neo? That's what I have, LOVE this panel! G-sync doesn't work at 120Hz, only 240 Hz, tried it in multiple titles. There is SLI-esque stutter present at 120 Hz that is absent with the display set to 240 Hz. The game itself is much lower, i.e. FH5 @ 90 FPS or Sekiro at 120 FPS. Don't know why this happens but I figured I would share it.


Yeah, Neo G9.

And I got it for $1000 CAD off the regular price, including Canadian tax.

On sale it was $1699 CAD before tax; without the sale, $2500 CAD, plus 13% tax.


----------



## Emmanuel

BigMack70 said:


> I'm on a 9900KS @ 5.2 GHz and I thought for sure I would not run into bottlenecks, but some games are bottlenecked below 120fps at 4k on my CPU. So far, Elden Ring and Halo Infinite both cannot lock to 120fps at 4k on my rig (hovering at or just above 100fps is common), and A Plague Tale Requiem has significant drops down to 60fps when there's tons of rats and characters on the screen - not sure how faster CPUs are faring with this one, so maybe it's an optimization issue, but it's very noticeable.
> 
> And then there's lots of games where either there is no bottleneck from the CPU, or it's so minimal that it makes no difference, or it happens at such ridiculous framerates that it won't matter unless you are on a 240 Hz screen.
> 
> I'd say get your 4090 and see how it fares in your games. I'm going to upgrade my CPU at some point in the not too distant future, but I don't feel compelled to go shopping immediately. Some time in 2023 most likely.


I'm pretty much in the same boat, but I think DDR5 needs to mature more, and by then the Intel Gen 14 will probably have better IMCs. I'll keep an eye on the 13900KS next year though.


----------



## J7SC

BigMack70 said:


> They finally cracked the code for 4k120 gaming. I've been trying to push for a great 4k large screen experience since 2015 with Titan X SLI, and this is the first graphics card that truly makes smooth 4k gaming feel like the expected norm.


...yeah, I added a C1 48 OLED late last spring and absolutely adore it - but (so far...) only the 4090 is really capable of delivering the kind of pic quality and fps to justify 4K 120 Hz. Perhaps AMD will follow suit in November, but if you add in NVidia's drivers and tech like DLSS3, I certainly don't have buyer's remorse, especially at the price I paid.


----------



## Mad Pistol

J7SC said:


> ...yeah, I added a C1 48 OLED late last spring and absolutely adore it - but (so far...) only the 4090 is really capable of delivering the kind of pic quality and fps to justify 4K 120 Hz. Perhaps AMD will follow suit in November, but if you add in NVidia's drivers and tech like DLSS3, I certainly don't have buyer's remorse, especially at the price I paid.


I got a CX 55" about 2.5 years back and was able to snag a 3080 FE around 2 years ago to power it via HDMI 2.1. I couldn't justify the $1500 price tag for the 3090 considering the minuscule uplift in performance it offered. Needless to say, neither the 3080 nor the 3090 was a 4K120 GPU. They were great, but neither was super capable in modern titles with lots of ray tracing.

The 4090, however, is super capable in all scenarios. The only game I've tried so far where I can't just crank the settings to max and get 60+ FPS at 4K is Cyberpunk 2077... and setting that game to DLSS Quality fixes that. It's honestly absurd how capable the 4090 is, and when you combine it with any high refresh OLED... it doesn't get much better than this (for the moment).


----------



## zhrooms

Original post updated

Took about 14-16 hours to update it, if you find any of the data still missing, send me a message or @ me here in the thread

As of typing this, the missing stuff:
- Neptune default power limit
- SG max power limit
- Windforce max power limit (480W)
- Master and/or Xtreme Waterforce PCB shots (with VRM details)
- iCHILL X3 (or above) PCB shots (with VRM details)
- XLR8 PCB shots (with VRM details)


----------



## Arizor

Yeah, jumped on an LG C2, and considering I got my TUF at the 4090 MSRP, I regret nothing.

Also, just a reminder as others have mentioned: for regular gaming you can easily sustain great clocks at just the 100% PL.

I've undervolted to a 1.025 V cap at 100% PL, and clocks stick at 2900-2950 MHz with my memory at +1700 (~24.4 Gbps); stays solid all day long.
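For anyone translating Afterburner memory offsets into data rates, a minimal sketch of the usual convention (assumptions: Afterburner reports the 4090's memory clock as 10,501 MHz at the stock ~21 Gbps, the offset adds directly to that reading, and the effective rate is double the reported clock since GDDR6X is double data rate):

```python
# Convert an MSI Afterburner memory offset (MHz) to an effective
# GDDR6X data rate, under the assumptions stated above.
BASE_MHZ = 10501  # reported 4090 memory clock (~21 Gbps stock) - assumption

def effective_gbps(offset_mhz: int, base_mhz: int = BASE_MHZ) -> float:
    """Effective per-pin data rate in Gbps for a given AB offset."""
    return 2 * (base_mhz + offset_mhz) / 1000.0

print(f"+1700 -> {effective_gbps(1700):.1f} Gbps")  # +1700 -> 24.4 Gbps
print(f"+1500 -> {effective_gbps(1500):.1f} Gbps")  # +1500 -> 24.0 Gbps
```

Which is why a +1700 offset lands at roughly 24.4 Gbps rather than a flat "base plus offset" reading.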


----------



## LukeOverHere

Serious question: I just found out that the Gainward Phantom 4090 only comes with the Nvidia power adapter with 3 inputs (only the Phantom GS comes with the 4x adapter). I would like to get a 4x adapter so I can flash the BIOS and push the overclock later on; I have plenty of headroom as I'm running an EVGA 1600W T2. Probably way too early to ask this question, but is there anywhere to order a 4x power adapter for these cards? Specifically Australia… but of course worldwide shipping will do the job haha.


----------



## DokoBG

Just installed the MSI 4090 Gaming Trio. What a beast, no coil whine at all!! Quiet as a rock - cool as a cucumber! I am beyond impressed. 🥰

Ran some Port Royal runs; it hits about 462 W at the most with a mild overclock. The GPU hits 58C at the most with an average fan curve. Really cool and quiet card, love it.


----------



## dante`afk

@Falkentyne 










without GI it would drop total draw by 210w!
also without the +33% PT it would hit the power limit


----------



## mirkendargen

zhrooms said:


> Original post updated
> 
> Took about 14-16 hours to update it, if you find any of the data still missing, send me a message or @ me here in the thread
> 
> As of typing this, the missing stuff:
> Neptune default power limit​SG max power limit​Windforce max power limit​Master and/or Xtreme Waterforce PCB shots (with VRM details)​iCHILL X3 (or above) PCB shots (with VRM details)​XLR8 PCB shots (with VRM details)​


It's less than ideal, but this is a grab of an Aorus Master from der8auer's nvlink video. It's enough to tell it's the Gaming OC PCB with the empty stages filled in.


----------



## zhrooms

mirkendargen said:


> It's less than ideal, but this is a grab of an Aorus Master from der8auer's nvlink video. It's enough to tell it's the Gaming OC PCB with the empty stages filled in.


Yes, that's where I figured out there were 24/4 power stages, but no details. We have to wait until someone gets their hands on it, since I'm assuming der8auer is the "only" person who has one so far.


----------



## mirkendargen

LukeOverHere said:


> Serious Question, I just found out that the Gainward Phantom 4090 only comes with the Nvidia power adapter with 3 inputs (Only the GS Phantom comes with the 4x Adapter) I would like to get a 4x adapter so i can flash the BIOS and push the overclock later on, i have plenty of headroom as I’m running an EVGA 1600W T2. Probably way too early to ask this question, but is there anywhere to order a 4x power adapter for these cards? Specifically Australia….but of course worldwide shipping will do the job haha.


cablemod.com
moddiy.com


----------



## Falkentyne

dante`afk said:


> @Falkentyne
> 
> View attachment 2576752
> 
> 
> without GI it would drop total draw by 210w!
> also without the +33% PT it would hit the power limit


Wow, that's a lot.


----------



## pompss

Good to be back after years. Let's do this, 4090 coming tomorrow.


----------



## 8472

Finally got the TUF non-OC installed in my lian li 011 dynamic xl. As big as it is, I think reviewers were exaggerating as to how big these are. Yes, it's large but it's not laugh out loud large. It's incredibly light, it's lighter than my 3090.

The width is a problem. There's no way to get the side panel on without applying a huge 90 degree bend in the cable. I'll probably vertically mount it. 

It looks gorgeous! The red power light turns on if you only have 3 8-pins connected. They should change this, since the light is annoying and having 3 cables connected is not an error and the card can still be used.

The gpu holder is too short. It doesn't reach the card when mounted on the bottom of my case. 

Time for some benchmarks.


----------



## Arizor

I dunno they are pretty comically big, though I love the look, I'm quite sure when I shoved mine in the case I heard my case whine "what are you doooing stepbrother!"


----------



## J7SC

Arizor said:


> I dunno they are pretty comically big, though I love the look, I'm quite sure when I shoved mine in the case I heard my case whine "what are you doooing stepbrother!"


...yup, waayyy too big for my dual mobo (temporarily triple GPU) build per below. The irony is that once the air cooler is off, the PCB is actually much shorter than that of a 3090 or 6900. Speaking of cooler, for most gaming, it works great and quietly - but I did an hour earlier of Cyberpunk '77 4K / Ultra / Psycho Ray Tracing, that generated the most heat so far. I REALLY look forward to w-cool this thing...


----------



## Jordyn

zhrooms said:


> Original post updated
> 
> Took about 14-16 hours to update it, if you find any of the data still missing, send me a message or @ me here in the thread
> 
> As of typing this, the missing stuff:
> Neptune default power limit​SG max power limit​Windforce max power limit​Master and/or Xtreme Waterforce PCB shots (with VRM details)​iCHILL X3 (or above) PCB shots (with VRM details)​XLR8 PCB shots (with VRM details)​


Nice one. Appreciate the efforts.


----------



## Dijati

As far as I know the MSI Liquid X has 530 watts and the Waterforce 510 watts, not 600. Best regards



zhrooms said:


> Original post updated
> 
> Took about 14-16 hours to update it, if you find any of the data still missing, send me a message or @ me here in the thread
> 
> As of typing this, the missing stuff:
> Neptune default power limit​SG max power limit​Windforce max power limit​Master and/or Xtreme Waterforce PCB shots (with VRM details)​iCHILL X3 (or above) PCB shots (with VRM details)​XLR8 PCB shots (with VRM details)​


----------



## yzonker

New NVFlash is up that supports the 4090.









NVIDIA NVFlash (5.792.0) Download


NVIDIA NVFlash is used to flash the graphics card BIOS on Ampere, Turing, Pascal and all older NVIDIA cards. NVFlash supports BIOS flashing on NVID




www.techpowerup.com


----------



## Shaded War

Anyone else have delayed package from newegg? My 4090 has been sitting in Ontario, CA for 5 days and newegg sent me an email telling me it will be late. Always seem to have problems with UPS.


----------



## Aladdin.a

Me with my 4090 after going back to a game that my 3090 couldn't handle


----------



## 8472

Welp, I can confirm that the coil whine on the TUF is real. Even when the frame rate is in the 60s and 70s it's still very noticeable. When the frame rate is 100-120 it sounds like crickets chirping near a pond at dusk. It really is a shame as this is a fantastic card. 

Might have to look at the MSI or Gigabyte options.


----------



## boganhobo

Dijati said:


> As far as i know the MSI Liquid X has 530Watts and the Waterforce also has 510 Watts and not 600. Best Regards


I can't get my Waterforce past the 500W mark.


----------



## Hulk1988

yzonker said:


> New NVFlash up that's supports the 4090.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA NVFlash (5.792.0) Download
> 
> 
> NVIDIA NVFlash is used to flash the graphics card BIOS on Ampere, Turing, Pascal and all older NVIDIA cards. NVFlash supports BIOS flashing on NVID
> 
> 
> 
> 
> www.techpowerup.com


Who has the balls to try it


----------



## RaMsiTo

yzonker said:


> New NVFlash up that's supports the 4090.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA NVFlash (5.792.0) Download
> 
> 
> NVIDIA NVFlash is used to flash the graphics card BIOS on Ampere, Turing, Pascal and all older NVIDIA cards. NVFlash supports BIOS flashing on NVID
> 
> 
> 
> 
> www.techpowerup.com


thanks, trying gigabyte bios right now.


----------



## Shaded War

8472 said:


> Welp, I can confirm that the coil whine on the TUF is real. Even when the frame rate is in the 60s and 70s it's still very noticeable. When the frame rate is 100-120 it sounds like crickets chirping near a pond at dusk. It really is a shame as this is a fantastic card.
> 
> Might have to look at the MSI or Gigabyte options.


I thought you were exaggerating and being dramatic. So I searched and found a video posted on youtube. It does indeed sound like a pond surrounded by crickets. 
Every GPU I've owned since the 10 series has had coil whine, but never bad enough to warrant returning or exchanging it.

I'm hoping my MSI trio doesn't have coil whine when UPS finally decides to deliver it.


----------



## ArcticZero

J7SC said:


> ...yup, waayyy too big for my dual mobo (temporarily triple GPU) build per below. The irony is that once the air cooler is off, the PCB is actually much shorter than that of a 3090 or 6900. Speaking of cooler, for most gaming, it works great and quietly - but I did an hour earlier of Cyberpunk '77 4K / Ultra / Psycho Ray Tracing, that generated the most heat so far. I REALLY look forward to w-cool this thing...
> View attachment 2576764


This photo speaks to my near future too. Though mine is a measly Evolv X with a loop.

I am probably going to have to super ghetto mount mine as well when my Strix arrives. Literally no space as well due to the res being in the way. Maybe a diagonally protruding mount with an open case for a few days before a block arrives.


----------



## mattskiiau

Shaded War said:


> I'm hoping my MSI trio doesn't have coil whine when UPS finally decides to deliver it.


My Trio doesn't have coil whine, if that helps you sleep at night


----------



## renejr902



ZealotKi11er said:


> It depends on the game. 8700K will be fine for AAA max setting and RT but not if you play esport at 4K.
> 
> 
> Panchovix said:
> 
> 
> 
> It is a bottleneck in some games at 4K, even the 9900K is sometimes lol
> 
> But, it depends of the game to be honest

Please guys, can you answer the same question about my 4790K and my 4090? I have other questions too; please see my complete post on page 105. I play only at 4K ultra settings, max 120Hz.
Thanks


----------



## Arizor

Ok just flashed STRIX OC onto my TUF, let's see what we get...


----------



## Arizor

Quick update - STRIX OC flashed to TUF. 

Literally did nothing, didn't even crank fans up, bumped the clock to sit on 3060 core and +1700 mem, immediate best score I've got on Speedway. Might try ramping fans up...


----------



## Arizor

Looks like just about the best I can do on air using STRIX OC BIOS on TUF. Happy with it.


----------



## Shaded War

renejr902 said:


> [
> Please guys, can you answer the same question about my 4790k and my 4090 and i have others questions too. Please see my complete post at page 105. I played only 4k ultra setting max 120hz.
> Thanks


I had a few LGA 2011 CPUs from that era and saw a large boost on my RTX 3090 when I upgraded to a 5800X. The closest thing I tried to a 4790K was an E5-1620 v3, and it left a lot of FPS on the table. I think 25% would be a reasonable estimate for the loss. Even the 6800K was holding it back a bit.

The newest CPUs are bottlenecking the 4090 at 1440p, so it's safe to say an 8-year-old CPU will be an issue at any resolution.


----------



## Benjit

8472 said:


> Welp, I can confirm that the coil whine on the TUF is real. Even when the frame rate is in the 60s and 70s it's still very noticeable. When the frame rate is 100-120 it sounds like crickets chirping near a pond at dusk. It really is a shame as this is a fantastic card.
> 
> Might have to look at the MSI or Gigabyte options.


Think I dodged a bullet reading this. In preparation, and so as not to fail epically like I did trying to get a 3080 at launch, I managed to get both a TUF OC and an Inno3D iChill X3, and both were dispatched at the same time.

I got them last Thursday and decided that, with the £300 premium the TUF OC carried, I would send that one back, which I did. The iChill was smaller, let me keep my PCI slot free, and also let me keep my existing 750W PSU after monitoring peak loads; no major noises either.

I hope you get it sorted.


----------



## xcx xcxvgyt

The new NVFlash doesn't work. It flashes, but the power limit stays at a 100% max for me, which is 450W + 0W; it should be 100% + 10% with the new BIOS.


----------



## Arizor

xcx xcxvgyt said:


> New nvflash doesn't work. It's flashing but the power limit stays at %100 max for me which is 450w+0w card it should be %100+%10 with new bios


it worked for me, STRIX bios with new power limits. This is flashing to a TUF though, so YMMV.


----------



## AvengedRobix

At the moment Bykski has only one waterblock for the Zotac "Apocalypse", but I can't find any info or PCB photo of this card. Zotac has two different PCBs for the Trinity and AMP Extreme; does this Apocalypse have another PCB, or is it the same as the AMP Extreme? 🤔


----------



## sugi0lover

AvengedRobix said:


> At the Moment byksi have only One waterblock for the Zotac "Apocalypse" but can't find any info or PCB photo of this card.. Zotac have two different PCB for trinity and amp Extreme..for this Apocalypse there Is another PCB or Is the same of amp Extreme? 🤔


This is the Bykski water block compatibility list. The first one is the Apocalypse.











https://ko.aliexpress.com/item/1005004796369318.html?spm=a2g0o.productlist.main.27.56b160acZBJ9G5&algo_pvid=b45a739a-eb41-41c8-944a-e9200eb517e3&algo_exp_id=b45a739a-eb41-41c8-944a-e9200eb517e3-13&pdp_ext_f=%7B%22sku_id%22%3A%2212000030518523303%22%7D&pdp_npi=2%40dis%21KRW%21195758.0%21176179.0%21%21%21%21%21%40212240a316661688978603268d077f%2112000030518523303%21sea&curPageLogUid=OnTrGnaWDewr


----------



## RaMsiTo

Arizor said:


> it worked for me, STRIX bios with new power limits. This is flashing to a TUF though, so YMMV.


running on msi suprim x strix oc bios, 570w on tse


----------



## O-VIZ

RaMsiTo said:


> Curve by default 2940 mhz.


...same here


----------



## Arizor

RaMsiTo said:


> running on msi suprim x strix oc bios, 570w on tse


now we have the same BIOS and the same case too 🙏


----------



## spcysls

Has anyone flashed a zotac amp extreme yet?


----------



## Jimshown LMHF

Arizor said:


> Quick update - STRIX OC flashed to TUF.
> 
> Literally did nothing, didn't even crank fans up, bumped the clock to sit on 3060 core and +1700 mem, immediate best score I've got on Speedway. Might try ramping fans up...
> 
> View attachment 2576776


Why flash a TUF to Strix when the TUF already has a 600W power limit? Just to increase the base clock? With a manual OC it shouldn't change anything at all.


----------



## Benni231990

So now we can flash across all cards?

So when my Suprim X comes, I can flash the Strix OC BIOS with no problems?


----------



## sblantipodi

Does it make sense to overclock to a higher frequency if temperature then cuts the frequency back?

What are the temperatures that cut the frequency?


----------



## spcysls

I might try the Vulcan OC on the Zotac Extreme since it has an almost identical PCB. It's only a 550W bios though but it should still be a nice bump.


----------



## derthballs

spcysls said:


> Has anyone flashed a zotac amp extreme yet?


I can't find the BIOSes but I'm down to give it a go; it's dual BIOS anyway, so no harm done if it doesn't work.

Edit: found the link









TechPowerUp


Extensive repository of graphics card BIOS image files. Our database covers submissions categorized by GPU vendor, type, and board partner variant.




www.techpowerup.com





Will report back shortly.


----------



## spcysls

Yup the Vulcan OC bios works on the Zotac Extreme


----------



## spcysls

550W still doesn’t seem like enough for Port Royal though, I can hear the fans slowing down when it hits the power limit even though they’re locked at 80%. Better check with rivatuner


----------



## xcx xcxvgyt

I tried 3 different BIOSes but my power limit was always 100% max.

Plus, I tried the Strix BIOS, which has a 500W base, but the card didn't boot. The BIOS died. (Luckily I have another card.)

Most probably because I have a card with 3x8-pin connectors; it senses that and locks to a 450W max. Maybe there are other hardware-based locks, I don't know.

Note: GameRock non-OC model

Has anyone been able to get more power by BIOS flashing with a 3x8-pin connector?


----------



## spcysls

Yes your problem is that you have a 3x8. You could try using a 4x8 adapter.


----------



## derthballs

I've flashed my Zotac AMP Aero over to the Gigabyte 600W BIOS, and it's showing up as 600W now in GPU-Z.

It's still only drawing 485W in Port Royal though. I don't know if that's down to me using 3 cables, with one split into 2, and that's the max I can get. Can someone with a Zotac who has 4 separate cables running try it, to see whether it hits 600W in Port Royal?


----------



## Arizor

Jimshown LMHF said:


> Why flash a tuf in strix, the tuf already having 600w pl? Just to increase the base clock? In manual oc it should not change anything at all


VBIOS efficacy isn't all about power limits; there's a lot going on there the engineers carefully consider to differentiate their card's value proposition. Though my TUF BIOS could hit 600W, it would only move to the 500+ in specialised programs (e.g. Furmark, Quake RTX). The Strix OC bios has much tighter clocks and pushes into 520W much easier if I need it to.

edit: Also, the Strix OC has 600W regardless; I'd recommend folks give it a go - Asus RTX 4090 VBIOS


----------



## AvengedRobix

sugi0lover said:


> This is byski water block compatibility list. The first one is Apocalypse.
> View attachment 2576793
> 
> 
> 
> 
> https://ko.aliexpress.com/item/1005004796369318.html?spm=a2g0o.productlist.main.27.56b160acZBJ9G5&algo_pvid=b45a739a-eb41-41c8-944a-e9200eb517e3&algo_exp_id=b45a739a-eb41-41c8-944a-e9200eb517e3-13&pdp_ext_f=%7B%22sku_id%22%3A%2212000030518523303%22%7D&pdp_npi=2%40dis%21KRW%21195758.0%21176179.0%21%21%21%21%21%40212240a316661688978603268d077f%2112000030518523303%21sea&curPageLogUid=OnTrGnaWDewr


But according to the zotac.usa account on Reddit, the Trinity and AMP have different PCBs... how can it be compatible?


----------



## yzonker

Here's the Strix bios (500/600w flavor) on my TUF OC. Looks like a 45mhz bump out of the box. Card power limits at about the same points in Kombustor and Path of Exile that I've shown previously.


----------



## derthballs

I only got a marginal bump in PR using the gigabyte one, ill flash back to it if i can find another pci-e cable


----------



## Jimshown LMHF

Arizor said:


> VBIOS efficacy isn't all about power limits; there's a lot going on there the engineers carefully consider to differentiate their card's value proposition. Though my TUF BIOS could hit 600W, it would only move to the 500+ in specialised programs (e.g. Furmark, Quake RTX). The Strix OC bios has much tighter clocks and pushes into 520W much easier if I need it to.
> 
> edit: Also the Strix OC also has 600W regardless, I'd recommend folks give it a go - Asus RTX 4090 VBIOS


I have no problem hitting 560W with my TUF in Port Royal.


----------



## Jimshown LMHF

yzonker said:


> Here's the Strix bios (500/600w flavor) on my TUF OC. Looks like a 45mhz bump out of the box. Card power limits at about the same points in Kombustor and Path of Exile that I've shown previously.
> 
> View attachment 2576803


For me there is nothing this BIOS brings that we cannot already do by hand, unless it concerns specific things like VRAM timings, for example. I keep thinking it does nothing except for those who want to run it stock.


----------



## StreaMRoLLeR

xcx xcxvgyt said:


> I tried 3 different bios but my power limit was always %100 MAX.
> 
> Plus, tried strix bios which has 500w base but didn't boot. Bios died. (luckly i have other card)
> 
> Must prob. Due to i have 3x8pin connector card sense it and lock max 450w. Maybe there are other hardware base lock dunno
> 
> Note: gamerock non oc model
> 
> Is there anyone able to get more power by bios flashing with 3*8pin connector?


gg to your warranty. Madman


----------



## spcysls

@derthballs Did you use the Gigabyte Gaming OC BIOS? I'll probably try that later on my AMP Extreme, which has 4 cables. In your case the limit is definitely due to the cable configuration of 4 vs 3. The split one from your PSU doesn't really matter, except for safety if it's a bad PSU or a bad cable.


----------



## yzonker

Interesting, the FE BIOS still isn't flashable, at least on my TUF.


----------



## warrior-kid

Sent back my unopened PNY XLR8 and expecting Suprim X later today. It sounds like the Strix BIOS can just replace the one selected by a jumper on the card, can anyone confirm?


----------



## derthballs

spcysls said:


> @derthballs Did you use the gigabyte gaming OC? I'll probably try that later on my amp extreme which has 4 cables. In your case the limit is definitely due to the cable configuration.


Yes I did. I have 3 PCIe cables going to my Zotac, but one is split into two; I've ordered a new PCIe cable so I can run 4 separately.

I did get a bit of a boost, up from about 26600 to 27000 dead, which I assume is just it getting a bit more out of the power limit, which is +33% rather than +10%, but it's already pretty close to the wire. The new cable should come in a few days and I'll try again. I've gone back to the Zotac BIOS as it runs quieter.


----------



## Benni231990

Is it true that the Suprim X Liquid has a 600 watt BIOS? Can anybody confirm that?

I thought the Liquid had 530 watts, but on page 1 here it says 600 watts?


----------



## RaMsiTo

warrior-kid said:


> Sent back my unopened PNY XLR8 and expecting Suprim X later today. It sounds like the Strix BIOS can just replace the one selected by a jumper on the card, can anyone confirm?


yes, I tried the same thing a few hours ago and it's perfect.


----------



## BluePaint

Benni231990 said:


> Is it True that die Suprim X Liquid has a 600 watt Bios? Can anybody confirm that?


530W see here on TPU GPU BIOS list: TechPowerUp


----------



## RaMsiTo

Benni231990 said:


> Is it True that die Suprim X Liquid has a 600 watt Bios? Can anybody confirm that?
> 
> I Thought 530watt has the Liquid but in the 1 page here it stand 600watt?


520w suprim x, 530w liquid.


----------



## Bilco

Is anyone running a Strix 4090 in their O11 Dynamic XL? Does it fit horizontally?

According to Lian Li - 169mm clearance for horizontal GPUs.
According to TechPowerUp the Strix is 149mm off the board. So you have about 20mm of clearance for the cabling 😬😬😬


----------



## menko2

Has anyone tried pairing the 4090 with a 10900K for 4K at 120fps?


----------



## spcysls

Yup so the Gigabyte Gaming OC bios works on the Zotac Amp Extreme and the slider goes to 133% which is 600W


----------



## IntelMistakes

Hello to all of you! I registered on this forum because I just got a Strix 4090 OC and I have a lot of questions.

My graphics card seems to work fine, or so I think. Temperatures are good (it hardly goes above 65 degrees with a fan speed of 45%) but I've seen that in 3DMark tests, my card is a bit lower compared to other reviews I've seen (JayzTwoCents for example).

In Port Royal, my graphics card does 24,000 to 25,500.
Time Spy: 35k approx.
TimeSpy Extreme: 19,500 approx.
SpeedWay: 9,500-9700.

There is a difference of 700 to almost 2,000 points. Is this normal? Should I be worried? Is it something to do with the power adapter?

I passed the 3DMark stress tests without any problems: I did TimeSpy and SpeedWay and both passed with 99.1%.

Is it also normal that the power limit in MSI Afterburner and GeForce Experience is at 90%?

I have other doubts, but I would like to solve these first.

I have a 12900K and DDR4 32GB 5600Mhz.
Z690-F Strix

Thank you all!


----------



## rahkmae

IntelMistakes said:


> Hello to all of you! I registered on this forum because I just got a Strix 4090 OC and I have a lot of questions.
> 
> My graphics card seems to work fine, or so I think. Temperatures are good (it hardly goes above 65 degrees with a fan speed of 45%) but I've seen that in 3DMark tests, my card is a bit lower compared to other reviews I've seen (JayzTwoCents for example).
> 
> In Port Royal, my graphics card does 24,000 to 25,500.
> Time Spy: 35k approx.
> TimeSpy Extreme: 19,500 approx.
> SpeedWay: 9,500-9700.
> 
> There is a difference of less than 2k to 700 points. Is this normal? Should I be worried? Is it something to do with the power adapter?
> 
> I passed the 3DMark stress tests without any problems: I did TimeSpy and SpeedWay and both passed with 99.1%.
> 
> Is it also normal that the power limit in MSI Afterburner and GeForce experience are at 90%?
> 
> I have other doubts, but I would like to solve these first.
> 
> I have a 12900K and DDR4 32GB 5600Mhz.
> Z690-F Strix
> 
> Thank you all!


Which BIOS are you using? OC or normal?


----------



## IntelMistakes

rahkmae said:


> Which bios are you using? OC ore normal?


Hi!

I don't know. I am using all defaults, with the switch on P Mode.

The maximum clock speeds are 2760 - 2775 Mhz.


----------



## NBPDC505

IntelMistakes said:


> Hello to all of you! I registered on this forum because I just got a Strix 4090 OC and I have a lot of questions.
> 
> My graphics card seems to work fine, or so I think. Temperatures are good (it hardly goes above 65 degrees with a fan speed of 45%) but I've seen that in 3DMark tests, my card is a bit lower compared to other reviews I've seen (JayzTwoCents for example).
> 
> In Port Royal, my graphics card does 24,000 to 25,500.
> Time Spy: 35k approx.
> TimeSpy Extreme: 19,500 approx.
> SpeedWay: 9,500-9700.
> 
> There is a difference of less than 2k to 700 points. Is this normal? Should I be worried? Is it something to do with the power adapter?
> 
> I passed the 3DMark stress tests without any problems: I did TimeSpy and SpeedWay and both passed with 99.1%.
> 
> Is it also normal that the power limit in MSI Afterburner and GeForce experience are at 90%?
> 
> I have other doubts, but I would like to solve these first.
> 
> I have a 12900K and DDR4 32GB 5600Mhz.
> Z690-F Strix
> 
> Thank you all!


You said the power limit is at 90%, can you confirm all 4 PCIe power cables are connected and making good contact? 90% on a Strix is right at 450w which is what would occur with only 3 plugs detected.


----------



## spcysls

For whatever reason the Vulcan 550W bios performs better than the Gigabyte 600W bios on my Zotac Amp Extreme even at the same clockspeeds and lower temperatures. Port Royal only pulls 537W anyway.


----------



## xcx xcxvgyt

I tried the base 480W MSI Suprim BIOS on my 3x8-pin connector card and the max power limit is 93% now in Afterburner.

I hope a 4x8-pin connector will be able to solve this.

Maybe I can try to short some pins on the 600W side of the connector to fake it; needs some research.
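For context on why the 3x8-pin cards cap out: NVIDIA's 8-pin to 12VHPWR adapter advertises a power capability to the card based on how many of its 8-pin inputs are populated. A minimal sketch of that mapping (values are my reading of launch coverage, not an official table; verify against the spec before touching any hardware):

```shell
# 8-pin -> 12VHPWR adapter: advertised capability per populated 8-pin input.
# Hypothetical helper; the 2-plug value especially is an assumption.
adapter_limit_w() {
  case "$1" in
    4) echo 600 ;;
    3) echo 450 ;;
    2) echo 300 ;;
    *) echo "expects 2-4 populated 8-pin inputs" >&2; return 1 ;;
  esac
}

adapter_limit_w 3   # prints 450, matching the 450W cap reported above
```

This is why flashing a bigger BIOS alone can't lift the limit: the card reads the sense signaling from the adapter, not just the BIOS table.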


----------



## IntelMistakes

NBPDC505 said:


> You said the power limit is at 90%, can you confirm all 4 PCIe power cables are connected and making good contact? 90% on a Strix is right at 450w which is what would occur with only 3 plugs detected.


Hello, all 4 connectors are securely connected. Is it possible that the connector to the PCB is faulty? The red light on the GPU, which would indicate a lack of power, doesn't light up.

If I turn the power limit up to 100% with MSI Afterburner, I see that its TDP is 500W. I've read in this thread that someone else has had the same thing happen, that it comes by default with the power limit at 90%, but I wanted to make sure.


----------



## zhrooms

Benni231990 said:


> Is it True that die Suprim X Liquid has a 600 watt Bios? Can anybody confirm that?
> I Thought 530watt has the Liquid but in the 1 page here it stand 600watt?





BluePaint said:


> 530W see here on TPU GPU BIOS list: TechPowerUp





RaMsiTo said:


> 520w suprim x, 530w liquid.


It's 600W. The "early reviewer" BIOS uploaded by TPU is not the retail BIOS; the Liquid X ships to consumers with a 600W BIOS.
95.02.18.00.C4 (Build Date 2022-09-01) PG139 SKU 330 VGA BIOS MSINV510MH.290 450/600W
I have multiple people confirming by sharing screenshots of retail Liquid X cards, whatever is shown in the original post is correct


----------



## rahkmae

IntelMistakes said:


> Hi!
> 
> I don't know. I am using all defaults, with the switch on P Mode.
> 
> The maximum clock speeds are 2760 - 2775 Mhz.


Look for the bios switch on the card and switch to oc first...


----------



## UnfortunateAlly

menko2 said:


> Anyone has tried to put the 4090 with a 10900k for 4K&120fps?


Was looking for the same answer over the past few days and only found this Russian YouTube channel that tested COD Warzone, Metro Exodus, PUBG, Spider-Man Remastered and A Plague Tale: Requiem with the 10900k and RTX 4090:



https://www.youtube.com/channel/UCU2P9REQFLT127V0IFdEOEg/videos


----------



## yzonker

Bilco said:


> Is anyone running a strix 4090 in their pc dynamic XL? Does it fit horizontally?
> 
> According to Lian Li - 169mm clearance for horizontal GPUs
> According to Tech Power up the strix is 149mm off the board. So you have about 20mm of clearance for the cabling 😬😬😬


Is this the same case? 


https://www.reddit.com/r/nvidia/comments/y7y12u


----------



## bottjeremy

zhrooms said:


> Original post updated
> 
> Took about 14-16 hours to update it, if you find any of the data still missing, send me a message or @ me here in the thread
> 
> As of typing this, the missing stuff:
> Neptune default power limit​SG max power limit​Windforce max power limit​Master and/or Xtreme Waterforce PCB shots (with VRM details)​iCHILL X3 (or above) PCB shots (with VRM details)​XLR8 PCB shots (with VRM details)​


Windforce max 106% for me.


----------



## pokabroma

menko2 said:


> Anyone has tried to put the 4090 with a 10900k for 4K&120fps?


Yep, I have a 4090 TUF OC with a 10900K @ 4.9 (on a Hyper 212 CPU cooler while I redo my loop), and I'm seeing bottlenecks at around 70-75 fps in BF2042 and CP2077. FFVII Remake works perfectly fast, btw.


----------



## jl434

Bilco said:


> Is anyone running a strix 4090 in their pc dynamic XL? Does it fit horizontally?
> 
> According to Lian Li - 169mm clearance for horizontal GPUs
> According to Tech Power up the strix is 149mm off the board. So you have about 20mm of clearance for the cabling 😬😬😬


I think it should fit.

I need to bend the 4x8 cable a little bit to allow my Gigabyte Gaming OC (150mm) to fit in the O11D EVO; the O11D EVO has the same horizontal GPU clearance as the Dynamic XL.


----------



## zhrooms

bottjeremy said:


> Windforce max 106% for me.


If you can share this page that'd be great. But yes, 450 x 1.06 = 477 (~480W).


----------



## nycgtr

Benjit said:


> Think I dodged a bullet reading this, in preparation and not too fail like I did epically trying to get the 3080 on launch I managed to get a tuf oc and the inno3d ichill x3 and both were dispatched at the same time.
> 
> I got them last Thursday, decided with the £300 premium the tuf oc had I would send that back which I did. The ichill was smaller and allowed me to keep my PCI slot free and also use my existing 750w PSU after monitoring peak loads, no major noises too.
> 
> I hope you get it sorted.


I have 3 tufs. Only one of them has coil whine that is noticeable but not bad with a case door on.


----------



## bottjeremy

zhrooms said:


> If you can share this page that'd be great. But yes, 450 x 1.06 = 477 (~480W).
> View attachment 2576815


----------



## stargit

spcysls said:


> Yup so the Gigabyte Gaming OC bios works on the Zotac Amp Extreme and the slider goes to 133% which is 600W



First post.

I have the same Zotac card.
Are there any instructions on how to do this?
Oh, my CPU is a 10900KF, so no onboard graphics.

Sorry about asking a lot


----------



## derthballs

stargit said:


> First post.
> 
> I have the same zotac card.
> Is there any insrutions on how to do this?
> Oh. My cpu is a 10900KF so no on board graphics.
> 
> Sorry about asking a lot


Download NVFlash, back up your original BIOS first for safety, then nvflash64 -6 nameofbios.rom
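The sequence above can be sketched as a dry-run script; it only prints the nvflash commands rather than executing them, and the filenames are placeholders (remove the echo to actually run it, at your own risk):

```shell
#!/bin/sh
# Dry-run sketch of the BIOS-flash steps; filenames are placeholders.
BACKUP="original_backup.rom"   # where the card's current BIOS gets saved
NEWBIOS="nameofbios.rom"       # the BIOS downloaded from TechPowerUp

# 1) Back up the card's current BIOS so you can flash back if needed.
echo "nvflash64 --save $BACKUP"

# 2) Flash the new BIOS; -6 overrides the board-ID mismatch check,
#    which is needed when cross-flashing another vendor's BIOS.
echo "nvflash64 -6 $NEWBIOS"
```

Keep the backup somewhere safe; on a dual-BIOS card the second switch position is an extra safety net, but the saved ROM is still the cleanest way back.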


----------



## stargit

derthballs said:


> Download nvflash - backup your original bios first for safety, then nvflash64 -6 nameofbios.rom



So I don't need 2 GPUs to flash it?
Saw on YouTube I had to disable the 4090.

I'm a pensioner in the UK so not tech savvy


----------



## IntelMistakes

rahkmae said:


> Look for the bios switch on the card and switch to oc first...


I have managed to solve the power limit problem by moving the switch from P to Q, rebooting, and then moving the switch from Q to P again.

Now it is always set to 100%, but the 3DMark score is the same.

I guess it's down to the silicon? The difference is NOT big, now about 400 points in Port Royal.


----------



## CZonin

Anyone try flashing a Trio yet? Thinking about trying the Suprim X Liquid BIOS.


----------



## derthballs

stargit said:


> So I don't need to 2 GPUs to flash it?
> Saw on youtube I had to disable the 4090.
> 
> I'm a pensioner in the UK so not tech savi


No need to disable it, and you don't need two GPUs.


----------



## Benjit

nycgtr said:


> I have 3 tufs. Only one of them has coil whine that is noticeable but not bad with a case door on.


3 😳 you have a few machines then!


----------



## stargit

spcysls said:


> For whatever reason the Vulcan 550W bios performs better than the Gigabyte 600W bios on my Zotac Amp Extreme even at the same clockspeeds and lower temperatures. Port Royal only pulls 537W anyway.



So which bios do you think is better for higher clocks?


----------



## heptilion

NBPDC505 said:


> You said the power limit is at 90%, can you confirm all 4 PCIe power cables are connected and making good contact? 90% on a Strix is right at 450w which is what would occur with only 3 plugs detected.


This is the same for my Strix as well. It's normal.


----------



## IntelMistakes

heptilion said:


> This is same for my strix as well. It's normal


How strange. The Power Limit on my Strix 4090 is 90% if the DualBIOS switch is set to Performance Mode. If I set it to Quiet Mode, it goes up to 100%.

Does anyone know anything?

And can you confirm if the 3DMark score is correct?
Along with the clock speeds. Thank you.

Thanks!!


----------



## KedarWolf

Shaded War said:


> Anyone else have delayed package from newegg? My 4090 has been sitting in Ontario, CA for 5 days and newegg sent me an email telling me it will be late. Always seem to have problems with UPS.


My Newegg X670E Taichi took like 3 days, and that was shipping from Toronto to Toronto.


----------



## KedarWolf

Arizor said:


> it worked for me, STRIX bios with new power limits. This is flashing to a TUF though, so YMMV.


You have a link to the new power limit BIOS?


----------



## Tadaschi

Shaded War said:


> Anyone else have delayed package from newegg? My 4090 has been sitting in Ontario, CA for 5 days and newegg sent me an email telling me it will be late. Always seem to have problems with UPS.


Welcome to the club. I rushed the checkout to guarantee my card and they sent it with the cheapest shipping possible. Ordered launch day and it will be delivered tomorrow... I guess.

Good news: nvflash supports the 4090 now.


----------



## yzonker

IntelMistakes said:


> How strange. The Power Limit on my Strix 4090 is 90% if the DualBIOS switch is set to Performance Mode. If I set it to Quiet Mode, it goes up to 100%.
> 
> Does anyone know anything?
> 
> And Can you confirm if the 3DMark score is correct?
> Along with the clock speeds. Thank you.
> 
> Thanks!!


The Strix BIOSes are 450W/600W and 500W/600W. The 500W/600W BIOS defaults to 450W for whatever reason, which is 90%.
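For anyone double-checking that 90% figure, the slider is just the current limit over the BIOS default. A minimal sketch of the arithmetic (the 450W/500W/600W numbers are from this thread; the helper name is mine):

```python
# Power-limit sliders in Afterburner/GPU-Z are percentages of the BIOS default
# power limit, not of the hard maximum.

def power_limit_percent(current_watts: float, default_watts: float) -> float:
    """Return the slider percentage relative to the BIOS default power limit."""
    return round(current_watts / default_watts * 100, 1)

# The 500W-default / 600W-max Strix BIOS booting at 450W reads as 90%:
print(power_limit_percent(450, 500))  # 90.0
# And the slider tops out at 120% on that BIOS:
print(power_limit_percent(600, 500))  # 120.0
```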


----------



## long2905

AvengedRobix said:


> At the moment Bykski has only one waterblock for the Zotac "Apocalypse", but I can't find any info or PCB photos of that card. Zotac has two different PCBs for the Trinity and Amp Extreme... does the Apocalypse use another PCB, or is it the same as the Amp Extreme? 🤔











enough options for you yet?


----------



## Spiriva

IntelMistakes said:


> How strange. The Power Limit on my Strix 4090 is 90% if the DualBIOS switch is set to Performance Mode. If I set it to Quiet Mode, it goes up to 100%.
> 
> Does anyone know anything?
> 
> And Can you confirm if the 3DMark score is correct?
> Along with the clock speeds. Thank you.
> 
> Thanks!!


This is what my strix looks like on P-mode.


----------



## BigMack70

Anyone know when Cyberpunk is getting its DLSS 3 update? I'm keen to try that tech out in a game where I can maybe make good use of it. DLSS 3 in A Plague Tale is useless for me until I upgrade my CPU - it increases framerates when CPU limited but has awful, unplayable frame pacing issues caused by the CPU being the bottleneck in the system.


----------



## ZealotKi11er

renejr902 said:


> Please guys, can you answer the same question about my 4790K and my 4090? I have other questions too; please see my complete post on page 105. I play only at 4K ultra settings, max 120Hz.
> Thanks


4790K will hold back 4090 1000% in any resolution. I had 3770K at 4.8GHz with DDR3-2400 CL10 and was getting low GPU utilization in BFV with 2080 Ti.


----------



## rjeftw

CZonin said:


> Anyone try flashing a Trio yet? Thinking about trying the Suprim X Liquid BIOS.


Curious about this myself, but I feel like we might need the 4x8 pin plug over the 3x8 since other bios have a much higher wattage than the stock trio bios. I did however manage to get a FE on this mornings Best Buy drop; picking this up Saturday morning.


----------



## BigMack70

ZealotKi11er said:


> 4790K will hold back 4090 1000% in any resolution. I had 3770K at 4.8GHz with DDR3-2400 CL10 and was getting low GPU utilization in BFV with 2080 Ti.


I mean... could always upgrade to an 8K TV and keep the good times rollin on the old CPU


----------



## Mad Pistol

BigMack70 said:


> I mean... could always upgrade to an 8K TV and keep the good times rollin on the old CPU


At 8K, you're holding back the 4090.


----------



## dante`afk

step aside strix, there's a new king in town;






GALAX GeForce RTX™ 4090 SG 1-Click OC






www.galax.com













GALAX TecLab OC team breaks multiple world records with GeForce RTX 4090 and LN2, up to 3.45 GHz overclock - VideoCardz.com


NVIDIA RTX 4090 OC up to 3.45 GHz GALAX OC TecLab team from Brazil has been working on overclocking GeForce RTX 4090 graphics card. As NVIDIA released its flagship RTX 40 GPU, reviewers and board partners were finally allowed to share the test results featuring custom designs. One of such cards...




videocardz.com


----------



## Nizzen

dante`afk said:


> step aside strix, there's a new king in town;
> 
> 
> 
> 
> 
> 
> GALAX GeForce RTX™ 4090 SG 1-Click OC
> 
> 
> 
> 
> 
> 
> www.galax.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GALAX TecLab OC team breaks multiple world records with GeForce RTX 4090 and LN2, up to 3.45 GHz overclock - VideoCardz.com
> 
> 
> NVIDIA RTX 4090 OC up to 3.45 GHz GALAX OC TecLab team from Brazil has been working on overclocking GeForce RTX 4090 graphics card. As NVIDIA released its flagship RTX 40 GPU, reviewers and board partners were finally allowed to share the test results featuring custom designs. One of such cards...
> 
> 
> 
> 
> videocardz.com


Did they use Galax SG?

Edit: Looks like a hardware modded card


----------



## Roacoe717

Cheers,


----------



## derthballs

Has anyone with a flashed Zotac Amp Aero actually seen usage above the 485W of the original BIOS when running anything? I'm trying to figure out whether I'm being restricted by having 3 cables (one split into 2) rather than 4 dedicated PCIe cables going into it.


----------



## Equinox654

Nvflash is updated:
NVIDIA NVFlash (5.780.0) Download | TechPowerUp 

*5.780.0 (October 19th, 2022)*

Support for Ada Lovelace GeForce / RTX 4090
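With 4090 support in place, the usual flow is to back up the stock image before flashing anything. A rough sketch of the sequence (filenames are placeholders; verify the flags with `nvflash --help` before running, and remember a bad flash can brick the card):

```shell
# 1. Save the current VBIOS so you can roll back later.
nvflash --save backup.rom

# 2. Disable the on-card flash write protection.
nvflash --protectoff

# 3. Flash the new image. -6 overrides the PCI subsystem ID mismatch check,
#    which cross-vendor flashes (e.g. Strix BIOS on a TUF) will trip.
nvflash -6 newbios.rom
```

Run from an elevated prompt, and keep the backup somewhere safe.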


----------



## Xavier233

I am very interested in undervolting the 4090. Has anyone been able to undervolt it successfully for the same performance but less heat and lower GPU temps? Please post your settings or screenshots.


----------



## CZonin

rjeftw said:


> Curious about this myself, but I feel like we might need the 4x8 pin plug over the 3x8 since other bios have a much higher wattage than the stock trio bios. I did however manage to get a FE on this mornings Best Buy drop; picking this up Saturday morning.


I got a 4x8 pin extension in from CableMod yesterday. Definitely looking to test things out with a higher power limit BIOS.


----------



## ZealotKi11er

Xavier233 said:


> I am very interested in undevolting the 4090. Anyone was able to undervolt it successfully for the same performance, but less heat and GPU temps? Please post your settings or screenshots


It's very simple. By overclocking you are effectively tuning in a lower voltage at a given frequency on the curve. OC first and see how far your chip can go, then lower the power limit until you are at the same performance as stock clocks.


----------



## mariushauko1994

derthballs said:


> Has anyone with a Zotak Amp Aero who's flashed is actually showing usage of more than the 485w on the original bios when running anything? Im trying to figure out if me being restricted is having 3 (one being split into 2) rather than 4 dedicated pci-e cables going into it.


I just flashed my non-OC Trinity with the Gigabyte OC BIOS and at least it shows 600W now. Haven't seen what my highest power draw is yet.


----------



## yzonker

derthballs said:


> Has anyone with a Zotak Amp Aero who's flashed is actually showing usage of more than the 485w on the original bios when running anything? Im trying to figure out if me being restricted is having 3 (one being split into 2) rather than 4 dedicated pci-e cables going into it.


You need both a higher-PL BIOS and an adapter that supports 600W: either a 4x8-pin adapter (e.g. CableMod) or one of the adapters supplied by a PSU maker, like the one Corsair is selling. Bottom line, my understanding is that the two sense pins have to be grounded.
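For context on why grounding matters: the two 12VHPWR sideband (sense) pins encode the cable's power capability per the ATX 3.0 / PCIe 5.0 connector spec. A rough lookup of that encoding (the 600W and 150W rows are the well-known endpoints; I may have the two intermediate rows swapped, so verify against the spec before relying on them):

```python
# 12VHPWR sideband encoding: (SENSE0, SENSE1) -> advertised cable power in watts.
# "gnd" = pin tied to ground, "open" = pin left floating.
SENSE_POWER_LIMITS = {
    ("gnd", "gnd"): 600,    # both sense pins grounded: full 600 W available
    ("open", "gnd"): 450,   # intermediate rows: verify against the ATX 3.0 spec
    ("gnd", "open"): 300,
    ("open", "open"): 150,  # nothing grounded: card limits itself to 150 W
}

def cable_limit(sense0: str, sense1: str) -> int:
    """Power cap the GPU will assume for the attached cable/adapter."""
    return SENSE_POWER_LIMITS[(sense0, sense1)]

print(cable_limit("gnd", "gnd"))    # 600: what a proper 600 W adapter advertises
print(cable_limit("open", "open"))  # 150: why a flaky adapter throttles the card
```

This is why a flashed 600W BIOS alone doesn't help: if the adapter leaves a sense pin floating, the card still caps itself below the BIOS limit.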


----------



## mariushauko1994

yzonker said:


> You need both a higher PL bios and an adapter that supports 600w. Either a 4x8pin adapter, Cablemod, or one of the adapters supplied by a PSU maker like the one Corsair is selling. Bottom line is the 2 sense pins have to be grounded is my understanding.


Just did a run now. According to Afterburner it topped out at about 520W in Port Royal, at around 3000-3030MHz.


----------



## mirkendargen

mariushauko1994 said:


> Just did a run now. And according to afterburner it topped at about 520w in Port royal at around 3000-3030mhz.


1.05v or 1.1v?


----------



## Xavier233

ZealotKi11er said:


> Its very simple. By overclocking you are technically tunning lower voltage at set frequency in the curve. You can OC first and see how far your chip can go and then lower the power level until you are at same perf as stock clocks.


Yes, I know the principles of it, but I'm looking for a good configuration that works on the 4090: undervolt at which voltage, and across which clock range? There aren't many YouTube how-tos on undervolting the 4090 yet that I can find.


----------



## yt93900

Now we're talking, maybe it will make that 360mm on the Waterforce finally worthwhile!









Edit: Hold up, the Waterforce Xtreme has no dual BIOS but the el cheapo Gaming OC does?!
Well, with such a big radiator a quiet vs. OC BIOS wouldn't make much sense, but still.


----------



## Carillo

So, I got the Strix OC today. I only got to run some quick tests before I left for work, but from what I could tell it clearly has the best silicon of the 3 cards: 3105MHz in Port Royal with a 78-degree hot spot (yes, my office was way hotter today). The memory kept scaling past +1600, and I did run Time Spy with a 3150MHz core. All testing was done at 1100mV. I did not save or post any scores, as I ran those tests with either the mem or core OC, not both at the same time. Anyway, those are my findings so far.


----------



## derthballs

mariushauko1994 said:


> i just flashed my trinity non oc with the gigabyte OC bios and at least it shows with 600w. Havent seen what my highest powerdraw is yet


This is what I did, but my power draw still sticks at 485W.


----------



## cletus-cassidy

RetroWave78 said:


> For anyone with a non-OC Tuf pre-ordered, I have confirmation with an owner that it has 600w on tap. The only difference is 40 MHz as Tech Yes City points out below, 40 MHz that can be attained in MSI AB. If you paid $1800 for the OC version consider yourself lucky given the scalped prices. The rest of us will be waiting for pre-order fulfillment which could be weeks, a month or more.


Thanks for this. I bought a Tuf OC at launch on Newegg, mostly because I seem to check out slowly (I had a bunch of gift cards) and was afraid the non-OC Tuf would sell out while I was halfway through check out. Still, I was feeling like a chump paying an extra $200 for effectively nothing, so maybe it wasn't such a terrible choice after all.


----------



## yt93900

Can't really use the Gaming OC VBIOS on the Waterforce because the G-OC has Fan Stop, GRRRR. Would have to set custom fan curves.


----------



## Flumpenlicht

CZonin said:


> Anyone try flashing a Trio yet? Thinking about trying the Suprim X Liquid BIOS.


Just did it. Flashed the MSI Suprim x liquid BIOS to my Trio. 
Power Limit is now at 530 W with the 3x 8 pin.


----------



## J7SC

dante`afk said:


> step aside strix, there's a new king in town;
> 
> 
> 
> 
> 
> 
> GALAX GeForce RTX™ 4090 SG 1-Click OC
> 
> 
> 
> 
> 
> 
> www.galax.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GALAX TecLab OC team breaks multiple world records with GeForce RTX 4090 and LN2, up to 3.45 GHz overclock - VideoCardz.com
> 
> 
> NVIDIA RTX 4090 OC up to 3.45 GHz GALAX OC TecLab team from Brazil has been working on overclocking GeForce RTX 4090 graphics card. As NVIDIA released its flagship RTX 40 GPU, reviewers and board partners were finally allowed to share the test results featuring custom designs. One of such cards...
> 
> 
> 
> 
> videocardz.com


If only I could get my Gig-GOC down to -146.4 C...🥶


----------



## yt93900

Waterforce Xtreme 360mm w. Gaming OC VBIOS, slider at 133%:









Couldn't get it to power throttle in Port Royal, needed Quake II RTX for it....
So far the VBIOS flash has not really been worth it because of the bad fan controls, only viable for 3Dmark runs etc. Will do some benches and go back to the OG bios.


----------



## jim2point0

Roacoe717 said:


> Cheers,
> View attachment 2576827


Did you have some kind of alert set? I have no idea when these are going to pop up.


----------



## PisaroOne

Azulath said:


> Well, this does seem more in line with Jay's problem than mine. At least I'm assuming that sine the card does not even light up.
> 
> However, I'm curious if this is some issue with the Gigabyte MB, we are both on AM4 after all....


So I bought a second 4090, a Zotac 4090 Trinity. That card works perfectly, so I guess my Gainward 4090 was broken.


----------



## CZonin

Okay, so I flashed this Suprim X Liquid BIOS (link) on my Trio (non-X) with a 4x8-pin extension from CableMod.


Stock BIOS: 106% power limit / +1250 mem / +90 core = 26300 in PR
Suprim X Liquid BIOS: 110% power limit / +1100 mem / +50 core = 27000 in PR
Will do some more testing later, but so far so good.


----------



## Stray..

Hi, I'm currently on a Gigabyte Windforce 4090 and I'm not used to flashing BIOSes. I'm just wondering whether it would be safe to flash the Strix BIOS without risking (not sure if that's the right word) bricking my card.


----------



## 8472

FE cards have been popping up on bestbuy's site for the past few hours if anyone is interested.


----------



## Mad Pistol

Stray.. said:


> Hi, im currently on a Gygabyte Windforce 4090, im not used to flashing bios. Im just wondering if it would be safe to flash it for the strix bios without having to risk (not sure if its the right word to use) bricking my card.


Be careful. The Windforce has a much lower VRM setup than the STRIX, so you might risk destroying the card.


----------



## Mad Pistol

8472 said:


> FE cards have been popping up on bestbuy's site for the past few hours if anyone is interested.


Currently in line for one. We will see if I get lucky.

EDIT: as soon as I posted that, sold out.


----------



## yt93900

I think we can safely say the 5090 will be $2750 at launch, looking at how these still sell like hotcakes even without miners.


----------



## Stray..

Mad Pistol said:


> Be careful. The Windforce has a much lower VRM setup than the STRIX, so you might risk destroying the card.


Alright, thanks for letting me know. I won't risk destroying it; I was just checking whether or not it would've been safe. Not gonna take any chances, that's for sure lol. It's a pain as it is to find a 4090 in stock where I live (Canada).


----------



## bmagnien

Which strix bios are people using. The one from the TPU list?


----------



## Stray..

bmagnien said:


> Which strix bios are people using. The one from the TPU list?


Yes, correct.


----------



## BigMack70

yt93900 said:


> I think we can safely say the 5090 will be $2750 at launch looking how they still sell like hotcakes even without miners.


Maybe. My guess is that it will depend on how well the 4090 sells outside of the launch window. It's normal for new top tier GPUs at almost any price to sell out and be hard to buy for a couple weeks. It's not normal that those GPUs remain sold out for two years and only available at prices far above MSRP, which was the situation crypto mining created. I expect demand for the 4090 to be weak at $1600+ by early next year. Obviously could be wrong, but if stock is plentiful and sales volume low at $1600 over the life of the card, I wouldn't expect a dramatic price increase for the next series. 

I don't think we'll ever see a top tier GPU launch below $1k, though. Those days are clearly done, and Nvidia very much views the $700 3080 as a colossal mistake.


----------



## bezerKa

Yeah, once the hype days die down, the dream scalpers realise they can't sell them like the 30 series, and more become available on the market (it's easy to pick up a 4090 right now tbh), the FEs will be more readily available. Once the new AMD cards and the 4080/4070 come out, the 40-series FEs will be easy enough to get; my guess is Jan/Feb next year.


----------



## xcx xcxvgyt

I scored 29 633 in Port Royal


AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





Hi everyone, any idea why this is not listed on the 3DMark list?


----------



## yt93900

Seems fishy, 1400 points above mine and lower clocks, pretty much.








Result not found







www.3dmark.com


----------



## bezerKa

xcx xcxvgyt said:


> I scored 29 633 in Port Royal
> 
> 
> AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Hi everyone, why this is not listed on 3dmark list any idea?


Try deleting the score and uploading it again.


----------



## bezerKa

yt93900 said:


> Seems fishy, 1400 points above mine and lower clocks, pretty much.


Know what you mean, Seen a couple like that now....


----------



## Nd4spdvn

xcx xcxvgyt said:


> Hi everyone, why this is not listed on 3dmark list any idea?


It is, you just want to draw our attention. And beat my no4 score, lol.


----------



## Carillo

Delete


----------



## xcx xcxvgyt

bezerKa said:


> Try deleting score, and try upload again.


Thank you it worked


----------



## J7SC

I wouldn't want to speculate on pricing of next-gen top-end cards coming out in ~ 2024...too many industry factors (competition from AMD, Intel) and other external ones like inflation, war, pestilence, asteroids...

Also, AMD has been making great strides in areas related to high-end enterprise cards (AI) as of late and that will also impact on technologies applied to 'gamer' cards. Then there is their multichip integration lead. Seems reasonable to expect that competition will be fierce.

I have been watching the local retailer where several of us here (including myself) purchased their 4090 Giga-Gaming-OC...I knew their initial stock number, and it sold out fairly briskly. Early days, but restocking a hot seller seems a bit slow, IMO. Whether supply channel or transport issues, retailers may have been cautious on their pre-release inventory orders for 4090s after the RTX3k inventory debacles, and a more uncertain economic outlook.


----------



## Baasha

was in line at BB multiple times only to have it say "SOLD OUT" after like 5 - 10 minutes. This is infuriating.


----------



## yt93900

bezerKa said:


> Know what you mean, Seen a couple like that now....


It's the 4th score in the Hall of Fame at the moment. I'd say impossible, unless someone found a loophole to boost scores by tweaking the drivers; nr. 23 on the list is even below 3GHz core on his 4090. Almost certainly driver tweaking though: I moved the quality slider to max performance and got another +100pts.








I scored 28 299 in Port Royal


Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## stargit

derthballs said:


> Has anyone with a Zotak Amp Aero who's flashed is actually showing usage of more than the 485w on the original bios when running anything? Im trying to figure out if me being restricted is having 3 (one being split into 2) rather than 4 dedicated pci-e cables going into it.



My Amp never reaches 485W.
440W is the highest I have seen


----------



## BigMack70

Baasha said:


> was in line at BB multiple times only to have it say "SOLD OUT" after like 5 - 10 minutes. This is infuriating.


I really don't understand Nvidia's decision to gate their FE cards behind Best Buy's unreliable website/storefront. Nvidia is clearly attempting to copy Apple, where they provide a product which they have controlled from top to bottom with the intent to provide what they believe is the best possible experience to the end user. However, they have made the FE model the most infuriating experience to attempt to purchase of almost any 4090. Doesn't make sense to me.


----------



## xcx xcxvgyt

yt93900 said:


> It's a 4th score on Hall of Fame at the moment, I'd say impossible unless someone found a loophole to boost the scores by tweaking the drivers, nr.23 on the list is even below 3GHz core on his 4090.


I was the one who beat the 60k Fire Strike score with a 6800 XT two years ago...

So I must be very, very skilled at finding loopholes, or I'm just good at OC.

I needed to short the 12VHPWR sense pins to get this score, because I have a 3x8-pin connector, and this is only the beginning of the tweaking.

So it's not just... whatever you said.


----------



## yt93900

Your clocks are lower than the other guys', and using a tweaked 3x8-pin adapter makes it even less believable, to be frank.
It sounds impossible to have 1400pts of GPU difference at the same clocks unless PR scoring is currently bugged.


----------



## bezerKa

yt93900 said:


> It's a 4th score on Hall of Fame at the moment, I'd say impossible unless someone found a loophole to boost the scores by tweaking the drivers, nr.23 on the list is even below 3GHz core on his 4090. Almost certainly driver tweaking though, I moved the quality slider to max performance and got another +100pts
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 299 in Port Royal
> 
> 
> Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


yeah definitely some weird scores being uploaded


----------



## bezerKa

People may be finding some unique methods, who knows. Score still says valid I guess


----------



## spin5000

Arizor said:


> Using DLSS3 exclusively (i.e. DLSS2 upscaling turned off, only Frame Generation on) I get a _big boost_, almost doubling my frames from native 4K in some instances.


So you're saying that DLSS 3.0 doesn't truly exist but rather it's just an "umbrella term" meaning DLSS 2.0 + fake frame insertion feature enabled?

Maybe I misunderstood a few review videos I've seen over the past week or two, but I always thought DLSS 3.0 (like 2.0 and 1.0, and like FSR 2.0 and 1.0) is the actual upscaling method itself, while the new fake-frame-insertion feature is simply an added option that can be enabled _in addition to_ DLSS (but only when using DLSS 3.0). In other words, regardless of the frame-insertion feature, I thought the actual DLSS itself had been further developed, just as it was going from DLSS 1.0 to 2.0 and FSR 1.0 to 2.0. Is that not the case? Is the upscaling in "DLSS 3.0" still actually just DLSS 2.0?


----------



## BigMack70

spin5000 said:


> You're saying that DLSS 3 is not any sort of upscaling mode, and that it's purely the new fake-frame-insertion feature, with the upscaling itself being DLSS 2.0...
> 
> Maybe I misunderstood a few review videos I've seen in the past week or two, but I always thought DLSS 3.0, like 2.0 and 1.0, is the actual upscaling method itself, while the new fake frame insertion is simply an added option that can only be enabled when using the 3.0 version.


DLSS 3 has different toggles which can be turned on or off independently of one another. So you can run DLSS frame generation either with or without DLSS 2.0 upscaling, which has been re-labeled "DLSS Super Resolution".


----------



## derthballs

stargit said:


> My Amp never reaches 485W.
> 440W is the highest I have seen


Have you flashed your BIOS though? I'm hoping someone who has flashed can actually see a bigger power draw. I can flash to the Gigabyte BIOS and it shows as 600W in GPU-Z, but I still don't draw more than 485W in Port Royal.

I've a new cable on the way so I can test with 4 independent cables rather than 3 with one shared for the 4th, to see if that then draws more.


----------



## spin5000

BigMack70 said:


> DLSS 3 has different toggles which can be on or off independent of one another. So you can run DLSS frame generation either with or without DLSS 2.0 upscaling, which has been re-labeled 'DLSS Super Resolution".


Yes, "toggles", i.e. extra DLSS features, like Frame Generation.

I'm not talking about extra DLSS features; I'm talking about the actual DLSS itself. Going from DLSS 1.0 to 2.0, we got improved framerates at the same image quality, or improved image quality at the same framerates, due to improvements in the down/up-sampling (same for FSR 2.0 over 1.0). So, besides a feature (or "toggle"), has the DLSS upscaling ITSELF undergone any improvements in DLSS 3.0? I'm not talking about features or "toggles" we can add _alongside_ it; I'm talking about the upscaling system itself. If the answer is no, then "DLSS 3.0" is quite a deceptive term, as it's actually just DLSS 2.0 with extra features (like Frame Generation) that can be enabled alongside it.


----------



## yzonker

J7SC said:


> If only I could get my Gig-GOC down to -146.4 C...🥶


You can. There's even blocks (pots) available for it probably. Glad I could help.


----------



## CZonin

CZonin said:


> Okay so flashed this Suprim X Liquid BIOS (link) on my Trio (non-X) with a 4x8pin extension from CableMod.
> 
> 
> Stock BIOS: 106% power limit / +1250 mem / +90 core = 26300 in PR
> Suprim X Liquid BIOS: 110% power limit / +1100 mem / +50 core = 2700 in PR
> Will do some more testing later, but so far so good.


I ended up having some issues controlling the fans which makes sense in hindsight. Trying out the Suprim X BIOS now and my fan issues are gone. Going to work on my OC now.


----------



## J7SC

yzonker said:


> You can. There's even blocks (pots) available for it probably. Glad I could help.


...I guess I'll start small - by Friday our long-awaited rain and temp drops are forecast...in fact, over the next few weeks it might turn into an atmospheric river again - after a long rainless period...I won't be able to hit the lows as this is a steel-concrete high-rise, but still, a drop in ambient of ~ 8 C might be achievable 











...I am tempted though to use this oldie GPU pot, but the VRAM would be an issue re. space


----------



## Baasha

BigMack70 said:


> I really don't understand Nvidia's decision to gate their FE cards behind Best Buy's unreliable website/storefront. Nvidia is clearly attempting to copy Apple, where they provide a product which they have controlled from top to bottom with the intent to provide what they believe is the best possible experience to the end user. However, they have made the FE model the most infuriating experience to attempt to purchase of almost any 4090. Doesn't make sense to me.


The 2080 Ti & Titan RTX were sold directly through Nvidia's website which worked great - Best Buy is an absolute joke and although it worked for a few, it hasn't worked for most. I was "in line" multiple times on launch day and today and every single time it sat there spinning and then went to 'Sold Out.'

How are we ever going to get the cards at MSRP if they allow bots to buy them within 10 seconds? Some guy on here said he got the GFE link and bought the 4090 FE and said he "will gladly sell it to me" but then ghosted me completely.  

I am just not sure what to do at this point - I waited a whole month to get my first 3090s and it was painful - I had to go through a friend of a friend to get them. The process is downright ridiculous it's taken all the fun out of it for me. I was so looking forward to streaming with my new 4090s and now who knows when I'll get them. 

I probably should find another hobby. I even offered people $100 to buy one for me and no takers. Nobody is willing to help and it's aggravating. 

/rant


----------



## BigMack70

Baasha said:


> The 2080 Ti & Titan RTX were sold directly through Nvidia's website which worked great - Best Buy is an absolute joke and although it worked for a few, it hasn't worked for most. I was "in line" multiple times on launch day and today and every single time it sat there spinning and then went to 'Sold Out.'
> 
> How are we ever going to get the cards at MSRP if they allow bots to buy them within 10 seconds? Some guy on here said he got the GFE link and bought the 4090 FE and said he "will gladly sell it to me" but then ghosted me completely.
> 
> I am just not sure what to do at this point - I waited a whole month to get my first 3090s and it was painful - I had to go through a friend of a friend to get them. The process is downright ridiculous it's taken all the fun out of it for me. I was so looking forward to streaming with my new 4090s and now who knows when I'll get them.
> 
> I probably should find another hobby. I even offered people $100 to buy one for me and no takers. Nobody is willing to help and it's aggravating.
> 
> /rant


Yup. I got 1080 Ti and 2080 Ti cards direct from Nvidia right at launch. I got a 3090 on the first restock a week or two after launch direct from Nvidia before they stopped selling direct. Great experiences with each of those purchases.

I didn't even have Best Buy loaded in my browser for this launch to try for a FE card, even though an FE card would have been my preference since I like the aesthetic design. The BB website is absolutely atrocious. I just used Newegg and grabbed the first $1600 model I could.


----------



## rjeftw

Just gotta stay on the grind fellas; I managed to grab a MSI Trio launch day at Best Buy then a FE today. I remember it took me about 42 days to nab my 3080FE after that launch.


----------



## originxt

Managed to snag a 4090 Fe today due for pick up on 10/25, massive drop and the purchase was extremely smooth other than 2 fraud blocks from 2 different credit cards lol. Retro needs to get off the internet and go touch some grass lol.


----------



## LunaP

For anyone who got their GeForce Experience notifications: was it just the FE, or any card in general that Best Buy happened to be carrying near you? Waiting patiently while I read reviews on blocks and which works with what, since I'm aiming for copper types (chromium if not).

Seems like the Gigabyte OC, TUF, and FE are the best bets atm for that.


----------



## sblantipodi

What is your score in 4k optimized?


----------



## yzonker

J7SC said:


> ...I guess I'll start small - by Friday our long-awaited rain and temp drops are forecast...in fact, over the next few weeks it might turn into an atmospheric river again - after a long rainless period...I won't be able to hit the lows as this is a steel-concrete high-rise, but still, a drop in ambient of ~ 8 C might be achievable
> 
> View attachment 2576860
> 
> 
> 
> ...I am tempted though to use this oldie GPU pot, but the VRAM would be an issue re. space
> View attachment 2576861


BTW, I do have a Bykski block on the way hopefully. Just ordered it today. Coming from China of course.


----------



## RetroWave78

Mad Pistol said:


> Yea. This is not the proper place to post conspiracy theories.


A video of an nvidia rep in an earnings conference stating they are going to constrict Lovelace supply (that's what reduce sell in means) to "let channel inventory correct" (translation: to sell 30 series overstock) is a "conspiracy theory"?

Care to qualify that assertion?

Go ahead, I'll wait here.

I mean if we can't have a civil debate about a topic without throwing around baseless epithets then we are indeed truly gone.


----------



## stahlhart

bmagnien said:


> 🙄 I can only report so many times guys, gonna need some help with this one.


Easier to just add them to the ignore list, like I did.


----------



## Nizzen

50+ confirmed in stock....


https://www.komplett.no/product/1218883/datautstyr/pc-komponenter/skjermkort/gainward-geforce-rtx-4090-phantom


People that don't have 4090, don't want it enough LOL

😇


----------



## J7SC

yzonker said:


> BTW, I do have a Bykski block on the way hopefully. Just ordered it today. Coming from China of course.


...so far, I haven't seen one for the Gaming OC / Master (other than a pre-order at EKWB); hoping for a Phanteks, Bykski, or Alphacool block soon...

btw, check out the prices for the EK Strix block and related per spoiler... 


Spoiler


----------



## mariushauko1994

derthballs said:


> Have you flashed your bios though? Im hoping someone who has flashed can actually see a bigger power draw, i can flash to the gigabyte and it shows as 600w in gpu-z but i still dont draw more than 485w in port royale.
> 
> Ive a new cable on the way so i can test with 4 independent cables rather than 3 with one shared for the 4th to see if that then draws more.


I just flashed my Zotac Trinity (3x 8-pin) with the Gigabyte OC BIOS and saw 540W and over in Afterburner. I'm not sure that's the actual power draw, though. Either way, my Port Royal score went up a lot from yesterday's runs at the 450W limit.


----------



## gerardfraser

Xavier233 said:


> Yes I know the principles of it, but am looking for like a good configuration that works on the 4090: undervolt at which voltage, and across which clock range works best? There are not many youtube how-tos on 4090s yet for undervolting that I can find


RTX 4090 undervolt curve: 2790MHz core, +1000MHz memory.

Scorn, a unique puzzle game. Undervolt voltages are shown at the start of the video.
To bring up the GPU voltage/frequency curve, open the curve editor or hit Ctrl+F in MSI Afterburner.

Of course you can shape the curve any way you want.
Settings shown in the video:
950mV at 2805MHz
810mV at 2220MHz
735mV at 1980MHz
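Those three points describe a simple piecewise-linear voltage/frequency curve. Just as an illustration of the shape (plain interpolation over the posted values; `clock_at` is a made-up helper for sanity-checking, not anything from Afterburner):

```python
# Piecewise-linear interpolation along the V/F points posted above.
# The (mV, MHz) pairs come from the post; the function itself is illustrative.
CURVE = [(735, 1980), (810, 2220), (950, 2805)]  # ascending by voltage

def clock_at(mv: float) -> float:
    """Return the target core clock (MHz) for a given voltage (mV)."""
    if mv <= CURVE[0][0]:
        return CURVE[0][1]
    if mv >= CURVE[-1][0]:
        # an Afterburner-style undervolt flattens the curve past the top point
        return CURVE[-1][1]
    for (v0, c0), (v1, c1) in zip(CURVE, CURVE[1:]):
        if v0 <= mv <= v1:
            t = (mv - v0) / (v1 - v0)
            return c0 + t * (c1 - c0)

print(clock_at(810))   # → 2220.0 (one of the posted points)
print(clock_at(880))   # → 2512.5 (halfway between 810mV and 950mV)
print(clock_at(1000))  # → 2805 (clamped at the top of the curve)
```

In Afterburner you'd drag the points in the Ctrl+F editor instead; this is only a quick way to see where an intermediate voltage would land on a curve like the one posted.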


----------



## MrTOOSHORT

Picking up a Gigabyte OC 4090 today, excited!

Have to work tonight, so I'll install it tomorrow.


----------



## Sheyster

Baasha said:


> The 2080 Ti & Titan RTX were sold directly through Nvidia's website which worked great - Best Buy is an absolute joke and although it worked for a few, it hasn't worked for most. I was "in line" multiple times on launch day and today and every single time it sat there spinning and then went to 'Sold Out.'
> 
> How are we ever going to get the cards at MSRP if they allow bots to buy them within 10 seconds? Some guy on here said he got the GFE link and bought the 4090 FE and said he "will gladly sell it to me" but then ghosted me completely.
> 
> I am just not sure what to do at this point - I waited a whole month to get my first 3090s and it was painful - I had to go through a friend of a friend to get them. The process is downright ridiculous it's taken all the fun out of it for me. I was so looking forward to streaming with my new 4090s and now who knows when I'll get them.
> 
> I probably should find another hobby. I even offered people $100 to buy one for me and no takers. Nobody is willing to help and it's aggravating.
> 
> /rant


You're not alone, my friend. It's very frustrating indeed. I would take the Strix at $2K + tax if one were available, but I won't pay a scalper.


----------



## renejr902

Can someone post the average fps of a fully overclocked 4090 in Cyberpunk's built-in benchmark at ultra maximum graphics with Psycho ray tracing, NO DLSS? Thanks. I found only one review benchmark that uses Psycho RT; the average was 38.7 fps. (With Ultra RT instead of Psycho it's 43 fps on the Strix OC, 43 with the MSI Suprim X, 41 with the Gigabyte OC, and 46 with the Suprim X overclocked; 46 is not too bad.) I will maybe play the game with no DLSS; I'm curious how many fps we can get. Post the 1% minimum fps too if possible, and check how many watts you draw: I saw around 550W during some runs, so it's a good benchmark for testing power. Thanks guys, it's very appreciated.


----------



## dr/owned

4090 delivers tomorrow from Amazon (the original estimate was around November 3rd). Shoutout to tax-free and 5% cash back.

But tomorrow I'm heading out of state on vacation for the next two weeks, so it's going to sit in the box waiting for daddy to come back and molest it  Maybe it's a good thing that I get two more weeks to wait for waterblocks to show up for MSI stuff. I'm probably going to shunt it, probe to see if the header holes connect to SDA and SCL for the EVC, and see if I can blow up the VRM, since it's half as good as the Suprim's. Amazon has a 3-month return window right now because of the holidays. This also reminds me I need to order a new EVC2, because mine somehow got into a broken state where it won't let me flash new firmware and doesn't show up over USB.


----------



## Mad Pistol

renejr902 said:


> Can someone post the average fps with a fully overclocked 4090 with the cyberpunk benchmark included in the game at ultra maximum graphics with psycho ray tracing NO DLSS ? Thanks. I found only one benchmark of a review that use psycho RT.The average was 38.7 fps. ( with ultra RT instead of psycho 43 fps on strix oc and 43 with msi suprim x and 41 with gigabyte oc and 46 if overclocked with msi suprim x, 46 is not too bad ) I will maybe play the game with no dlss, im curious, how much fps we can have. Tell us the minimum fps 1% too if possible. Check how many watts you have i saw around 550 watts during some benchmark, its a good benchmark to test watts. Thanks guys, its very appreciated


That's with a +210/+1100 Core/memory overclock on a Gigabyte Windforce 4090.

GPU says 480-watts.
Power at the wall (UPS) says 630-watts total system draw.
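For what it's worth, the gap between those two readings is roughly rest-of-system draw plus PSU conversion loss. A back-of-envelope split, assuming a ~90% efficient PSU at this load (the efficiency figure is an assumption for illustration, not a measurement):

```python
# Back-of-envelope split of a wall-power reading, using the numbers above.
# The 0.90 PSU efficiency is an assumed 80 Plus Gold-ish value, not measured.
wall_w = 630           # total draw reported by the UPS
gpu_w = 480            # draw reported by the GPU itself
psu_efficiency = 0.90  # assumed efficiency at this load

dc_load_w = wall_w * psu_efficiency    # power actually delivered to components
rest_of_system_w = dc_load_w - gpu_w   # CPU, board, RAM, fans, drives
psu_loss_w = wall_w - dc_load_w        # dissipated by the PSU as heat

print(f"DC load: {dc_load_w:.0f} W")                # prints "DC load: 567 W"
print(f"Rest of system: {rest_of_system_w:.0f} W")  # prints "Rest of system: 87 W"
print(f"PSU loss: {psu_loss_w:.0f} W")              # prints "PSU loss: 63 W"
```

So ~87W for everything besides the GPU is plausible for a gaming load, which is why wall readings always look scarier than the GPU's own sensor.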


----------



## Baasha

Have you guys tried Metro Exodus Enhanced Edition maxed out with RT? That should pull close to 600W iirc. Guys with custom BIOS on the 3090 Ti were pulling ~ 580W in that game.


----------



## renejr902

Mad Pistol said:


> That's with a +210/+1100 Core/memory overclock on a Gigabyte Windforce 4090.
> 
> GPU says 480-watts.
> Power at the wall (UPS) says 630-watts total system draw.
> 
> 
> View attachment 2576892


Thanks a lot, great result 

I suppose you can get 46 fps with Ultra RT instead of Psycho, like the overclocked Suprim X. If you have time, can you check that too? I'm curious; it seems all cards have around the same performance when fully overclocked.


----------



## gerardfraser

renejr902 said:


> Can someone post the average fps with a fully overclocked 4090 with the cyberpunk benchmark included in the game at ultra maximum graphics with psycho ray tracing NO DLSS ? Thanks guys, its very appreciated


That is the FPS you get with the 4090 at the settings you asked for. Of course some nut will swear he gets 100 FPS at the same settings with his magic 12900K and magic 4090


----------



## Jordyn

Baasha said:


> Have you guys tried Metro Exodus Enhanced Edition maxed out with RT? That should pull close to 600W iirc. Guys with custom BIOS on the 3090 Ti were pulling ~ 580W in that game.


Yeah, it was bouncing between ~550W and ~570W on the Giga Gaming OC. The most I have seen in a gaming scenario so far.


----------



## Mad Pistol

renejr902 said:


> Thanks a lot, great result
> 
> I suppose you can have 46 fps with ultra RT instead of psycho like the suprimx overclocked, if you have time you can check it too im curious too, it seems all cards have around the same performance when fully overclocked.


I just play the game with DLSS Quality, and that punches the framerate up to 60-70 FPS average. For max settings, it's absolutely spectacular.


----------



## renejr902

gerardfraser said:


> That is the FPS you get with 4090 with the settings you asked for.Of course some nut will swear he gets 100FPS on the same settings with his mafic 12900k and magic 4090


Thanks a lot, great result too; you have a better min fps than Mad Pistol 

I suppose you can get 46 fps with Ultra RT instead of Psycho, like the overclocked Suprim X. If you and Mad Pistol have time, can you check that too? I'm curious; it seems all cards have around the same performance when fully overclocked. I read all the pages of this topic, but I can't remember: which card model do you have?


----------



## mirkendargen

I wouldn't personally buy a Corsair block, but throwing this out there for anyone that may feel differently



https://www.corsair.com/us/en/Categories/Products/Custom-Cooling/Blocks/GPU-Blocks/Hydro-X-Series-XG7-RGB-40-SERIES-GPU-Water-Block/p/CX-9020019-WW


----------



## renejr902

Mad Pistol said:


> I just play the game with DLSS Quality, and that punches the framerate up to 60-70 FPS average. For max settings, it's absolutely spectacular.


Yeah, I played it a few hours at max settings with my EVGA 3090 FTW Ultra and it looks awesome, but not a very good framerate even with DLSS; I suppose my 4790K doesn't help me, lol. I will upgrade the CPU and motherboard before January. But I will buy a 4090 as soon as I find one; I'll try for the Asus Strix if I find one soon enough


----------



## Antsu

Just won a raffle to be able to buy the base model non-OC TUF for 1999€ (1612€ without tax) from a local shop. Kind of disgusting that you need to get lucky to be able to buy the base model at MSRP, but I will not complain since I did indeed "get lucky" (I have spent a ton of money in that shop, don't know how "random" the raffle actually was, but again, I won't complain )
Now I need to start bugging Aquacomputer about making a block ASAP, hehe. 
I was kind of leaning toward the Strix, but now the difference is 600€, so it's impossible to justify unless I win the lottery. Hope the TUF isn't a complete dud.
E: Already sent Aquacomputer an e-mail asking if they will even make the block, and if so, could they give some sort of an ETA. I'll update you guys when they respond.


----------



## Mad Pistol

renejr902 said:


> Thanks a lot, great result too, you have a better min fps than Mad Pistol
> 
> I suppose you can have 46 fps with ultra RT instead of psycho like the suprimx overclocked, if you have time you and Mad Pistol can check it too im curious too, it seems all cards have around the same performance when fully overclocked. I read all the pages of the topic, but i cant remember, which card model do you have ?


First run must have had some sort of stutter. This time I got a much better min fps.


----------



## AdamK47

Gaming Trio maxes out at 480W. I flashed mine to the Suprim BIOS; works just fine. Getting 520W now. Don't want/need any more than that. The fan profile is much more aggressive; I'll get used to the added noise.


----------



## AdamK47

One thing... could someone post the Gaming Trio's stock BIOS? I got too eager and flashed to the Suprim BIOS before saving the original.


----------



## ReightNineSeven

LuckyImperial said:


> The adapter offers 450W/600W depending on which sense pins are getting signals, so the fact that your capped at 350W doesn't lead me to believe it's the adapter that's causing you issues.
> 
> This new style adapter is bitter sweet. It's cool they offer the adapter, and they built some logic into it, but it sucks having to fight through teething issues. I'm sure a lot of people would rather just have four 8pin headers on the card.
> 
> I'll be watching what your issue for a resolution. It seems like a BIOS problem to me, or a VRM issue or something.
> 
> I'm still trying to get a hold of a MSI Liquid X for my SFF build. Crossing my fingers for some new inventory to start flowing in this week.


Ended up getting this resolved today. Shortly after my last post, the PC shut down and would no longer POST. Replaced the PSU; no change. Ended up troubleshooting it to an issue with either the CPU or memory. Swapped out the CPU and RAM and the PC was up and running again. Initial tests now seem to show the correct TDP.


----------



## schoolofmonkey

So, for those wondering how older CPUs affect the performance of the 4090: I'm currently running a 10900K, my Galax 4090 should be here tomorrow, and I've also pre-ordered a 13900K/Hero/6000MHz DDR5. I'll run some benchmarks over the weekend, then when the 13900K shows up, re-run the same benchmarks on the same OS drive I used with the 10900K so you can compare.

Just give me a list of games you're interested in seeing. I don't own MS Flight Simulator, but it's a given that it gets an FPS boost from the CPU...


----------



## renejr902

Mad Pistol said:


> First run must have had some sort of stutter. This time I got a much better min fps.
> 
> View attachment 2576913


Yeah, much better min fps; 31.5 is good enough to at least be playable. I know you may be busy, but if you have time, can you try the same benchmark at ultra maximum settings but with Ultra ray tracing instead of Psycho? I'm curious how much more fps you can get. I saw one review benchmark where a fully overclocked Suprim X got 46 fps average with Ultra ray tracing. Thanks a lot for everything; it's great fun and much appreciated.


----------



## LuckyImperial

ReightNineSeven said:


> Ended up getting this resolved today. Shortly after my last post the PC shutdown and would no longer POST. Replaced PSU and no change. Ended up troubleshooting it to being an issue with either the CPU or memory. Swapped out the CPU and RAM and PC was up and running again. Initial tests now seem to show TDP correct.



Interesting resolution. It's curious to me that defective CPU or RAM hardware would influence what the GPU can achieve. Computers are intricate though... maybe a PCIe lane was dying on the CPU and something wasn't being passed through correctly. I can only speculate.


----------



## N19htmare666

yt93900 said:


> Waterforce Xtreme 360mm w. Gaming OC VBIOS, slider at 133%:
> View attachment 2576839
> 
> 
> Couldn't get it to power throttle in Port Royal, needed Quake II RTX for it....
> So far the VBIOS flash has not really been worth it because of the bad fan controls, only viable for 3Dmark runs etc. Will do some benches and go back to the OG bios.


Were you able to get a custom fan curve to fix it?
Have you thought about the 600W MSI Liquid BIOS, or the 630W Neptune (also a 360 AIO) BIOS?

Wondering if there is still hope of getting past the 500W Waterforce limit...


----------



## N19htmare666

zhrooms said:


> It's 600W.. the "early reviewer" BIOS uploaded by TPU is not retail BIOS, Liquid X ships to consumers with a 600W BIOS
> 95.02.18.00.C4 (Build Date 2022-09-01) PG139 SKU 330 VGA BIOS MSINV510MH.290 450/600W
> I have multiple people confirming by sharing screenshots of retail Liquid X cards, whatever is shown in the original post is correct


You also have the Waterforce down as 600W, but it seems that it's actually 500W?


----------



## LukeOverHere

schoolofmonkey said:


> So for those who are wondering how older CPU's effect the performance of the 4090, I am currently running a 10900k, my Galax 4090 should be here tomorrow, I have also pre-ordered a 13900k/Hero/6000Mhz DDR5, so what I'll do is run some benchmarks over the weekend, then when the 13900k shows up re run the same benchmarks on the same OS drive I used for the 10900k so you can compare.
> 
> Just give me a list of games you're interested in seeing, I don't own MS Flight Simulator, but that's a given it has a FPS boost based off CPU...


I’m running an 8700k on a dedicated 4K Setup, so this will be very interesting.


----------



## KillerBee33

LukeOverHere said:


> I’m running an 8700k on a dedicated 4K Setup, so this will be very interesting.


TimeSpy runs 10900K vs 12900K with 4090's


----------



## yzonker

Interesting. I did some Speed Way runs and found that core clock does very little to improve the score; it's almost completely memory-bandwidth limited.

I did almost break 11k by bumping up my CPU/memory speed, so it's slightly dependent on that.









I scored 10 997 in Speed Way


Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





But like others are noticing, no clue how to get 1000 points more with nearly the same clocks, looking at the current HOF leader.









I scored 11 992 in Speed Way


Intel Core i9-10900K Processor, NVIDIA GeForce RTX 4090 x 1, 32646 MB, 64-bit Windows 11




www.3dmark.com


----------



## bmagnien

I wonder if Alphacool is going to comment on the "reference" waterblock they were promoting and taking preorders for for weeks before launch, now that there's still not a single reference card except the Inno3D cards they literally partnered with to provide the built-in blocks for. I questioned them on their forums and Twitter about selling a block with zero confirmed compatibility, and they were super defensive, essentially saying there are always reference cards.

Also looks like they're now coming out with Suprim, Palit, and FE blocks:


https://www.alphacool.com/download/compatibility%20list%20Nvidia.pdf


----------



## Mad Pistol

renejr902 said:


> Yeah much better min fps, 31,52 is good enough to be at least playable. I know you are maybe busy, but if you have time can you try the same benchmark with ultra maximum setting but with ultra ray tracing instead of psycho. Im curious how much more fps you can obtain. I saw one benchmark of a review and with a suprim x fully overclocked the result was 46 fps average in ultra ray tracing. Thanks a lot for everything, its so fun and so much appreciated.


Your wish is my command. 4K, max settings, but Ultra RT instead of Psycho.


----------



## changboy

I saw a Zotac Gaming GeForce RTX 4090 Trinity OC.
The normal price here is $2,299, but the one I saw is $2,499. What do you think about this?


----------



## Spicedaddy

changboy said:


> I saw a ZOTAC GAMING GeForce RTX 4090 Trinity OC .
> Price here is 2299$ but the one i saw is 2499$, what you think about this ?


I'd wait.


----------



## changboy

Because at the same time I can buy the Bykski waterblock for it.


----------



## renejr902

Mad Pistol said:


> Your wish is my command. 4K, max settings, but Ultra RT instead of Psycho.
> 
> View attachment 2576921


Great, thanks so much! You are a nice guy. And yes, it runs better: about 4 fps better on average, and both Ultra and Psycho RT are at least playable, with the min fps 2 fps better this time. I'm surprised how well a Windforce can run when overclocked; even the Asus Strix OC and MSI Suprim X can't do better. I'd still prefer to buy an Asus Strix, but honestly any card would do the same job and give the same performance when overclocked, +/- 1% at best. Have fun with your new card; I'll let you know if I find one soon.


----------



## gerardfraser

renejr902 said:


> Thanks a lot, great result too, you have a better min fps than Mad Pistol
> 
> I suppose you can have 46 fps with ultra RT instead of psycho like the suprimx overclocked, if you have time you and Mad Pistol can check it too im curious too, it seems all cards have around the same performance when fully overclocked. I read all the pages of the topic, but i cant remember, which card model do you have ?


Here you go Gigabyte OC RTX 4090 Intel [email protected]


----------



## renejr902

gerardfraser said:


> Here you go Gigabyte OC RTX 4090 Intel [email protected]


Wow! Thanks so much, that's a great screenshot, I love it. Like I just said to Mad Pistol, Ultra and Psycho ray tracing are both at least playable to me; the min fps is good enough, but 4 more fps on average is interesting, so maybe I will play at Ultra instead of Psycho. I will just use DLSS when I have to because the fps gets very low. Any overclocked card runs as fast as an overclocked Strix or Suprim; they all deliver the same performance, which is interesting. Have a nice evening, thanks again gerardfraser


----------



## EarlZ

Alemancio said:


> I havent seen anything about it, seems to have been low on stock thus no reviews?


I kinda expected reviewers to get this directly from the manufacturer. Not sure why it wasn't yet seeded for reviews while other cards have been. I'm sure it's on par with something like the TUF/Strix/Suprim.


----------



## gecko991

Scales rather well.


----------



## changboy

Another one bought it; yeah, I take way too long to decide, lol.

The thing is, no store in my area will get the 4090; they've just begun getting the 3000 series, lol.


----------



## Tadaschi

Got my Suprim Liquid X and put an EKWB waterblock on it
On a MO-RA3 420 Pro
12900KS
DDR5 6800 CL36
+220MHz clock
+1700 memory








I scored 27 751 in Port Royal


Intel Core i9-12900KS Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## alitayyab

derthballs said:


> Has anyone with a Zotak Amp Aero who's flashed is actually showing usage of more than the 485w on the original bios when running anything? Im trying to figure out if me being restricted is having 3 (one being split into 2) rather than 4 dedicated pci-e cables going into it.





changboy said:


> I saw a ZOTAC GAMING GeForce RTX 4090 Trinity OC .
> Price here is 2299$ but the one i saw is 2499$, what you think about this ?



Even if you only plan on running stock, the power delivery on this (plus the Palit and GB Windforce) is extremely watered down compared to some of their siblings. Heck, the Zotac AMP Extreme Airo has 1.7x the power delivery of the Trinity OC.

_*Power delivery link:*_





RTX 4090 VRM meta-analysis and FE/AIB comparison


I've been watching a lot of content on the 4090s and with Buildzoid coming out with his two board analysis videos I wanted a quick reference to all 4090 cards in terms of VRM arrangements. It's interesting seeing the wild variance from the Halo cards (Suprim, STRIX, AIRO) and the OC cards (Trio, ...




linustechtips.com





All of them theoretically meet Nvidia's design standards (from what I understand, Nvidia okays all designs), but the heat output combined with the limited power delivery is probably going to kill these cards sooner than some of the others.


----------



## lordkahless

LunaP said:


> For anyone that got their GeForce experience notifications was it just FE or just any card in general that BestBuy happened to be carrying near u ? Waiting patiently while I read reviews on blocks / which work for what since I'm aiming for copper types (chromium if not)
> 
> Seems like Gigabyte OC , Tuf, and FE are best bets atm for that.


I noticed on my Nvidia account that my country keeps changing to blank. Since the GFE 4090 sweepstakes is only open to certain countries I feared this might prevent me from winning. No matter how many times I update it to United States it reverts back to blank within minutes. I started an Nvidia support chat. They said it likely didn't matter as the GFE already feeds them telemetry where we are lol. So I asked them what is the trick to winning the invite to purchase. They said we need long established history of frequent use of the GFE. I said I have only installed it for the link. They said I would be unlikely to be a candidate. So the whole thing may be a waste of time to rely on as a means of getting the Founders edition. It is only for the Founders.


----------



## galsdgk

bmagnien said:


> I wonder if Alphacool is going to comment on their Reference waterblock that they were promoting and taking preorders for for weeks


I just got mine today. Should work great with the PNY 4090 once that ships.


----------



## J7SC

Back to the drawing board re. the best (most efficient) VRAM settings. By only playing with VRAM and leaving the PL & core at stock to *control temps* (which also means 1.05V core instead of 1.10V), the card consistently hit 450W. Using Speed Way and Port Royal as the test base, the best VRAM so far was +1472MHz on the slider; also the last one I tried before I had to deal with something else.

Before, with full-bore PL (up to 600W; 570W+ observed) and a full core OC, VRAM efficiency would tail off around +1404 to +1431. I think this is solely related to heat/hotspot; ditto for the max core OC. Now I want to dump this thing into a bathtub of LN2 🥶


----------



## motivman

I flashed the Suprim X BIOS to my Trio and am also getting the full 530W, but the fans aren't running at maximum speed, so temps are way worse... guess this is no bueno until it's on a waterblock... SMH


----------



## motivman

Tadaschi said:


> Got my suprim liquid x and put a ekwb waterblock
> on mora3 420 pro
> 12900ks
> ddr5 6800 cl36
> +220mhz clock
> +1700 memory
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 27 751 in Port Royal
> 
> 
> Intel Core i9-12900KS Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


There's an EK waterblock for this?????????


----------



## bmagnien

galsdgk said:


> I just got mine today. Should work great with the PNY 4090 once that ships.


Where do you see that the PNY is reference? It's not on Alphacool's own compatibility list.


----------



## bmagnien

motivman said:


> there is ek waterblock for this?????????


I would expect much lower temps than 50°C with a proper block and a MO-RA. But EK's 4090 blocks aren't out yet, so I'm not sure what he's referencing.


----------



## gamerMwM

lordkahless said:


> I noticed on my Nvidia account that my country keeps changing to blank. Since the GFE 4090 sweepstakes is only open to certain countries I feared this might prevent me from winning. No matter how many times I update it to United States it reverts back to blank within minutes. I started an Nvidia support chat. They said it likely didn't matter as the GFE already feeds them telemetry where we are lol. So I asked them what is the trick to winning the invite to purchase. They said we need long established history of frequent use of the GFE. I said I have only installed it for the link. They said I would be unlikely to be a candidate. So the whole thing may be a waste of time to rely on as a means of getting the Founders edition. It is only for the Founders.


If you're in the US don't waste your time. When I contacted Nvidia they said it was a one-time promotion and it was over. Today some invites were sent out but it was for people in other countries, as I guess it was their turn. Don't expect another round of priority invites here, and if they keep dropping stock more often hopefully we won't need to rely on that anyway.


----------



## bmagnien

Some interesting new thermal pads just released this week:









Thermal Pad - TP-3 | Premium Performance Thermal Pad | ARCTIC


The ARCTIC Thermal Pad is a perfect and soft bridger of unevenness for an even cooling surface. Easy to use and safe to handle | Free shipping in DE…




www.arctic.de







Amazon.com


----------



## minittt

Hi folks, anyone got the Aorus 4090 Xtreme Waterforce 24G (SKU: GV-N4090AORUSX W-24GD)?

I'm picking one up this Sunday. Let me know if it's any good.


----------



## LunaP

lordkahless said:


> I noticed on my Nvidia account that my country keeps changing to blank. Since the GFE 4090 sweepstakes is only open to certain countries I feared this might prevent me from winning. No matter how many times I update it to United States it reverts back to blank within minutes. I started an Nvidia support chat. They said it likely didn't matter as the GFE already feeds them telemetry where we are lol. So I asked them what is the trick to winning the invite to purchase. They said we need long established history of frequent use of the GFE. I said I have only installed it for the link. They said I would be unlikely to be a candidate. So the whole thing may be a waste of time to rely on as a means of getting the Founders edition. It is only for the Founders.


Well that sucks. Granted, I kind of suspected it was more of a way to get people to install it. Hopefully Best Buy actually intends to restock, but they're currently still saying they don't see any orders coming since demand doesn't appear high, which sounds silly. The main reason I'm gunning for Best Buy is that I have a store card with them and can't just buy one elsewhere without opening an account, which I'm trying to avoid.



gamerMwM said:


> If you're in the US don't waste your time. When I contacted Nvidia they said it was a one-time promotion and it was over. Today some invites were sent out but it was for people in other countries, as I guess it was their turn. Don't expect another round of priority invites here, and if they keep dropping stock more often hopefully we won't need to rely on that anyway.


If that's true then rip.. I'll just keep refreshing daily then lol. Appreciate the heads up at least.


----------



## Antsu

bmagnien said:


> Some interesting new thermal pads just released this week:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thermal Pad - TP-3 | Premium Performance Thermal Pad | ARCTIC
> 
> 
> The ARCTIC Thermal Pad is a perfect and soft bridger of unevenness for an even cooling surface. Easy to use and safe to handle | Free shipping in DE…
> 
> 
> 
> 
> www.arctic.de
> 
> 
> 
> 
> 
> 
> 
> Amazon.com


Very interesting. They seem to be claiming 2x better heat transfer than the Minus Pad 8, and those are currently twice the price of the TP-3 for the same 100x100x1mm. Would be really nice if they actually deliver. I want my VRM etc. cool, but not this much. 








(Pic)


----------



## KickAssCop

Anyone got a TUF OC bios? I got a TUF card and am looking to get the OC bios so I can feel special.


----------



## Arizor

KickAssCop said:


> Anyone got a TUF OC bios? I got a TUF card and am looking to get the OC bios so I can feel special.


I just used the Strix OC BIOS - Asus RTX 4090 VBIOS 

Works fantastically. I have it undervolted very moderately and it smashes everything


----------



## stargit

derthballs said:


> Have you flashed your bios though? Im hoping someone who has flashed can actually see a bigger power draw, i can flash to the gigabyte and it shows as 600w in gpu-z but i still dont draw more than 485w in port royale.
> 
> Ive a new cable on the way so i can test with 4 independent cables rather than 3 with one shared for the 4th to see if that then draws more.


I haven't flashed it yet.
But I saw this post



spcysls said:


> For whatever reason the Vulcan 550W bios performs better than the Gigabyte 600W bios on my Zotac Amp Extreme even at the same clockspeeds and lower temperatures. Port Royal only pulls 537W anyway.


Wonder which bios is better.


----------



## J7SC

Arizor said:


> I just used the Strix OC BIOS - Asus RTX 4090 VBIOS
> 
> Works fantastic. I have it undervolted very moderately and it smashes anything
> View attachment 2576940


Nice! If you can significantly increase VRAM compared to the previous TUF vBIOS, I wonder if the Strix vBIOS has looser timings and/or more VRAM voltage? I might try it on my Giga Gaming OC (dual-vBIOS card) once it's water-cooled.


----------



## StreaMRoLLeR

Nd4spdvn said:


> It is, you just want to draw our attention. And beat my no4 score, lol.


Just ignore this trash; xcx is a known troll on Turkish forums. His legit score is 28,300, and anything above that is fake


----------



## Nizzen

xcx xcxvgyt said:


> I scored 29 633 in Port Royal
> 
> 
> AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Hi everyone, why this is not listed on 3dmark list any idea?


Because the run is bugged. Most likely due to very unstable vram.


----------



## KedarWolf

Nizzen said:


> Because the run is bugged. Most likely due to very unstable vram.


I did a search for the 5900X and RTX 4090 and it's the #1 bench.

Overall it's #4, behind three Intel Core i9-12900Ks.


----------



## Nizzen

KedarWolf said:


> I did a search for 5900x and RTX 4090 and it's the #1 bench.
> 
> It's #4 behind 3 Intel Core i9-12900K's.


Hmm, strange result anyway I think.


----------



## Arizor

J7SC said:


> Nice ! If you can significantly increase VRAM compared to the previous TUF vbios, I wonder if the Strix vbios has looser timings and/or more VRAM voltage ? I might try it on my Giga-G-OC (dual vbios card) once water-cooled.


Yeah, it's interesting. I can get the same VRAM and core on the TUF BIOS, but on the Strix BIOS it can run at lower wattage and just gets higher benchmarks and frame rates (talking like 1-2%, but I'll take it).


----------



## 4ThreeX

Anyone wanna check if the minimum 30% fan speed limit changes if you flash Strix BIOS on the TUF? Would love to run it at 20-25% when fan stop isn't applicable, if possible.


----------



## Zogge

KickAssCop said:


> Anyone got a TUF OC bios? I got a TUF card and am looking to get the OC bios so I can feel special.


I will pick up a TUF OC tomorrow so I can extract the bios tomorrow evening.


----------



## IntelMistakes

Hello to all.


I noticed that with the Strix card, setting the DualBIOS switch to P gives a default Power Limit of 90%. However, if you set it to Q, the default Power Limit is 100%.


Does anyone know why? BUG in the BIOS? Anyone with the same model of card with the same problem?


Thanks.


----------



## StreaMRoLLeR

KedarWolf said:


> I did a search for 5900x and RTX 4090 and it's the #1 bench.
> 
> It's #4 behind 3 Intel Core i9-12900K's.


The problem is he won't accept it, because he's a known troll. He just invaded this thread.


----------



## Arizor

4ThreeX said:


> Anyone wanna check if the minimum 30% fan speed limit changes if you flash Strix BIOS on the TUF? Would love to run it at 20-25% when fan stop isn't applicable, if possible.


Looks like 30% minimum or fan stop; just tested here using FanControl.


----------



## 4ThreeX

That's too bad, but thanks a lot for testing!


----------



## cstkl1




----------



## Hulk1988

*I am also wondering how people can currently cheat/exploit 3DMark. There are unrealistic and fishy results in Speed Way and Port Royal: people with lower GPU and memory clocks but 6-10% more points. They even easily beat 4090 LN2 records with an air-cooled 4090.*


----------



## StreaMRoLLeR

Hulk1988 said:


> *I am also wondering how people can currently cheat/exploit 3DMark. There are unrealistic and fishy results in Speed Way and Port Royal: people with lower GPU and memory clocks but 6-10% more points. They even easily beat 4090 LN2 records with an air-cooled 4090.*


Unless they explain what they did, it's fake. For example, his legit score is 28,300, but it jumped 1k with lower clocks. He thinks this forum is full of fools, lol.


----------



## yzonker

Nizzen said:


> Hmm, strange result anyway I think.


They're bashing you, but you are probably 100% correct. My highest score, just over 29k, was done that way. If you move the VRAM offset up to just the right point, PR will artifact badly. If it completes, it's worth about 500 pts. I did it twice.


----------



## KedarWolf

There is a LOD BIAS cheat but I'm pretty sure 3DMark checks for that and if you use it, your run gets flagged as invalid.


----------



## Benni231990

zhrooms said:


> It's 600W.. the "early reviewer" BIOS uploaded by TPU is not retail BIOS, Liquid X ships to consumers with a 600W BIOS
> 95.02.18.00.C4 (Build Date 2022-09-01) PG139 SKU 330 VGA BIOS MSINV510MH.290 450/600W
> I have multiple people confirming by sharing screenshots of retail Liquid X cards, whatever is shown in the original post is correct


so can anybody Upload This BIOS?


----------



## yt93900

I did some testing, and moving the slider in the Nvidia driver to "Performance" yields an extra 100 pts, so I'm at 28,299 now. Further tweaking of the 3D settings indeed causes 3DMark to invalidate the score, stating the LOD has been changed. Still nowhere near the 29k+ scores.


----------



## yzonker

KedarWolf said:


> There is a LOD BIAS cheat but I'm pretty sure 3DMark checks for that and if you use it, your run gets flagged as invalid.


Yes, that's correct.


----------



## TheNaitsyrk

Hello there.

I have the Suprim X 4090, and it looks like the Suprim Liquid X has a 600W BIOS.

It's not available on TechPowerUp yet, but when it is, would it be possible to flash that BIOS to my normal 4090 Suprim X? In theory it should work, as it's the same card but with an AIO on it.


----------



## yzonker

J7SC said:


> Nice ! If you can significantly increase VRAM compared to the previous TUF vbios, I wonder if the Strix vbios has looser timings and/or more VRAM voltage ? I might try it on my Giga-G-OC (dual vbios card) once water-cooled.


I didn't see any improvement in VRAM last night with the Strix bios. Limit was still in the 1800-1900 range on my card. That's what I ran for the Speedway run I posted.


----------



## yzonker

TheNaitsyrk said:


> Hello there.
> 
> I have Suprim X 4090, and it looks like Suprim Liquid X has 600W BIOS.
> 
> It's not available on TechPowerUp yet, but when it is, is it possible to flash that BIOS to my normal 4090 Surpim X? It should work in theory as it's the same card but one has AIO on it.


No, it's 530w. You need the Gigabyte OC or any Asus bios.


----------



## TheNaitsyrk

yzonker said:


> No, it's 530w. You need the Gigabyte OC or any Asus bios.


The 1st post states that 600W is available. I'm confused.


----------



## StreaMRoLLeR

*Answer from the 3DMark support team. This moron xcx troll should be ashamed of himself. (His score got REKT.)*


----------



## LukeOverHere

OK, I'm looking to you experienced people for a more technical explanation here, for self-learning. I'm not new to overclocking or flashing; I messed around with my 3080 with 10 different BIOSes, and MSI Afterburner is my go-to, but I'm not really educated when it comes to the actual boards (so I must have been lucky, haha). I have ordered a Gainward Phantom 4090 (not the Phantom GS, just the standard Phantom, as there is a non-GS version). What I'm trying to understand is the effect flashing other BIOSes will have, now that I'm looking at the different PWM controllers, GPU stages, and VRAM stages. If the Phantom is built the same as the Phantom GS, and this info is up to date, mine should have the following:

PWM: uP9512U
GPU Stages: 16×50A (800A) NCP302150
VRAM Stages: 3×50A (150A) NCP302150
BIOS: 500W Max

If so, should I be using the above data as a reference point and only flashing BIOSes from a similarly built card with a higher power limit, if possible, to prevent cooking the card? I don't shoot for crazy OCs, but I do tend to push for as much juice as possible until it's stable, then pull it back slightly so it's rock solid for gaming. Obviously I want the card to last as well.

Hopefully I'm making sense, but what I'm asking is, as an example: if the Strix BIOS is 600W, and its GPU stages are 24×70A (1680A) and its VRAM stages 4×70A (280A), that's significantly higher amperage than my Gainward card, so should I be cautious with my selection of BIOSes to flash? And what effect does building the boards with different GPU/VRAM stages actually have? I'm assuming NVIDIA approves all of the designs anyway, so they must be stable?
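
As a rough sanity check on the numbers above, here is a quick sketch (my own, not from this thread) comparing theoretical VRM delivery against the BIOS power limits; the ~1.1 V core voltage and the lossless-conversion assumption are mine:

```shell
# Theoretical core power each VRM configuration could deliver,
# assuming ~1.1 V GPU core voltage (x10 fixed point keeps it integer).
phantom_amps=$((16 * 50))                  # 16 stages x 50 A = 800 A
strix_amps=$((24 * 70))                    # 24 stages x 70 A = 1680 A
phantom_watts=$((phantom_amps * 11 / 10))  # ~880 W theoretical
strix_watts=$((strix_amps * 11 / 10))      # ~1848 W theoretical
echo "Phantom VRM ~${phantom_watts}W vs 500W BIOS limit"
echo "Strix VRM   ~${strix_watts}W vs 600W BIOS limit"
```

Either way, the stages can deliver far more current than the BIOS power limit ever asks for, which suggests cross-flashing a higher-limit BIOS is usually not a VRM-safety problem on these boards; the 16-pin connector and the cooling are the more practical limits.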


----------



## KedarWolf

yt93900 said:


> I did some testing and moving the slider in Nvidia driver to "Performance" yields extra 100pts so I'm at 28299 now, further tweaking the 3D settings indeed causes 3Dmark to invalidate the score stating the LOD has been changed. Still nowhere near the 29k+ score.


You can't change the Negative LOD Bias setting; all the rest you can.

Try the Nvidia settings in the Imgur link. The Imgur link is just a preview; you need to actually click the thumbnail to see everything.

And use Nvidia Profile Inspector (run as admin), enable unknown settings in the top menu, and do those tweaks too.









Releases · Orbmu2k/nvidiaProfileInspector






github.com







http://imgur.com/a/HuzyQj2


----------



## Arizor

Think this is the best I can get with my TUF, 5900X holding me back from some points I feel. Still very happy with it, I could probably squeeze more following @KedarWolf 's excellent guide above, but I just can't be bothered


----------



## TheNaitsyrk

Did anyone with a Suprim X try flashing the Strix BIOS onto it? And if you did, does it work well?


----------



## RaMsiTo

TheNaitsyrk said:


> Did anyone with Surpim X try to flash Strix BIOS onto it? And if you did, does it work well?


It works, but you lose 400-500 RPM at maximum fan speed.


----------



## TheNaitsyrk

RaMsiTo said:


> it works but with the maximum fans you lose 400-500 rpm


Ah, so that's fine; once the water block comes out it can easily be done.


----------



## dante`afk

My bykski block was shipped from China today


----------



## Nico67

LukeOverHere said:


> Ok, im looking at you experienced people for a more technical explanation on this for self learning, im not new to overclocking or flashing, i messed around with my 3080 with 10 different BIOS's & MSI Afterburner is my 'Go-To', but im not really educated when it comes to the actual boards (So i must have been lucky haha). I have ordered a Gainward Phantom 4090 (Not the Phantom GS, just the standard Phantom card, as there is a Non-GS version) What im trying to understand is the effect i will have flashing other BIOS's now that i am looking at the different PWM, GPU Stages and VRAM Stages. If the Phantom is built the same as the Phantom GS, and this info is up-to-date, mine should have the following:
> 
> PWM: uP9512U
> GPU Stages: 16×50A (800A) NCP302150
> VRAM Stages: 3×50A (150A) NCP302150
> BIOS: 500W Max
> 
> If so, should i be using the above data as a reference point, and only flashing other BIOS's with a similar setup with a higher 'Power Limit' if possible, to prevent cooking the card. I don't shoot for crazy OC's, but i do tend to push for as much juice as possible until its stable, and then pull it back slightly so its rock solid for gaming, but i obviously want the card to last as well.
> 
> Hopefully im making sense, but what im asking is, as an example; if the STRIX bios is a 600W, and the GPU stages are 24×70A (1680A), and the VRAM Stages are 4×70A (280A), this is significantly higher Amps than my Gainward card, so should i be cautious with my selection of BIOS's to flash? and what effect does this actually have when manufacturing the boards using different GPU/VRAM Stages? Im assuming NVIDIA approves all of the designs anyway so they must be stable?


Different cards have BIOS differences due to different ports (e.g. the Strix having an extra HDMI), different numbers of fans, AIO coolers, and different voltage regulators. Sometimes these things can be a problem, other times not; for example, port changes may just mean some of your ports won't work, and fan differences don't matter if you are water-blocking the card. I didn't really see any issues with voltage regulators last gen, even across 2× and 3×8-pin cards; it was usually just a rail-balance issue.
Also, if you have dual BIOS, you can always boot off the second, switch, and flash over the bad one.


----------



## TheNaitsyrk

Nico67 said:


> Different cards have bios differences due different ports, ie Strix having an extra HDMI, different numbers of fans, AIO cards and different voltage regulators. Sometimes these things can be a problem, other times not, for example port changes may just mean some of your ports won't work, and fans don't matter if you are waterblocking it etc. Didn't really see any issues with voltage regulators last gen, even with 2 or 3 8pins, it was usual just a rail balance issue.
> Also if you have dual bios, you can always boot of the second, switch and lash over the bad one.


Yeah I remember I flashed my 3090 Strix with Kingpin Bios and it would say that it's at 276W all the time (but it lied) and only showed true colours in FurMark.

But Suprim X I had before worked brill with any BIOS.


----------



## LukeOverHere

Nico67 said:


> Different cards have bios differences due different ports, ie Strix having an extra HDMI, different numbers of fans, AIO cards and different voltage regulators. Sometimes these things can be a problem, other times not, for example port changes may just mean some of your ports won't work, and fans don't matter if you are waterblocking it etc. Didn't really see any issues with voltage regulators last gen, even with 2 or 3 8pins, it was usual just a rail balance issue.
> Also if you have dual bios, you can always boot of the second, switch and lash over the bad one.


Thanks for the reply, much appreciated. When it arrives I'll see what I can achieve. I know nvflash does have an element of protection built in to prevent failed flashing, and I can also grab the GS version of the Gainward BIOS and OC that, which is very cautious in the scheme of things. I fully understand about the fan RPM issues; I had the same with my 3080. Some of the higher-wattage BIOSes ended up performing worse because I couldn't crank the fans up to full RPM, so they were pulling the clocks back; that's part of the reason I played around with so many BIOSes until I was happy. I only recently started thinking about the actual board and how it's built, putting a bit more thought into it, especially with the prices of these cards. Before, I was just flashing away without a care, haha.


----------



## Jimshown LMHF

For those wondering why their Port Royal scores are invalidated, it's simple: most of your benchmarks are not legit if you go above 29k without a sustained frequency of more than 3075/3090 MHz. The problem is at the level of your VRAM setting: by going too high you create VRAM errors, which do not crash the bench but give inflated results, because ECC is disabled in the drivers. I finally have my answer on all these senseless and totally illogical scores.


----------



## RaMsiTo

Jimshown LMHF said:


> For those who are wondering why their port royal scores are invalidated, it's simple, most of your benchmarks are not legit if you go above 29k without having a maintained frequency of more than 3075/3090mhz. It is at the level of your vram setting that there is a problem, you create vram errors by being too high, which does not crash the bench but gives too high results. ECC is disabled in the drivers. I finally have my answer on all these scores without any sense and totally illogical.


and what explanation do you give to this test









Result not found







www.3dmark.com


----------



## Jimshown LMHF

RaMsiTo said:


> and what explanation do you give to this test
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Result not found
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com


The same explanation as for your Port Royal run: you're at a bug spot of your VRAM. But I wouldn't draw conclusions about Speed Way until I've done some proper testing on it. What is certain is that the Port Royal scores are distorted. UL is invalidating benchmarks by hand at the moment; there may be a new SystemInfo soon that detects this directly.


----------



## StreaMRoLLeR

Jimshown LMHF said:


> The same explanation as for your Port Royal, you're on a bug spot of your Vram. But I wouldn't decide on Speedway until I did some approved testing on it. What is certain is that the partitions of Port Royal are distorted. UL benchmark invalid by hand at the moment may be a new systeminfo soon that detects this directly.


Yes, I am in talks with support atm. Can you provide your feedback to support as well?

All of the fake scores are under investigation atm.


----------



## Jimshown LMHF

Streamroller said:


> Yes. I am in talks with support atm. Can you provide your feedback to support aswell ?
> 
> All of fake scores are under investigation atm


I have already submitted a ticket to UL with my research. I was able to try 6 different RTX 4090s and quickly deduced that it was a VRAM bug. This is close to my heart because it upsets the competitiveness on HWBot; the HOF doesn't matter to me, but it's full of fake scores.


----------



## Jimshown LMHF

Streamroller said:


> Yes. I am in talks with support atm. Can you provide your feedback to support aswell ?
> 
> All of fake scores are under investigation atm


I can give rough averages, since everyone is usually between 1425 and 1525 MHz on memory. Core average 3000 MHz: around 28.0k PR.
Core average 3050 MHz: around 28.6k.
Core average 3100 MHz: around 29.1k.
This is without the memory bug.


----------



## yzonker

Jimshown LMHF said:


> The same explanation as for your Port Royal, you're on a bug spot of your Vram. But I wouldn't decide on Speedway until I did some approved testing on it. What is certain is that the partitions of Port Royal are distorted. UL benchmark invalid by hand at the moment may be a new systeminfo soon that detects this directly.


Actually no, Speed Way doesn't seem to have the VRAM bug. It either runs or crashes; I never see a single artifact. There are probably other tricks we don't know about. You're oversimplifying.


----------



## Jimshown LMHF

yzonker said:


> Actually no, Speedway doesn't seem to have the VRAM bug. It either runs or crashes. I never see a single artifact. There are probably other tricks we don't know. You're over simplifying.


We can not say anything because the bugs are not particularly visible on Port Royal.


----------



## RaMsiTo

yzonker said:


> Actually no, Speedway doesn't seem to have the VRAM bug. It either runs or crashes. I never see a single artifact. There are probably other tricks we don't know. You're over simplifying.


I have not had artifacts in that test; it must be something else.

I think the correct score would be this:








I scored 11 113 in Speed Way


Intel Core i9-9900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## Jimshown LMHF

RaMsiTo said:


> I have not had artifacts in that test, it must be something else.
> 
> I think the correct punctuation would be this
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 113 in Speed Way
> 
> 
> Intel Core i9-9900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


OK bugged scores too for Speedway


----------



## yzonker

It really depends on what you consider invalid. Does moving the mem slider up until you get the highest score count as invalid? What if it just artifacts right at the end? 

Then there's all of those settings @KedarWolf posted, some of which definitely make a difference. 

I honestly don't know where 3DMark draws the line in listing a run as valid.

This is just the nature of competition where people will push the rules to the limit. That's what makes it fun for me, same as the motorsports stuff I do as my other crazy expensive hobby. Lol.


----------



## Jimshown LMHF

yzonker said:


> It really depends on what you consider invalid. Does moving the mem slider up until you get the highest score count as invalid? What if it just artifacts right at the end?
> 
> Then there's all of those settings @KedarWolf posted, some of which definitely make a difference.
> 
> I honestly don't know where the line is when 3DMark lists the run as valid?
> 
> This is just the nature of competition where people will push the rules to the limit. That's what makes it fun for me, same as the motorsports stuff I do as my other crazy expensive hobby. Lol.


So in this case, do we force people to activate ECC? Many would fall from the top.
Just be logical: 1,000 more points from 5 MHz more memory. You have to be honest.
Suddenly the hall of fame is full of bugged scores.


----------



## msky73

Inno3D X3 OC and an Alphacool waterblock. Basically the same as what the soon-to-come Frostbite will be.


----------



## KickAssCop

Arizor said:


> I just used the Strix OC BIOS - Asus RTX 4090 VBIOS
> 
> Works fantastic. I have it undervolted very moderately and it smashes anything
> View attachment 2576940


Thanks for sharing. I forgot how to flash the BIOS. Can you post the command line here?
It says there is a PCI subsystem mismatch when I try to flash the Strix BIOS.
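
For what it's worth, a hedged sketch of the usual nvflash workflow (the flag names are from the commonly distributed nvflash64 tool, not from this thread; "strix.rom" is a placeholder filename, and an elevated prompt is assumed):

```shell
# Typical cross-flash sequence; the real commands are commented out
# here because they require the actual hardware:
#
#   nvflash64 --save backup.rom   # back up the current vBIOS first
#   nvflash64 --protectoff        # disable the EEPROM write protection
#   nvflash64 -6 strix.rom        # -6 overrides the PCI subsystem ID
#                                 # mismatch check when flashing a BIOS
#                                 # from a different board partner
#
# The order is the part that matters: backup, unprotect, then flash.
steps="save protectoff flash"
echo "$steps"
```

The subsystem-ID mismatch message is expected when cross-flashing between vendors; that is exactly the check the override flag is meant to bypass, so only use it on a dual-BIOS card or with a verified backup.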


----------



## stargit

Has anyone with the Zotac Amp found a better bios yet?

Thanks


----------



## AdamK47

motivman said:


> i flashed suprim x bios to my trio, and also getting full 530W, but fan speeds are not running at maximum, so temps are way worse... guess this is no Bueno, until its on a waterblock... SMH


Any chance you can post the original Gaming Trio gaming (not silent) BIOS? It's not on TechPowerUp. I updated my Trio to the Suprim X BIOS. Forgot to save the original. Probably going to keep the Suprim X BIOS, but would like to have the original just in case. Thanks!

BTW - There is a gaming version of the Suprim X BIOS with a more aggressive fan profile. It's on TechPowerUp. Ends in .46.


----------



## BigMack70

I caved. Got a 13600k + MSI Z790 DDR4 mobo coming Monday. Didn't want to do a full platform upgrade because I've got a nice high end DDR4-4400 kit of RAM, and the 13900k is only a couple % faster in games. DDR4 with a 13600k should be fine.

Money's just burning through my wallet these days... I want my 4k 120fps locked down


----------



## long2905

msky73 said:


> View attachment 2577000
> 
> 
> Inno3d X3 OC and Alphacool waterblock. Basically the same what soon-to-come Frostbite will be.


how do you like it so far? any number you can share? tried flashing another vbios yet?


----------



## Aneurotic

AdamK47 said:


> Any chance you can post the original Gaming Trio gaming (not silent) BIOS? It's not on TechPowerUp. I updated my Trio to the Suprim X BIOS. Forgot to save the original. Probably going to keep the Suprim X BIOS, but would like to have the original just in case. Thanks! BTW - There is a gaming version of the Suprim X BIOS with a more aggressive fan profile. It's on TechPowerUp. Ends in .46.


I backed it up. Do you have a place I can upload it to?


----------



## derthballs

stargit said:


> Has anyone with the Zotac Amp found a better bios yet?
> 
> Thanks


I've tried a few, but I'm waiting for a new cable, as there's no difference using the 3-into-1 cable even though it shows up as 600W with the Gigabyte BIOS.


----------



## zhrooms

N19htmare666 said:


> You also have the waterforce down as 600w but it seems that is actually 500w?


What makes you say that? It's obviously 600W since even the Gaming OC is 600W, looks like Gigabyte is just going with 600W on all of their cards (except the Windforce which appear to be non-OC).



Benni231990 said:


> So can anybody upload this BIOS?


It will be available here when GPU-Z is updated which will allow people to upload their BIOSes directly to the BIOS collection.


----------



## heptilion

IntelMistakes said:


> How strange. The Power Limit on my Strix 4090 is 90% if the DualBIOS switch is set to Performance Mode. If I set it to Quiet Mode, it goes up to 100%.
> 
> Does anyone know anything?
> 
> And Can you confirm if the 3DMark score is correct?
> Along with the clock speeds. Thank you.
> 
> Thanks!!


Hi, I can confirm switching to Q changes the power target to 100%, but P is still at 90%. Although Q mode defaults to 100%, it's still capped to 450W; its max is 133%, whereas P mode's max is 120%, which makes sense.

Port Royal GPU score: 25700
Time Spy: 35800
TimeSpy Extreme: 19500
SpeedWay: 9950

GPU clock is at 2760
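
Those percentages line up if you assume a 450 W base power target (the 450 W base is my assumption from the stock 4090 spec, not stated above); a quick sketch:

```shell
# Convert the power-limit slider percentages to watts,
# assuming a 450 W base board power (integer math).
base=450
q_max=$((base * 133 / 100))      # Q-mode 133% slider -> ~598 W
p_max=$((base * 120 / 100))      # P-mode 120% slider -> 540 W
p_default=$((base * 90 / 100))   # P-mode 90% default -> 405 W
echo "Q max ${q_max}W, P max ${p_max}W, P default ${p_default}W"
```

So the Q BIOS tops out near the 600 W connector limit, while the P BIOS defaults to a quieter 405 W target.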


----------



## KickAssCop

Installed the Strix BIOS on a vanilla TUF. Saved $400. Yay!


----------



## motivman

AdamK47 said:


> Any chance you can post the original Gaming Trio gaming (not silent) BIOS? It's not on TechPowerUp. I updated my Trio to the Suprim X BIOS. Forgot to save the original. Probably going to keep the Suprim X BIOS, but would like to have the original just in case. Thanks!
> 
> BTW - There is a gaming version of the Suprim X BIOS with a more aggressive fan profile. It's on TechPowerUp. Ends in .46.


Change the extension to .rom; I had to change it to .pdf to upload it here.


----------



## AdamK47

motivman said:


> change the extension to .rom, had to change to .pdf to upload here.


Oh, nice. Thanks!


----------



## bottjeremy

BigMack70 said:


> I caved. Got a 13600k + MSI Z790 DDR4 mobo coming Monday. Didn't want to do a full platform upgrade because I've got a nice high end DDR4-4400 kit of RAM, and the 13900k is only a couple % faster in games. DDR4 with a 13600k should be fine.
> 
> Money's just burning through my wallet these days... I want my 4k 120fps locked down


I ended up getting a 13700KF to put in my Z690 DDR4 board. Both CPUs will run very nicely.


----------



## PLATOON TEKK

In stock for now
MSI Suprim GeForce RTX 4090 24GB GDDR6X PCI Express 4.0 Video Card RTX 4090 SUPRIM LIQUID X 24G

Edit: out of stock now, was in stock for a good 20 mins though. Guess supply is improving.


----------



## changboy

Just bought a Gigabyte Aorus 4090 Waterforce from Canada Computers; I'll need to drive around 65 km tomorrow to pick it up. Good thing I didn't buy the Zotac yesterday for about the same price.


----------



## BeZol

Hulk1988 said:


> *I am also wondering how people can cheat/exploit 3DMark currently. Unrealistic and fishy results in SpeedWay and Port Royal. People having less GPU and Mem clock but 6-10% more points.They even beat easily 4090 LN2 records with a Air 4090.*


I ran 50+ Time Spy Extreme benchmarks over the first two days.

On the second day I was fine-tuning the GPU (to get the most out of Graphics Test 1), and suddenly, on the 4th or 5th benchmark of the day, the whole GT1 run was like a pixel fight (right from the start).

Half of the screen was covered with dark-grey virtual pixel walls, BUT the benchmark did not crash, and the whole time I had about +10% fps. Interestingly, GT2 did not throw such a pixel party and had normal fps numbers.
This GT1 pixel party gave me a free +700 Graphics Score, because normally I can't get more than 20,800 GS (Gainward Phantom GS at 515W peak). Sometimes there are tiny pixel errors, and then it can jump to 20,922.

So the benchmark was still marked valid. You can check the details tab of every benchmark for the fps numbers; you'll clearly see whether someone had this pixel party or not.

It's a shame that something like this can be valid and can't be filtered out by the 3DMark program itself...

I can imagine the same pixel parties are probably happening in the other benchmarks too.
BTW, I couldn't replicate the behaviour, so it seems random when your GPU is pushed to the edge.

Finally: the people with the highest Graphics Score in Time Spy Extreme will be those whose GT1 and GT2 runs both had the pixel party. That's like +700 and +700 free Graphics Score with the same GPU and memory settings (in Time Spy Extreme).

Luckily I did see a valid 21,720 GS with an average GPU clock of 3104 MHz, but anything above that has to be checked via the fps numbers to know whether it had the pixel party or not... (though for some scores the details page strangely doesn't load!)


----------



## J7SC

BigMack70 said:


> I caved. Got a 13600k + MSI Z790 DDR4 mobo coming Monday. Didn't want to do a full platform upgrade because I've got a nice high end DDR4-4400 kit of RAM, and the 13900k is only a couple % faster in games. DDR4 with a 13600k should be fine.
> 
> Money's just burning through my wallet these days... I want my 4k 120fps locked down


...I'm still hanging on the edge of the cliff at the bottom of that slippery slope -🥴 

- one of my two X570 is more of a work setup with a 3950X (custom water-cooled) with very good DDR4 (actual 3800 CL14, Samsung-B)....so I'm thinking about getting a 5800X3D to replace the 3950X; probably the least expensive, worthwhile upgrade...the 5950X combo would stay for now until either AM5 refresh or Intel Meteor Lake since I do not have a current Intel setup, just older ones. I'll wait though until FS2020 gets the SU11 patch officially released re. their RTX 4K goodies and allegedly much faster fps.


----------



## BigMack70

bottjeremy said:


> I ended up getting a 13700KF to put in my Z690 DD4 board. Both CPU will run very nicely.


Hadn't seen that review; thanks for that. Canceled my 13600k order and got the 13700kf. Looks like when overclocked, it's going to be more of the sweet spot than the 13600k for high end performance.


----------



## GRABibus

bottjeremy said:


> I ended up getting a 13700KF to put in my Z690 DD4 board. Both CPU will run very nicely.
> 
> View attachment 2577022


TPU did some in-depth benchmarking of the 4090 with two CPUs across several games.

Their conclusion is that even Zen 3 CPUs will bottleneck the GPU at 1440p, and sometimes even at 4K.

This means that if you own a 5950X, as I do, it becomes necessary to upgrade to a 13900K to get the full benefit of the 4090.









RTX 4090 & 53 Games: Ryzen 7 5800X vs Core i9-12900K Review


We test the NVIDIA GeForce RTX 4090 with 53 games at three resolutions, comparing the AMD Ryzen 7 5800X against the Intel Core i9-12900K. The idea here is to get a feel for how much graphics performance is lost by a weaker processor.




www.techpowerup.com


----------



## bezerKa

BigMack70 said:


> Hadn't seen that review; thanks for that. Canceled my 13600k order and got the 13700kf. Looks like when overclocked, it's going to be more of the sweet spot than the 13600k for high end performance.


What are you upgrading from currently?


----------



## bmagnien

GRABibus said:


> TPU made some deep benchmarks with 4090 with 2 CPU's and several games.
> 
> Their conclusion is that even Zen 3 CPU's will bottleneck the GPU at 1440p, even 4K sometimes.
> 
> This measn if you own a 5950X as me, it becomes necessary to updgrade to 13900K to get all benefits o fthe 4090.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> RTX 4090 & 53 Games: Ryzen 7 5800X vs Core i9-12900K Review
> 
> 
> We test the NVIDIA GeForce RTX 4090 with 53 games at three resolutions, comparing the AMD Ryzen 7 5800X against the Intel Core i9-12900K. The idea here is to get a feel for how much graphics performance is lost by a weaker processor.
> 
> 
> 
> 
> www.techpowerup.com


Or you could swap your 5950X for a 5800X3D and you should be good to go, at least for gaming.


----------



## BigMack70

bezerKa said:


> What are you upgrading from currently?


Upgrading from a 9900KS. Want to be able to hit 120fps in games like Elden Ring, Halo Infinite's campaign, and A Plague Tale Requiem.


----------



## Benni231990

zhrooms said:


> It will be available here when GPU-Z is updated which will allow people to upload their BIOSes directly to the BIOS collection.


Thanks a lot for your hard work; let's hope GPU-Z gets an update soon.


----------



## bezerKa

BigMack70 said:


> Upgrading from a 9900KS. Want to be able to hit 120fps in games like Elden Ring, Halo Infinite's campaign, and A Plague Tale Requiem.


Ahh yea that'll be a nice bump from 9900ks to 13700k


----------



## BigMack70

bezerKa said:


> Ahh yea that'll be a nice bump from 9900ks to 13700k


I hope it'll be a tangible difference. To be honest, I'm still in a bit of shock that I wound up CPU limited at 4k with this GPU. I really didn't expect that could be possible. This card is nuts.


----------



## Sheyster

BigMack70 said:


> Upgrading from a 9900KS. Want to be able to hit 120fps in games like Elden Ring, Halo Infinite's campaign, and A Plague Tale Requiem.


I just ordered a basic 13700KF build. Also coming from a 9900K and upgrading for the same reasons. I was tempted to go with a higher end 13900KF build but it didn't make much sense as I only game on this rig and don't need the 8 extra E-cores.


----------



## originxt

My 10980xe will definitely bottleneck the 4090 even in 4k but hopefully not too much. Hard to find any data related to this cpu and benchmarks.


----------



## Hanks552

AdamK47 said:


> Any chance you can post the original Gaming Trio gaming (not silent) BIOS? It's not on TechPowerUp. I updated my Trio to the Suprim X BIOS. Forgot to save the original. Probably going to keep the Suprim X BIOS, but would like to have the original just in case. Thanks!
> 
> BTW - There is a gaming version of the Suprim X BIOS with a more aggressive fan profile. It's on TechPowerUp. Ends in .46.


With this Suprim X BIOS, were you able to push 600W using the 4-connector cable?


----------



## Baasha

well.. I got through TWO orders for the RoG Strix 4090 OC on NewEgg - one "combo" with a crap PSU and the other just the GPU and BOTH ORDERS WERE VOIDED because of "out of stock."  

I did both orders in ~ 30 seconds. What is the solution here?


----------



## changboy

originxt said:


> My 10980xe will definitely bottleneck the 4090 even in 4k but hopefully not too much. Hard to find any data related to this cpu and benchmarks.


I also run a 10980XE; my current OC is 4.8GHz. I could go higher and also disable HT, but I'm not sure it would make a big difference at 4K. Can't find any tests of this CPU with a 4090.


----------



## J7SC

yzonker said:


> It really depends on what you consider invalid. Does moving the mem slider up until you get the highest score count as invalid? What if it just artifacts right at the end?
> 
> Then there's all of those settings @KedarWolf posted, some of which definitely make a difference.
> 
> I honestly don't know where the line is when 3DMark lists the run as valid?
> 
> This is just the nature of competition where people will push the rules to the limit. That's what makes it fun for me, same as the motorsports stuff I do as my other crazy expensive hobby. Lol.


On RTX 4090 VRAM, I am still chasing the 'most efficient' setting, as it depends on hotspot and other temps as well, IMO. The other issue is that GDDR6X seems more 'non-linear' than other VRAM, in that there are specific speeds where repeatable scores won't move up much, if not down a little, but the next speed step up works better. I ran into that as well with my 3090 Strix / GDDR6X. BTW, I do not finish runs with any artifacts but hit 'escape', e.g. in Port Royal, the moment I see one - too concerned about breaking stuff.

---
Also a general comment on the earlier discussion...
If there is a question about someone's score in either 3DM or other HWBot benchies, it is better not to make insinuations ("I got beat, so there must be a cheat") and demand they _prove a negative_. Instead, address it to 3DMark and/or HWBot staff; the former at least have far more raw data per submission, not to mention a comparative database of similar setups. This is what some folks here have already done per their posts above, and it is the right way to proceed.

I am no stranger to benching, per spoiler below...I haven't subbed at HWBot for many, many years, and my scores below (under an older handle at HWBot) include some sub-zero as I had worked my way up to the Elite league. Back then there was an unwritten rule that you would not post sub-zero scores for HWBot at general benchmark threads, otherwise every newcomer here and elsewhere would either be depressed/angry, or throw 2.0v at every core...The more recent 3DM Port Royal below was just after the Titan RTX had been introduced. I show that one as Port Royal is very sensitive to system RAM (there was no resizable_BAR option in those days), and even the 2950X Threadripper could really lay down some good PR numbers by changing to the 'gaming' option in the MSI bios (affected UMA, NUMA for that and a handful of other TRs).



Spoiler


----------



## DirtyScrubz

KickAssCop said:


> Installed Strix bios on vanilla TUF. Saved 400$. Yay!


Was it easy to flash? I just bought a tuf from this mornings newegg drop. Could you link the vbios and nvflash version? thx


----------



## katates

I have received my gaming x trio today, this is the first time i am having coil whine, is this normal?


----------



## AdamK47

Hanks552 said:


> With this suprim X Bios were you able to push 600w? Using the 4 connector cable?


No, of course not.


----------



## Arizor

KickAssCop said:


> Thanks for sharing. I forgot how to flash the bios. Can you post the command line here?
> It says there is a PCI subsystem mismatch when I tried to put the STRIX bios.


nvflash -6 nameofbios.rom

The mismatch warning is quite normal - the -6 switch overrides the PCI subsystem ID check.


----------



## changboy

katates said:


> I have received my gaming x trio today, this is the first time i am having coil whine, is this normal?


You're not alone with MSI Trio coil whine; check this comment from a new buyer at Memory Express:
MSI GeForce RTX 4090 GAMING TRIO 24GB PCI-E w/ Triple DP, HDMI - PCI-E Video Cards - Reviews - Memory Express Inc.


----------



## bezerKa

BigMack70 said:


> I hope it'll be a tangible difference. To be honest, I'm still in a bit of shock that I wound up CPU limited at 4k with this GPU. I really didn't expect that could be possible. This card is nuts.



yeah it's a beast.

I'm skipping this gen. All games I play run awesome. Still on 10900k/3080ti


----------



## Shaded War

Wonder how my 5800X will fare with the 4090 at 4K compared to the newest CPUs. All the benchmarks only show the 3D version.
My 4090 from Newegg arrived minutes before I had to leave for work today, so I'll be trying it tonight.


----------



## VideoGameLover

zhrooms said:


> It's 600W.. the "early reviewer" BIOS uploaded by TPU is not retail BIOS, Liquid X ships to consumers with a 600W BIOS
> 95.02.18.00.C4 (Build Date 2022-09-01) PG139 SKU 330 VGA BIOS MSINV510MH.290 450/600W
> I have multiple people confirming by sharing screenshots of retail Liquid X cards, whatever is shown in the original post is correct


Hi I just ordered a 4090 Suprim Liquid X and I was watching a review of the card on youtube and the reviewer said he could only get the power slider to 111. But I guess that was just a non retail bios he was on? So the retail 4090 suprim liquid x power slider can go up to 133? Is that true? Let me know thanks


----------



## Tideman

katates said:


> I have received my gaming x trio today, this is the first time i am having coil whine, is this normal?


I have the same with my TUF OC.

Not sure I can live with it. Any kind of workload causes it (rendering and gaming). My rig is on my desk.

If I see a gigabyte gaming oc in stock anywhere, I'm probably going to grab one and send my TUF back. I'm avoiding ASUS and MSI 90 series in the future.


----------



## Mad Pistol

katates said:


> I have received my gaming x trio today, this is the first time i am having coil whine, is this normal?


My Gigabyte Windforce 4090 has very similar coil whine. It sucks.


----------



## bmagnien

28.5k Port Royal - about the best I can manage so far without colder ambients or a block








I scored 28 501 in Port Royal


AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com




This is with daisy chained risers and an MP5works block strapped to the backplate for kicks (doubt it’s doing anything)


----------



## galsdgk

bmagnien said:


> Where do you see that the PNY is reference? It's not on Alphacool's own compatibility list


Nobody has done a PNY teardown yet. But they are always reference PCBs, and Alphacool says the reference PCB works. If you don't want to gamble, just wait. But you're right that the documentation is lacking. Early days still.


----------



## pt0x-

Officially part of the "Obese GPU Club" with a Strix 4090 OC.

Replacing my Strix 2080 Ti. I love the card so far; less coil whine, but still noticeable (using the same chokes, so to be expected). Super fast, unreal really. There is just one slight problem, a known problem, but I just think it's funny so I'm going to post some pictures 

The thing (aka Absolute Unit) will not fit an O11 Dynamic XL (!!!) with a res mounted to the side fans. I can run it, but literally can't put on the glass panel... because the card is sticking out. There is no space to move it back a few slots either. Waiting for Watercool to release their block so I can add it to my loop; just ran some baselines for now and will check if the card is any good for OCing later.


----------



## changboy

I don't know if I can unplug the tubes from this Gigabyte Aorus Extreme. I think I'd need to put the rad outside, behind my tower, if I keep it?


----------



## yt93900

I'm more interested if that fan cable splitter is easily removable without having to disassemble the whole card. Not really wanting to try it on mine.
Regarding the tubes - 99% sure they are not removable, or maybe only once ;-)


----------



## changboy

yt93900 said:


> I'm more interested if that fan cable splitter is easily removable without having to disassemble the whole card. Not really wanting to try it on mine.
> Regarding the tubes - 99% sure they are not removable, or maybe only once ;-)


What do you mean by fan cable splitter?


----------



## changboy

Why didn't they make this card with G1/4 connectors xD


----------



## yt93900

The cable that goes from the card to the 3 fans, how much effort does it take to remove it. I'd like to try to make my own cable with normal 4-pin PWM plugs, instead of the "GPU style" ones supplied.


----------



## changboy

yt93900 said:


> The cable that goes from the card to the 3 fans, how much effort does it take to remove it. I'd like to try to make my own cable with normal 4-pin PWM plugs, instead of the "GPU style" ones supplied.


Maybe not so hard to do. It's plugged into the GPU board and goes to the fans.
You'd just need the matching connector for the board.


----------



## changboy

What would be silly is buying this card and then buying a waterblock for it.


----------



## changboy

Are there lights on the card itself, or is it just black? I can't tell from the pictures online.
I don't have it in hand yet because I need to go to the store tomorrow morning to grab it.


----------



## schoolofmonkey

pt0x- said:


> Officially part of the "Obese GPU Club" with a Strix 4090 oc.
> 
> Replacing my Strix 2080 Ti, I love the card so far, less coil wine, but still noticeable (using the same chokes so to be expected). Super fast, unreal really.. There is just one slight problem, a known problem, but I just think it's funny so im going to post some pictures
> 
> The thing (aka. Absolute Unit) will not fit an 011 Dynamic XL (!!!) with a res mounted to the side fans. I can run it, but literally cant put on the glass panel... because the card is sticking out.. There is no space to move it some slots backward. Waiting for Watercool to release their block so I can add it to my loop, just ran some baselines for now and will check if the card is any good for OCing later.
> 
> View attachment 2577068
> 
> View attachment 2577069
> 
> View attachment 2577070


I was thinking about replacing my O11 with the O11 XL; no point, it seems


----------



## yt93900

changboy said:


> Is there lights on the card itself or its just black ? I cant see this online.
> I dont have in my hand on t yet coz i need go at store tomorrow morning to grab it.


No, no lights at all on the Waterforce Xtreme.


----------



## changboy

yt93900 said:


> No, no lights at all on the Waterforce Xtreme.


LOOL a shame !


----------



## yt93900

Depends, I'm happy it has no RGB so I don't have to install any other RGB bloatware to disable it.


----------



## changboy

See my set-up now with a 3090 ftw3 ultra :








Not sure if I will keep it or just resell it 
Maybe it would be better for me to just get another card and install a waterblock on it.


----------



## sweepersc

VideoGameLover said:


> Hi I just ordered a 4090 Suprim Liquid X and I was watching a review of the card on youtube and the reviewer said he could only get the power slider to 111. But I guess that was just a non retail bios he was on? So the retail 4090 suprim liquid x power slider can go up to 133? Is that true? Let me know thanks


Just got mine yesterday, the power slider goes up to 125%. I used nvidia-smi to check the power limits, and it is 480W(100%)/600W(125%). Have yet to see the card reach above 510W though.
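A quick way to see where that 125% comes from: the slider percentage is just the power cap divided by the base TGP. Minimal sketch; the helper name is mine, and the 480W/600W figures are from the post above:

```python
# Power slider % = (target watts / base TGP) * 100.
# Helper name is hypothetical; wattages are from the Suprim Liquid X post.
def slider_percent(base_w: float, target_w: float) -> float:
    """Return the power-slider percentage needed to reach target_w."""
    return target_w / base_w * 100

print(slider_percent(480, 600))  # -> 125.0 (the Liquid X slider maximum)
```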


----------



## X79guy

I currently have a Giga Gaming OC 4090 but it's too large for my Meshy. I have a 4090 Trio on the way, but after seeing the PCB shots it looks like the Gaming OC has the better electrical setup. Will I be downgrading by going to the Trio? I don't plan on overclocking as it's going to be in a Meshy, as stated before.


----------



## J7SC

sweepersc said:


> Just got mine yesterday, the power slider goes up to 125%. I used nvidia-smi to check the power limits, and it is 480W(100%)/600W(125%). Have yet to see the card reach above 510W though.


...600W would likely only happen if the other boost-algorithm inputs, among them general GPU, hotspot and VRAM temps, allow for it, in addition to 1.1V. The highest I've seen on my Giga-Gaming-OC (133% / 600W) is ~575W per HWInfo. Hotspot spoiled the party...


----------



## bmagnien

Went from 26,167 at bone stock right out of the box, to 28,501 after about an hour of tweaking on the giga-oc. That’s a 9% gain, or the difference between 110 fps and 120 fps. Looking forward to a couple more percents once the water block comes in. Who said overclocking is dead this gen?


----------



## mirkendargen

bmagnien said:


> Went from 26,167 at bone stock right out of the box, to 28,501 after about an hour of tweaking on the giga-oc. That’s a 9% gain, or the difference between 110 fps and 120 fps. Looking forward to a couple more percents once the water block comes in. Who said overclocking is dead this gen?


Whiners that don't have cards themselves.


----------



## MikeGR7

X79guy said:


> I currently have a Giga Gaming OC 4090 but its too large for my meshy. I have a 4090 Trio on the way, but after seeing the PCB shots it looks like the Gaming OC has the better electrical setup. Will I be downgrading by going to the Trio? I don't plan on overclocking as its going to be in a meshy as stated before.


The electrical downgrade is not noteworthy but the cooler is inferior and you may have some coil whine.

If you ain't clocking it though then the cooler becomes irrelevant too so have fun.

I just hope you managed a better price because its an inferior model. A 100 - 150 euro discount would be logical.


----------



## N19htmare666

sweepersc said:


> Just got mine yesterday, the power slider goes up to 125%. I used nvidia-smi to check the power limits, and it is 480W(100%)/600W(125%). Have yet to see the card reach above 510W though.


Try kombustor 5200mb donut on a high resolution and see what it goes to?


----------



## Arizor

Just a reminder, folks: for games, if you're on air and want to keep things cool, just set the clock to a max of ~2800, memory to 12000 (+1500) and PL to 90%; you'll exceed stock performance and keep temps in the 60s.


----------



## J7SC

bmagnien said:


> Went from 26,167 at bone stock right out of the box, to 28,501 after about an hour of tweaking on the giga-oc. That’s a 9% gain, or the difference between 110 fps and 120 fps. Looking forward to a couple more percents once the water block comes in. Who said overclocking is dead this gen?


Nice! FYI, I emailed Phanteks about a water block for the Giga-G-OC, but got no real info back other than 'maybe, soon'... either Alphacool or Bykski might beat them to it. This card really needs a full water block for the final push (500W+) - I've tested what my most efficient VRAM speed is and know the max core OC at 1.05V and 1.1V, but to put it all together at PL 133% I need to defeat that hotspot temp with the high ambient we have had. Might do a few more benchies on the weekend though, as it is starting to cool down now.


----------



## X79guy

MikeGR7 said:


> The electrical downgrade is not noteworthy but the cooler is inferior and you may have some coil whine.
> 
> If you ain't clocking it though then the cooler becomes irrelevant too so have fun.
> 
> I just hope you managed a better price because its an inferior model. A 100 - 150 euro discount would be logical.


Would the MSI Supreme X (not liquid) also have the same coil whine? My Giga OC has nearly no coil whine right now.


----------



## Xavier233

X79guy said:


> Would the MSI Supreme X (not liquid) also have the same coil whine? My Giga OC has nearly no coil whine right now.


When you say nearly, is it getting better with time? At which FPS do you hear it, and if you cap it, do you still hear any Coilwhine?


----------



## MikeGR7

X79guy said:


> Would the MSI Supreme X (not liquid) also have the same coil whine? My Giga OC has nearly no coil whine right now.


It is the same design, so yes; but let's be honest, unless you have the system on your desk near you, the sound will be covered by fan noise.

Unless you get the TUF 

I would not change the card you have; tbh I would change the case, as it's cheap, and having a good GPU with minimal or no coil whine is not to be taken for granted.


----------



## X79guy

MikeGR7 said:


> It is the same design so yes but let's be honest, unless you have the system on your desk near you, the sound will be covered by fan noise.
> 
> Unless you get the Tuf
> 
> I would not change the card you have tbh i whould change the case as it's cheap and having a good GPU with minimal or no coil is not to be taken for granted


My system is on my desk, right near me. It's an m-ITX build, a Meshy; the Meshy is the largest of the true ITX cases. I have a 900D build as well, but this one needs to be portable.


----------



## X79guy

Xavier233 said:


> When you say nearly, is it getting better with time? At which FPS do you hear it, and if you cap it, do you still hear any Coilwhine?


My Giga OC has no coil whine until about 400 fps, which for me never happens since I play at 4K/240Hz. I have returned many cards due to coil whine in the past, so if the Trio has it bad, it's going back.


----------



## X79guy

Also, the only cards that will fit my case requirements (the case can't be changed) are the Giga Windforce, MSI Suprim X, MSI Trio, and the FE. I don't think the FE would do well in a Meshy since the back of the card is against the motherboard tray.

From my findings, it looks like the Windforce has the worst electrical setup of all of them.


----------



## GraphicsWhore

X79guy said:


> Would the MSI Supreme X (not liquid) also have the same coil whine? My Giga OC has nearly no coil whine right now.


Any card can have coil whine because it's part of the nature of electronics, especially high-powered ones like GPUs.


----------



## Mad Pistol

I'm changing my tune on DLSS Frame Generation. It's magic.

I just played Spider Man maxed out at 4K using frame generation, and it stays locked at 120 FPS. It's perfectly smooth and looks fantastic!

The best part? The GPU stays around 55-58C and only uses about 250 watts. That's significantly less than my RTX 3080.

The RTX 4090 is a freakin game changer. It has the ability to go stupidly high on performance, but for most cases, it just stays cool and quiet, even at 4K.


----------



## X79guy

GraphicsWhore said:


> Any card can have coil whine because it's part of the nature of electronics, especially high-powered ones like GPUs.


Any card _can_ have coil whine. They usually don't vary much within the same model (e.g. 4090 TUF) though, as it's the PCB design that determines the level of whine. My Giga OC has no whine up until about 400fps at high loads, and even then it's quiet. That was why I asked about the Suprim X vs Trio. It does appear that people are having the most whine with the Trio. The Suprim X has the same PCB design, as I just found out, so it too is likely to have unacceptable (to me) whine.

An interesting aside: it looks like this generation is going to be dominated by MSI in terms of electrical quality, especially the Suprim line of cards. The number of SMDs they use is likely the main contributor to their higher coil whine though. Trade-offs. Always trade-offs.


----------



## VideoGameLover

sweepersc said:


> Just got mine yesterday, the power slider goes up to 125%. I used nvidia-smi to check the power limits, and it is 480W(100%)/600W(125%). Have yet to see the card reach above 510W though.


Cool, thanks for the reply. So this card is not power-limited at all, is that correct? The other cards go up to 133 percent because they start at 450 watts, while 480W at 125% is 600W, so this card can reach maximum power too? Let me know, thanks.
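For what it's worth, the arithmetic in the question checks out: both slider ranges land on roughly the same ~600W connector ceiling. A minimal sketch (the dict and labels are mine; the numbers are from the posts above):

```python
# Both BIOS configurations converge on the same ~600 W cap.
# (base TGP in watts, max slider %) - values quoted in the thread.
cards = {
    "Suprim Liquid X": (480, 125),
    "450W-base cards": (450, 133),
}
for name, (base_w, max_pct) in cards.items():
    print(f"{name}: {base_w * max_pct / 100:.1f} W ceiling")
```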


----------



## Xavier233

GraphicsWhore said:


> Any card can have coil whine because it's part of the nature of electronics, especially high-powered ones like GPUs.


Is there any "surgical" way to stop it?


----------



## J7SC

Not sure if this has been posted here yet, but Digital Foundry has quite a detailed, free-standing vid on DLSS 3


----------



## mirkendargen

Xavier233 said:


> Is there any "surgical" way to stop it?


Sometimes it can be caused by voltage fluctuations from the PSU and a new PSU or ferrite loops on the power cables can help.

Sometimes extra thermal pads on the VRMs to dampen vibration helps depending on how much there already is.


----------



## mattskiiau

No whine on my Trio under full load. Everyone's experience will be different, as usual.


----------



## schoolofmonkey

Weekend fun.
Got to give it to Galax, the box is sparkly


----------



## Arizor

My TUF has zero coil whine, this is it running heaven benchmark (so GPU maxxed churning hundreds of frames).


----------



## X79guy

The TUF has a lot more through-hole capacitors, and so I would expect the TUF to have the least amount of whine compared to the other AIB designs.

I will report back tomorrow with the comparison of the Trio with my Giga OC.


----------



## J7SC

Arizor said:


> My TUF has zero coil whine, this is it running heaven benchmark (so GPU maxxed churning hundreds of frames).


What's your TUF's hotspot delta (max hotspot minus max general GPU temp) in HWInfo or GPU-Z at 100% PL (stock), and again with the PL slider at full power? How about other folks' 4090s? FYI, on my air-cooled Giga-G-OC it ranges from a 12.4C delta to 17C+


----------



## Mad Pistol

I don't know if this has been said before, but the RTX 4090 feels like 8800 GTX levels of epic-ness. In Ray Tracing, it is over 2x faster than the 3090 Ti, and in rasterization at 4K, it's about 2x. This is a 4K120 card. I've played...

Control
Cyberpunk 2077
Watch Dogs: Legion
Spider Man
... and all of them can run 4K max settings + RT and not even flinch. If the framerate isn't high enough, set it to DLSS Quality, and it's way more than enough to have a fantastic experience. These cards are stupid fast, and I know it sounds crazy, but I feel like I actually got my money's worth for the $1600 I paid. The RTX 3080 blew me away, but this... this is even more crazy.

Good job Nvidia. The price sucks, but whatever.


----------



## DokoBG

mattskiiau said:


> No whine on my Trio under full load. Everyone's experience will be different, as usual.


Same, my Trio has literally ZERO coil whine under any load.


----------



## KedarWolf

DokoBG said:


> Same, my Trio has literally ZERO coil whine under any load.


My GTX 5900 has ZERO coil whine, and the number is 69.3% bigger than 4090.


----------



## th3illusiveman

Shaded War said:


> Wonder how my 5800x will fare with the 4090 at 4k compared to newest cpus. All the benchmarks only showing the 3d version.
> My 4090 from newegg arrived mintues before I had to leave for work today, so I’ll be trying it tonight.


Your answer is here: RTX 4090 & 53 Games: Ryzen 7 5800X vs Core i9-12900K Review

So... not good. Note however, he is running his RAM at 4000MT/s, which means IF at 2000... that should not be stable unless he has a very golden chip, so these results may be off by a bit.


----------



## Hulk1988

Streamroller said:


> *Answer from 3D Mark support team. This moron xyx troll should be ashamed himself. ( hes score got REKT )*
> View attachment 2576969
> View attachment 2576970


Where can I find this statement? Link or source would be great. Thank you


----------



## Shaded War

Just got my MSI 4090 Trio and put it in my rig only to have a black screen. RGB on the card was working. Wasted all that time putting the support bracket in the slots, then having to take it all out and run my 3090 FE again.

I just updated the bios on my motherboard to see if that will fix it. But newest bios version is from last April for this board. Going to run DDU then install 4090 again and see if it works, otherwise I don't know what to do.

UPDATE: Got it installed after doing the motherboard bios update and DDU restart. Time to reset all my mobo settings then test the GPU.


----------



## kx11




----------



## StreaMRoLLeR

Hulk1988 said:


> Where can I find this statement? Link or source would be great. Thank you


My previous post friend


----------



## Carillo

Just received tracking information for my Bykski TUF/Strix waterblock. DHL express from China is usually 2 working days


----------



## Shaded War

Got my MSI 4090 Trio going now and WOW, it's amazing! I've never had a GPU impress me this much in the 10+ years I've been building PCs.

Had to do a BIOS update on the mobo that reset all my fan curves, so I set every fan and the AIO pump to dead-silent RPMs to see how it goes. I never could pull this off with my 3090 FE, which needed a bit more chassis airflow, but I'm trying it with the 4090.

Gave it a quick and dirty OC of 106% power, +200 core, and +1400 memory. The core went to 2910MHz and the temps stopped at 62C; it took forever for the GPU fans to even turn on. When they did, they topped out around 30% fan speed (completely silent) after 20 minutes of running my game. Even the CPU was keeping to 70C with fans barely running, and the GPU didn't seem to be cooking it like my 3090 FE did.

I tried +1500 memory and had instant artifacting. Dropped it to +1400 and it seemed fine, but I'll have to keep testing. Not sure how far I can push the core clock.


----------



## TheNaitsyrk

I have a question my dudes.

I have 4090 Suprim X.

I ran Heaven 4 at 1440p, ultra settings, all maxed.

Maximum power consumption was only 350W, can someone explain why? I put sliders in Afterburner to max.

On that note, can someone tell me why I can't adjust voltage at all?

Core went to +220; +230 kinda works but eventually fails, while +220 is fine.

(Memory goes to +1800 easily, and the core clock hits 3045 as well. I don't think that's all that high considering other people on 3DMark go as high as 3150 etc.
However, I feel that if I could adjust the voltage and flash the BIOS it would go higher.)

With 12900KS at 5.8Ghz I managed to get 262FPS (Score of 6200 if I recall right) in Heaven 4 with all settings at ultra and at 1080p I got 402FPS (10200 score).


----------



## katates

X79guy said:


> The TUF has a lot more through-hole capacitors, and so I would expect the TUF to have the least amount of whine compared to the other AIB designs.
> 
> I will report back tomorrow with the comparison of the Trio with my Giga OC.


Since there is not much stock around, would it be worth replacing my X Trio?


----------



## chainbolt

mariushauko1994 said:


> i just flashed my trinity non oc with the gigabyte OC bios and at least it shows with 600w. Havent seen what my highest powerdraw is yet


I have the AMP, and I'm also considering flashing the Gigabyte OC BIOS. After you flashed the Giga BIOS, how was it with the dual-BIOS function? Still there? I guess it's gone.


----------



## schoolofmonkey

Some quick info on the Galax 4090 SG.
With all 4 power connectors connected you will get 113% on the power slider, resulting in a max of 500W.










Some super quick benchmarks going from a 3090 to the 4090 on a 10900k:


----------



## Arizor

J7SC said:


> What 's your TUF's hotspot delta (max hotspot to max general GPU temp) in HWInfo or CPUz at 100% PL (stock) and again at 'full power' PL slider ? How about other folks' 4090s ? FYI, on my air-cooled Giga-G-OC it ranges from 12.4 C delta to 17 C+


At 100% the delta is around 12C; at 133% it's around 15C. I'd say it's about average, but of course a waterblock will bring this down considerably.


----------



## chainbolt

derthballs said:


> Has anyone with a Zotak Amp Aero who's flashed is actually showing usage of more than the 485w on the original bios when running anything? Im trying to figure out if me being restricted is having 3 (one being split into 2) rather than 4 dedicated pci-e cables going into it.


For what it's worth, the Zotac 4090 BIOS is limiting you to 495W, and that is what I see when I max out my AMP.

I get 3000MHz at around 450W with lower voltage, and adding more voltage, resulting in higher power consumption as shown below, does not change anything. The advertised 600W on some cards is therefore rather meaningless, unless you mod the PCB to get beyond 1100mV.


----------



## derthballs

chainbolt said:


> For what it matters, the Zotax 4090 BIOS is limting you to 495 Watt. And that is what I see when I max out my AMP.
> 
> View attachment 2577125


Sorry, you seem to misunderstand: I flashed to the Gigabyte BIOS and it shows 600W in GPU-Z, but the highest draw I saw was still ~490W after the flash.


----------



## chainbolt

derthballs said:


> Sorry you seem to misunderstand, i flashed to the gigabyte bios and it shows 600w in GPUZ but the highest draw i saw was still 490ish after flash.


Sorry, my misunderstanding. May I ask, why did you flash the Giga BIOS and not the FE BIOS?

As for the reason why you do not see more than the 495W allowed by the original Zotac BIOS, I think some have already speculated it might be a matter of the Zotac cable that we got. As you probably know, there are some cables, e.g. Corsair's, that connect to only 2 PSU sockets and are still rated for (claimed) 600W.


----------



## Tideman

Arizor said:


> My TUF has zero coil whine, this is it running heaven benchmark (so GPU maxxed churning hundreds of frames).


Yeah, so there's no way I'm going to accept mine then (and I never figured coil whine would bother me). Totally unacceptable. My 7 case fans have to be maxed to fully drown it out.

I managed to find one Gigabyte Gaming OC in stock and it's on its way. I'll report back.


----------



## derthballs

chainbolt said:


> Sorry, my misunderstanding. May I ask, why did you flash the giga BIOS, why not the FE BIOS?
> 
> As for the reason why you do not see more than the 495 Watt allowed by orginal ZOTAC Bios, I think some have already speculated it might be a matter of the ZOTAC cable that we got. As you probably know, there are some cables. e.g. Corsair, that connect only to 2 PSU sockets and are still good for (claimed) 600 Watt.


You can't flash the FE BIOS.

I've got a 4-leg power cable so it should do 600W, but I'm currently using 6x PCIe cables with one of them split over 2 for the 4th until I get a replacement cable.


----------



## Glottis

Tideman said:


> Yeah so theres no way I'm going to accept mine then (and I never figured coil whine would bother me). Totally unacceptable. My 7 case fans have to be maxed to fully drown it out.
> 
> I managed to find one Gigabyte Gaming OC in stock and it's on its way. I'll report back.


I don't even understand how such a huge disparity happens. Some cards from the same vendor have excessive coil whine, while other cards do not. I read some guesstimates that 30-50% of 4090 TUFs have coil whine (based on people who tested multiple TUF cards). Does Asus not test for this? Are they using differently sourced components? And most importantly, why is this a lottery on a flagship halo product which should be perfect? Why isn't the media covering this?


----------



## chainbolt

derthballs said:


> You cant flash the FE bios.
> 
> Ive got a 4 leg power cable so it should do 600w, but im currently using 6 x pci-e cables with one of them split over 2 for the 4th till i get a replacement cable.


Oh, I see. Thanks for the info re the FE, saves me some time.

For the cable, I am in the same situation: using 6x PCIe cables with one of them split over 2. What replacement cable are you getting? 

After flashing the Giga BIOS, what happened to the original dual-BIOS function?


----------



## derthballs

chainbolt said:


> Oh, I see. thanks for the info re FE, saves me some time.
> 
> For the cable, I am in the same situation: using 6 x pci-e cables with one of them split over 2. What replacement cable are you getting?
> 
> After flashing thr giga BIOS, what happened with the original dual BIOS function?


You can still press the button and switch to the Zotac BIOS. I've currently got the Gigabyte on the fast BIOS and the Zotac on the quiet BIOS.

I just ordered another Corsair PCIe cable so I can run 4 separately from my PSU, just to see if I can draw the 600W with that; I'll get the Corsair 600W cable when stock is available again.


----------



## AvengedRobix

derthballs said:


> You can still press the button and switch to the zotak bios. Ive currently got the gigabyte on the fast bios and the zotak on the quiet bios.
> 
> I just ordered another pci-e corsair cable so i could run 4 seperately from my psu, just to see if i can draw the 600w with that, ill get the corsair 600w cable when stock is available again.


I've got an AMP Extreme with 4x 8-pin connected... flashed the Gigabyte BIOS and it benches easily at 600W 😉


----------



## IntelMistakes

heptilion said:


> Hi, can confirm switching to Q changes the power target to 100%, but P is still at 90%. Although Q mode is at 100%, it's still capped at 450W, but the max is 133%, whereas P mode is 120% max, which makes sense.
> 
> Port Royal GPU score: 25700
> Time Spy: 35800
> TimeSpy Extreme: 19500
> SpeedWay: 9950
> 
> GPU clock is at 2760


So, our cards are fine?

I have almost the same scores as you, both in P Mode and Q Mode.

With a 12900K and 32GB of DDR5 RAM at 5600MHz.


Thanks.


----------



## chainbolt

AvengedRobix said:


> I've amp Extreme and 4 8pin connected.. flashed gigabyte BIOS and bench Easy at 600w 😉


Cool!

VGA Bios Collection | TechPowerUp

There are 2 Gigabyte BIOS listed. Which one did you flash? I guess the "default" BIOS 95.02.18.00.C1


----------



## AvengedRobix

chainbolt said:


> Cool!
> 
> VGA Bios Collection | TechPowerUp
> 
> There are 2 Gigabyte BIOS listed. Which one did you flash?


This Gigabyte RTX 4090 VBIOS


----------



## derthballs

AvengedRobix said:


> I've amp Extreme and 4 8pin connected.. flashed gigabyte BIOS and bench Easy at 600w 😉


Smashing, thanks for that - you actually see 600w in your power draw when benching? Looking forward to getting my cable now!


----------



## AvengedRobix

derthballs said:


> Smashing, thanks for that - you actually see 600w in your power draw when benching? Looking forward to getting my cable now!


Yes.. on timespy and Port royal.. now i'm waiting for a waterblock


----------



## VideoGameLover

Hi, has anyone flashed their MSI 4090 Gaming Trio with a 600-watt BIOS? If so, which BIOS did you use? Thanks, let me know.


----------



## heptilion

IntelMistakes said:


> So, our cards are fine?
> 
> I have almost the same scores as you, both in P Mode and Q Mode.
> 
> With 12900K and 32 GB of DDR5 RAM at 5600Mhz.
> 
> 
> Thanks.


I would think so.

I'm running a 5950X with 3800MHz tweaked timings.


----------



## BeZol

Done with some benchmarks.

Gainward Phantom GS RTX 4090 here.

GS bios with 450W TDP no tune --> 19.700 Graphics Score at Time Spy Extreme
GS bios with 450W TDP tuned +260 GPU +1400 mem --> 20.700 GS
GS bios with 515W max TDP tuned +260 GPU +1400 mem --> 21.022 GS
STRIX bios with 450W TDP no tune --> 19.700 GS (should be 500W...)
STRIX bios with 575W max TDP tuned +260 GPU +1400 mem --> 21.211 GS

-->

+11% TDP (GS BIOS) --> +1.5% performance
+20% TDP (Strix BIOS) --> +2.5% performance

Everyone can decide if it's worth overclocking or not...

Buy the cheapest possible RTX 4090 and you are good to go.
Do some fine tuning without raising the TDP and that's it.

In case anyone needs the GS BIOS for their non-GS GPU, find it attached; just remove the .pdf at the end of the name.
(The GS version has 4x 8-pin, not 3x 8-pin!)
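One way to read these numbers is Graphics Score per watt of TDP; a minimal Python sketch using only the scores reported above (the labels are just shorthand, not official SKU names):

```python
# Time Spy Extreme Graphics Score per watt of TDP, from the tuned
# runs reported above (Gainward Phantom GS RTX 4090).
runs = {
    "GS BIOS, 450 W":    (450, 20_700),
    "GS BIOS, 515 W":    (515, 21_022),
    "Strix BIOS, 575 W": (575, 21_211),
}

for name, (tdp_w, score) in runs.items():
    print(f"{name}: {score / tdp_w:.1f} points per watt")
```

Efficiency falls from roughly 46 to 37 points per watt across those runs, which is why capping the TDP and fine-tuning the curve is the sensible default.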


----------



## yzonker

TheNaitsyrk said:


> I have a question my dudes.
> 
> I have 4090 Suprim X.
> 
> I have run Heaven 4 at 1440p, ultra settings, all maxed.
> 
> Maximum power consumption was only 350W, can someone explain why? I put sliders in Afterburner to max.
> 
> On that note, can someone tell me why I can't adjust voltage at all?
> 
> Core went to +220, +230 kinda works but fails, but +220 to core works fine.
> 
> (Memory goes to +1800 easily, and core clock 3045 as well (I don't think it's that high considering other people on 3DMark go as high as 3150 etc)
> However, I feel that if I adjust the voltage and flash the BIOS it will go higher.)
> 
> With 12900KS at 5.8Ghz I managed to get 262FPS (Score of 6200 if I recall right) in Heaven 4 with all settings at ultra and at 1080p I got 402FPS (10200 score).


Heaven was probably voltage limited at 350w. You should be able to adjust voltage with the latest Afterburner beta.


----------



## TheNaitsyrk

yzonker said:


> Heaven was probably voltage limited at 350w. You should be able to adjust voltage with the latest Afterburner beta.


Do you know which version I need? Thanks in advance.


----------



## Toony90

There is an error in the list on the 1st page. The MSI Suprim Liquid has a 530W BIOS, not 600W as it says.


----------



## yzonker

TheNaitsyrk said:


> Do you know which version I need? Thanks in advance.











MSI Afterburner 4.6.5 (Beta 2) Download | www.guru3d.com


----------



## yzonker

Toony90 said:


> There is an error in the list that is on the 1st page. Msi suprim liquid has a 530W bios, not 600W as it says.


According to some other posters, the actual shipping bios is 600w. We'll find out for sure whenever bios can be uploaded to TPU.


----------



## Anvi

Gilgam3sh said:


> I'm waiting to get my PNY 4090 delivered, anyone here using it and have some feedback on it? thanks
> 
> 
> 
> 
> 
> 
> GeForce RTX 4090 24GB XLR8 VERTO EPIC-X RGB Triple Fan | pny.com
> 
> 
> The GeForce RTX 4090 24GB XLR8 GPU (powered by NVIDIA Ada Lovelace) features enhanced RT and Tensor Cores, 8K HDR, & the world's fastest G6X memory.
> 
> 
> 
> 
> www.pny.com


*PNY RTX 4090 24GB XLR8 Gaming EPIC-X RGB*

450W power limit.
Unable to increase past 100%. Haven't tested a BIOS flash.

Very sturdy build quality; the cooler is properly attached to the rear bracket to prevent sagging.
Very quiet and cool card.
I'm sure all 4090 cards are, though.

Has no fan-stop function, minimum 30% speed (~1k rpm), but it's quite inaudible.
Boost clocks at stock: 2685 - 2700 MHz
Overclocked GPU offset +250: ~2955 MHz
Overclocked gaming GPU temperature: about 64 - 66 degrees Celsius
MEM OC offset +1000 stable, +1500 is not stable
Memory runs a bit warm, just like on the TUF cards.

GPU anti-sag bracket included, which seems to be pretty much identical to the Lian Li GB-001.
Install it in the 2 bottom-right ATX motherboard screw holes as long as there's no port or cable obstruction.
90-degree SATA ports won't interfere.
Do not install it in the middle ATX motherboard screw holes; the anti-sag bracket will get caught up in the middle GPU fan.


----------



## TheNaitsyrk

yzonker said:


> According to some other posters, the actual shipping bios is 600w. We'll find out for sure whenever bios can be uploaded to TPU.


That would be super amazing.

If anyone here's got this BIOS, please upload.


----------



## msky73

msky73 said:


> View attachment 2577000
> 
> 
> Inno3d X3 OC and Alphacool waterblock. Basically the same what soon-to-come Frostbite will be.


I didn't move to water for OCing but for comfort, having had the custom loop anyway. And the Inno3D has only 14 power phases and is locked at 100% power limit, so I run the stock BIOS.

The card runs 2820MHz +-15MHz on the default curve. I moved it by 180MHz + 1000MHz on the memory (didn't try more due to the PL). Under full load it sits at 45-50C (I have 120+360+360 slim rads for the GPU and 5950X). In games 35-50C, strongly depending on load (I use a 4K/144Hz monitor with fps capped to 141).
Some 3DMark numbers:
Port Royal 25930 *Port Royal OC **27133*
Time Spy Extreme GPU 19606 *Time Spy Extreme GPU OC 20422*
DLSS3 57:175

Game with DLSS2 (Shadow Warrior 3) - I made a few comparisons before replacing 3090:


----------



## chainbolt

AvengedRobix said:


> This Gigabyte RTX 4090 VBIOS


Done! 


I flashed the 600 Watt BIOS to the Zotac Extreme, which is by default BIOS-limited to 495 Watt.
Max power consumption went up from 495 Watt to 625 Watt.
Gain is 15 MHz, from 3000 MHz to 3015 MHz.
Gain in FPS is zero.
Conclusion 1: Not necessary: higher energy cost and higher temps for zero gain.
Conclusion 2: Undervolting is the way to go.


----------



## Benni231990

Has somebody tried to UV the card at 3GHz and 1V - 1.025V?


----------



## TheNaitsyrk

chainbolt said:


> Done!
> 
> 
> I flashed the 600 Watt BIOS to the Zotak Extreme, which is by default BIOS limited to 495 Watt.
> Max power consumption went up from 495 Watt to 625 Watt
> Gain is 15 MHz from 3000 MHz to 3015 MHz.
> Gain in FPS is zero.
> Conclusion 1: Not necessary: higher energy cost and higher temps for zero gain.
> Conclusion 2: Undervolting is the way to go.


This would be because your temperature is 86C and you're dropping clocks.


----------



## lordkahless

I had been planning on a 4090 FE but it looks like I had an order go through for the TUF (non-OC model). I haven't heard much on the TUF. How is it doing for people? It's a 600-watt model, correct? I plan on putting an EK block on it right away.


----------



## chainbolt

TheNaitsyrk said:


> This would be because your temperature is 86C and you're dropping clocks.


AFAIK, there is no throttling at 86C yet.


----------



## TheNaitsyrk

chainbolt said:


> AFAIK, there is no throtteling at 86C yet.


It's not about throttling. The lower the temperature, the higher the card boosts, but because you're at 86C, you don't see any benefit.

For instance: my old 3090 Strix would go to 2145MHz at 33C, but at 50C that would drop to 2105MHz. If you catch my drift.

This is why everyone is water-blocking their cards.

Similar to the 12th and 13th gen i9s' Thermal Velocity Boost: if below 70C you get your 5.8GHz or whatever, but as soon as you exceed 70C it drops to 5.7GHz. A GPU is the same, but it's much more sensitive.
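The bin-dropping behaviour described above can be sketched as a toy model. To be clear, this is not NVIDIA's actual GPU Boost algorithm: the 40C threshold and 5C-per-bin step are illustrative assumptions; only the 15 MHz bin size is the commonly observed step.

```python
# Toy model of temperature-dependent boost (assumed parameters, NOT
# NVIDIA's real algorithm): above a threshold, the card sheds one
# 15 MHz boost bin every few degrees C.
def boost_clock(base_boost_mhz, temp_c, threshold_c=40, deg_per_bin=5):
    """Estimate the effective boost clock at a given core temperature."""
    if temp_c <= threshold_c:
        return base_boost_mhz
    bins_lost = (temp_c - threshold_c) // deg_per_bin
    return base_boost_mhz - 15 * bins_lost

# The 3090 Strix example above: 2145 MHz at 33C vs ~2105 MHz at 50C.
print(boost_clock(2145, 33))  # 2145
print(boost_clock(2145, 50))  # 2115
```

With these assumed parameters the model gives 2115 MHz at 50C, close to (but not exactly) the 2105 MHz observed; the real thresholds are not public, which is the point of the lottery complaints.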


----------



## chainbolt

lordkahless said:


> I had been planning on a 4090 FE but it looks like I had an order go through for the Tuff (non-OC model). I haven't heard much on the Tuff. How is it doing for people? Its a 600 watt model correct? I plan on putting an EK block on it right away.


Do not get confused about the 600 Watt thing. First of all, you can flash such a BIOS to any card. Secondly, it does not do anything of noticeable value for gaming, other than raising your temperature. As for the TUF, it has only 18 voltage power phases, while the better cards (and the FE) all have at least 20 or 24.


----------



## lordkahless

chainbolt said:


> Do not get confused about the 600 Watt thing. First of all, you can flash such BIOS to any card. Secondly, it does not do anything of noticable value for gaming, other than raising your temperature. About the TUF, it has only 18 voltage power phases, while the better cards (and the FE) all have at least 20 or 24.


Does the lower phase count make the card run hotter, thus reducing the potential for 3000+ MHz clocks?


----------



## Hanks552

So, the ASUS TUF is still the best 4090 to buy?
Since it's 600W at a $1,599 MSRP.


----------



## Rbk_3

Anyone have a worse memory overclocker than me? I can only do +900; at +950 I start artifacting.


----------



## Mad Pistol

TheNaitsyrk said:


> I have a question my dudes.
> 
> I have 4090 Suprim X.
> 
> I have run Heaven 4 at 1440p, ultra settings, all maxed.
> 
> Maximum power consumption was only 350W, can someone explain why? I put sliders in Afterburner to max.
> 
> On that note, can someone tell me why I can't adjust voltage at all?
> 
> Core went to +220, +230 kinda works but fails, but +220 to core works fine.
> 
> (Memory goes to +1800 easily, and core clock 3045 as well (I don't think it's that high considering other people on 3DMark go as high as 3150 etc)
> However, I feel that if I adjust the voltage and flash the BIOS it will go higher.)
> 
> With 12900KS at 5.8Ghz I managed to get 262FPS (Score of 6200 if I recall right) in Heaven 4 with all settings at ultra and at 1080p I got 402FPS (10200 score).


These GPUs only use the power that they need to operate. A lot of that 450-watt power budget deals with advanced features like Ray Tracing, which are traditionally very power hungry. Heaven is an old benchmark, so it doesn't use those features.


----------



## chainbolt

Hanks552 said:


> So, the ASUS TUF is still the best 4090 to buy?
> Since is 600w and msrp 1599


Doubtful. You can flash any 4090 with a 600 Watt BIOS. And the TUF seems to have slightly inferior circuitry compared to the better cards, which have 20 or more voltage power phases; the TUF has only 18. It looks to me more like an Asus marketing gimmick to advertise this card with 600 Watt. But in the end, it's all about the silicon lottery anyway.


----------



## TheNaitsyrk

Hanks552 said:


> So, the ASUS TUF is still the best 4090 to buy?
> Since is 600w and msrp 1599


I think the Suprim X has the most: 26 (GPU) + 4 (memory). It's kind of a letdown with the 520W BIOS, but it can be flashed, so not a problem.


----------



## cstkl1

This game, even on a 3090 with a 12900K, is difficult at 1440p. You need it super stable; PSU power draw easily hits 700-800W.

NFS Heat - Strix 4090, 4K maxed, over 160-180 fps


----------



## yzonker

It'll be interesting to see where AMD really ends up. 









AMD Radeon RX 7000 GPUs Rumored For At Least A 2X Performance Uplift In Games


A leaker suggests AMD's next-generation Radeon parts will be a huge leap forward from the current-generation cards.




hothardware.com


----------



## Xavier233

Gigabyte 4090 OC (or similar) owners: When I set a fixed fan curve for my card, and I reboot, it somehow does not get applied, the fans are off. Am I missing some settings in the Gigabyte software?


----------



## Xavier233

yzonker said:


> It'll be interesting to see where AMD really ends up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> AMD Radeon RX 7000 GPUs Rumored For At Least A 2X Performance Uplift In Games
> 
> 
> A leaker suggests AMD's next-generation Radeon parts will be a huge leap forward from the current-generation cards.
> 
> 
> 
> 
> hothardware.com


My issue with AMD is not the hardware. I had a 6800XT, and hardware-wise it is amazing. My biggest issue is the drivers. They will unfortunately never be on par with Nvidia's, not now and not in the future, and the reason for that actually goes beyond AMD itself, to how games are developed and how willing developers are to make AMD cards work the same way as Nvidia's. For that reason, I will always favor Nvidia for gaming.


----------



## sniperpowa

TheNaitsyrk said:


> I think Suprim X has most, 26(GPU)+4(MEMORY). It's kind of let down with 520W BIOS. But can be flashed so not a problem.


I got the Liquid X; it's 600W, at least mine is. 480W default + 125% PL.


----------



## AvengedRobix

chainbolt said:


> Done!
> 
> 
> I flashed the 600 Watt BIOS to the Zotak Extreme, which is by default BIOS limited to 495 Watt.
> Max power consumption went up from 495 Watt to 625 Watt
> Gain is 15 MHz from 3000 MHz to 3015 MHz.
> Gain in FPS is zero.
> Conclusion 1: Not necessary: higher energy cost and higher temps for zero gain.
> Conclusion 2: Undervolting is the way to go.


Depends on temp.. i use the the 600w BIOS only for benchmark and with oc i can achieve near 3100mhz


----------



## TheNaitsyrk

sniperpowa said:


> I got the liquid x it’s 600w atleast mine is. 480w default +125% pl


Could you provide the BIOS?

Also, after increasing the voltage and switching to the performance BIOS I managed to get a consistent 3135 clock, and after the temp went up a bit to 52 it settled at 3105. With a waterblock this thing will be a beast.


----------



## VideoGameLover

sniperpowa said:


> I got the liquid x it’s 600w atleast mine is. 480w default +125% pl


Does the liquid x have 26(GPU)+4(MEMORY) as well?


----------



## GQNerd

sniperpowa said:


> I got the liquid x it’s 600w atleast mine is. 480w default +125% pl


Please upload the vBIOS or post proof. I have the Liquid X and the OC TGP is 530W.


----------



## TheNaitsyrk

Got some pics
























If it helps, after raising the voltage and changing to the performance BIOS, wattage in Heaven 4 increased by 50W to 400-410W.

CPU is a 12900KS at 5.8GHz, 2x16GB DDR5 6800MHz CL30, 4090 Suprim X performance BIOS, latest MSI Afterburner beta, and everything cranked to max. PSU is an AX1600i.


----------



## VideoGameLover

Miguelios said:


> Please upload the vbios or post proof.. I have the Liquid X and the OC TGP is 530w


If this is true, is there a 600W firmware we can use for the Liquid that will boost it to 600W?


----------



## GQNerd

VideoGameLover said:


> If this is true is their a 600w firmware we can use for the liquid that will boost it to 600w?


I assume the Gigabyte OC vbios would be a safe bet, but I haven't tried yet.. only had the Suprim for less than a day so far, lol


----------



## VideoGameLover

Miguelios said:


> I assume the Gigabyte OC vbios would be a safe bet, but I haven't tried yet.. only had the Suprim for less than a day so far, lol


Let me know if it works. That would be great if it does. The Liquid would have the most phases and the highest power draw then.


----------



## 8472

Looks like Newegg delisted their MSI 4090s. Also there haven't been any drops from Newegg or Bestbuy today according to the tracker I'm following.


----------



## VideoGameLover

VideoGameLover said:


> Let me know if it works. That would be great if it does. The liquid would have the most phases and the high cpu draw then


It might not work though, because of the liquid cooling. I hope it does though.


----------



## 8472

More coil whine. Video is timestamped.


----------



## kx11

Yeah, a 240mm AIO is not cooling a damn 500W GPU.


----------



## X79guy

That coil whine would have me putting that card back in the box and sending it back.


----------



## N19htmare666

TheNaitsyrk said:


> I have a question my dudes.
> 
> I have 4090 Suprim X.
> 
> I have run Heaven 4 at 1440p, ultra settings, all maxed.
> 
> Maximum power consumption was only 350W, can someone explain why? I put sliders in Afterburner to max.
> 
> On that note, can someone tell me why I can't adjust voltage at all?
> 
> Core went to +220, +230 kinda works but fails, but +220 to core works fine.
> 
> (Memory goes to +1800 easily, and core clock 3045 as well (I don't think it's that high considering other people on 3DMark go as high as 3150 etc)
> However, I feel that if I adjust the voltage and flash the BIOS it will go higher.)
> 
> With 12900KS at 5.8Ghz I managed to get 262FPS (Score of 6200 if I recall right) in Heaven 4 with all settings at ultra and at 1080p I got 402FPS (10200 score).


Try MSI Kombustor's 5200MB donut test at a high resolution and see what it goes to.


----------



## J7SC

kx11 said:


> Yeah 240mm Aio is not cooling a damn 500w gpu


It's probably better than many air coolers, which are fine for most gaming at 450W or below, but 500+ W still needs water cooling, IMO. Then there's the fact that most AIO rads are aluminum and single-core, instead of copper/brass and double- or triple-core. Below on the left, a typical 360mm AIO rad; on the right, a triple-core rad...


----------



## Mad Pistol

TheNaitsyrk said:


> Got some pics
> View attachment 2577172
> 
> View attachment 2577173
> 
> View attachment 2577174
> 
> If it helps, after raising voltage and changing to performance BIOS, wattage in Heaven 4 increased by 50W to 400-410W.
> 
> CPU is 12900KS at 5.8Ghz, 2x16GB DDR5 6800Mhz CL30, 4090 Suprim X performance bios, latest MSI afterburner beta and all cranked to max. PSU is AX1600i


That's a eureka moment for me. I wasn't aware the voltage was "unlocked" on my RTX 4090 (up to 1.1V, obviously).
Now benching at +255 on the core, which leads to 3060 MHz. I'm over freakin' 3GHz on the worst of the worst RTX 4090 models. Hot damn!


----------



## TheNaitsyrk

Mad Pistol said:


> That's a eureka moment for me. I wasn't aware the voltage was "unlocked" on my RTX 4090 (up to 1.1V, obviously).
> Now benching at +255 on core, which leads to 3060 mhz. I'm over freakin 3Ghz on the worst of the worst RTX 4090 models. Hot damn!


You're welcome my dude


----------



## Sir Beregond

Mad Pistol said:


> That's a eureka moment for me. I wasn't aware the voltage was "unlocked" on my RTX 4090 (up to 1.1V, obviously).
> Now benching at +255 on core, which leads to 3060 mhz. I'm over freakin 3Ghz on the worst of the worst RTX 4090 models. Hot damn!


What model did you get?


----------



## Mad Pistol

TheNaitsyrk said:


> You're welcome my dude





Sir Beregond said:


> What model did you get?


Gigabyte Windforce RTX 4090.

BTW, as soon as I said that, I fired up Port Royal and got an immediate gut punch. I had to back it down to +210 on the core to get it to complete.

3060 MHz is fine for a rasterization load, but as soon as you throw in RT, fuhgeddaboudit.

If it ain't stable in Port Royal, it ain't stable.


----------



## lordkahless

I have the 4090 TUF non-OC model on order. Can I flash the Founders Edition BIOS that is on TechPowerUp to it to get the 600 watts, or is there a TUF OC 600-watt BIOS out there?


----------



## winterrr

Anyone tried flashing the Gigabyte Windforce with the Gigabyte Gaming OC vBIOS for the 600W power limit yet? Curious if the auto fan speed stays reasonable.


----------



## J7SC

Mad Pistol said:


> Gigabyte Windforce RTX 4090.
> 
> BTW, as soon as I said that, I fired up Port Royal and got an immediate gut punch. Had to back it down to +210 on the core to get it to complete.
> 
> 3060 Mhz is fine for a rasterization load, but as soon as you throw in RT, fuhgeddaboudit.
> 
> If it ain't stable on Port Royal, it ain't stable.


I use Port Royal, 3DMark TS Extreme and Cyberpunk 2077 to check OC settings - I was elated yesterday that VRAM +1500 is stable and most efficient (so far... haven't gone higher yet), even in CP2077.


----------



## BigMack70

Mad Pistol said:


> Gigabyte Windforce RTX 4090.
> 
> BTW, as soon as I said that, I fired up Port Royal and got an immediate gut punch. Had to back it down to +210 on the core to get it to complete.
> 
> 3060 Mhz is fine for a rasterization load, but as soon as you throw in RT, fuhgeddaboudit.
> 
> If it ain't stable on Port Royal, it ain't stable.


I had to drop my core offset to +190. Was stable at +210 in PR but was crashing in Guardians of the Galaxy.


----------



## dante`afk

Is that a Logitech mouse on the left? The Asus sponsorship won't like this.



cstkl1 said:


> View attachment 2577163
> 
> 
> this game even on 3090 with 12900k is difficult on 1440p. u need superstable. easily psu power draw 700-800w
> 
> nfs heat - strix 4090 4k max over 160-180 fps


----------



## VideoGameLover

Miguelios said:


> Please upload the vbios or post proof.. I have the Liquid X and the OC TGP is 530w


Yes, please upload the proof of this, thanks so much.


----------



## VideoGameLover

sniperpowa said:


> I got the liquid x it’s 600w atleast mine is. 480w default +125% pl


Could you double-check please? There is another reviewer that can only get 530W.


----------



## Toony90

VideoGameLover said:


> Yes please upload the prof of this thanks so much.


Also, I only have 530W on my Suprim Liquid X.


----------



## BeZol

BTW, at Time Spy Extreme (which is a 4K benchmark) there is some CPU limit too...

5950X all-core OC 4.6GHz vs 5900X stock --> +200 Graphics Score with the 5900X
5900X stock vs 5900X PBO all-core -15 curve --> +200 Graphics Score with PBO

So I don't recommend using Time Spy Extreme; it's probably not 100% GPU usage the whole time. (At least with my CPUs, which aren't that bad.)

AND/OR

it could be something with the memory, because the 5950X has 2x16GB and the 5900X has 4x8GB.
Both of them are tuned to 3600MHz CL14-15-15, and I remember that for some games four memory sticks give more fps than two in CPU-bound situations. That could be the case here too.


----------



## VideoGameLover

Toony90 said:


> Also I only have 530W in my suprim liquid x.


Oh well I guess reports of it being 600w are false until someone can prove otherwise.


----------



## Toony90

VideoGameLover said:


> Oh well I guess reports of it being 600w are false until someone can prove otherwise.


I think so too, so I'm surprised that there is a 600W BIOS on the 1st page, when in every review and for most Suprim Liquid owners, including me, we see a power limit of 530W.


----------



## penguin1717

Hi guys. I think I found a way to see what bin your GPU is. You need to see what frequency your GPU boosts to at stock. For example, my RTX 4090 Suprim X boosts to 2820 max at stock. If we know that it has a rated boost clock of 2625, that means it boosts 195 MHz over the spec boost frequency. If we divide 195 by 15, we see that it is 13 increments over the spec boost. Nvidia engraves the stock V/F curve and AIBs just add their OC. The greater the difference between the rated boost clock and the actual boost clock, the better the bin. Looking across the net, I think the worst bins are 13 increments over the spec boost and the best ones are 17 increments higher. Please try it and post results.
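The method above boils down to one integer division; here is a minimal Python sketch of it (the 15 MHz step is the GPU Boost bin size the post assumes):

```python
# Sketch of the bin-estimation method described above: the gap between
# the observed stock boost and the rated boost, in 15 MHz boost bins.
def bin_increments(observed_boost_mhz, rated_boost_mhz, step_mhz=15):
    """Return how many boost bins the card runs above its rated spec."""
    return (observed_boost_mhz - rated_boost_mhz) // step_mhz

# Suprim X example from the post: 2820 MHz observed, 2625 MHz rated.
print(bin_increments(2820, 2625))  # 13
```

Run against RaMsiTo's 2955 MHz reading later in the thread, the same formula gives 22 increments, matching his claim.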


----------



## RaMsiTo

penguin1717 said:


> Hi guys. I think I found a way how to see what bin is your gpu. You need to see at what frequencies your GPU boosts at stock. In example my Rtx 4090 Suprim X boosts to 2820 max at stock. If we know that it has rated boost clock of 2625 it means that it has 195 boost over boost frequency from specs. If we divide 195 by 15 we see that it is 13 increments over boost by specs. Nvidia engraves stock V/F curve and AIBs just add their OC. The greater the difference is between rated boost clock and actual boost clock you have better bin. By looking across the net I think worst bins are 13 increments over boost by specs and the best ones will be 17 increments higher. Please try it and pist results.



suprim X 2955mhz , 22 increments 😁.


----------



## KickAssCop

lordkahless said:


> I had been planning on a 4090 FE but it looks like I had an order go through for the Tuff (non-OC model). I haven't heard much on the Tuff. How is it doing for people? Its a 600 watt model correct? I plan on putting an EK block on it right away.


The TUF works wonders. No need to block it. Mine is on the Strix BIOS and it didn't do a damn thing to improve its clock limits. Runs cool at 57-60C at 2715 MHz (with no need for overclocking). Maxes out at 2900 MHz and +1500 on the memory (same as with the TUF's original BIOS).

Putting water on these cards is a complete waste of time and money. So is overclocking them.


----------



## Sheyster

lordkahless said:


> I have the 4090 Tuff non OC model on order. Can I flash the Founders Edition bios to it that is on techpowerup up to get the 600 watts or is there a Tuff OC 600 watt bios out there?


The TUF Non-OC also comes with a 600w BIOS, same as the OC version. Only difference is the small core clock OC.


----------



## penguin1717

RaMsiTo said:


> suprim X 2955mhz , 22 increments 😁.


No, you did it wrong. You must reset your GPU to defaults and then run, let's say, 3DMark Time Spy. Then check your GPU max boost clock. I'm willing to bet it will be 2865 MHz. Try it and report back please. According to my calculations you are 16 increments, which is very good.


----------



## mirkendargen

penguin1717 said:


> No, you did it wrong. You must reset your GPU to defaults and then run let's say 3d mark TimeSpy. Then check your GPU max boost clock. I'm willing to bet it will be 2865 mhz. Try and see and report please.


You should just be able to look at your curve in Afterburner at +0 and compare values at 1.05v (or 1.1V, whichever) normalized for factory overclocks (if you know them).


----------



## penguin1717

mirkendargen said:


> You should just be able to look at your curve in Afterburner at +0 and compare values at 1.05v (or 1.1V, whichever) normalized for factory overclocks (if you know them).


Yeah, it is possible, but you must know that the curve changes with temp. We can measure it this way, but we must be certain that everyone uses the same temperature when measuring. The reason I said to measure it this way is that it's easier to compare your results to reviews online, because nobody shows their V/F curves, only boost frequencies at a given temp.


----------



## TheNaitsyrk

Can't wait for a company EKWB or any other to drop a GPU block for Suprim X. It will go under water asap.


----------



## mirkendargen

penguin1717 said:


> Yeah it is possible but you must know that curve changes by temp. We can measure this way but we must be certain that everyone uses same temperature when measuring. The reason I said it to measure it this way so it is easier to compare your results to reviews online cause nobody is showing their v/f curves, only boost frequencies at given temp.


You can just look at the curve at idle when your card is cool. People are way more likely to have varying temps under load than at idle.


----------



## penguin1717

mirkendargen said:


> You can just look at the curve at idle when your card is cool. People are way more likely to have varying temps under load than idle.


Yes you can, but then you can't compare your results with the results of other people online who run their GPUs at stock. Tell me your GPU model and your frequency at 1.050 volts at under 30C, and I will tell you your stock boost clock.


----------



## lordkahless

Sheyster said:


> The TUF Non-OC also comes with a 600w BIOS, same as the OC version. Only difference is the small core clock OC.


Really? I was watching the Hardware Unboxed review and the slider for the power target only goes to 108%, and I believe he states in his video that the non-OC does not have a 600W BIOS, unless I misunderstood.

EDIT: OK, I see he has a disclaimer in the comments that he made a mistake and the Gaming non-OC also has a 600-watt BIOS, just like you said.


----------



## RaMsiTo

penguin1717 said:


> No, you did it wrong. You must reset your GPU to defaults and then run let's say 3d mark TimeSpy. Then check your GPU max boost clock. I'm willing to bet it will be 2865 mhz. Try and see and report please. According to my calculations you are 16 increments which is very good.


I'm not doing it wrong; I commented on it several pages back. I've seen 5 Suprim X cards,
and I have only seen 2955 on my unit.


----------



## penguin1717

RaMsiTo said:


> I'm not doing it wrong, I commented on it several pages before, I've seen 5 suprim x.
> and I have only seen 2955 on my unit.
> 
> View attachment 2577243


Please try running your GPU at stock in, let's say, 3DMark Time Spy. You will see you'll hit a max frequency of 2865. Try it and see if you do not believe me. In 4090s the difference between the best bins and worst bins is about 60 MHz, maybe 75 MHz, but I doubt it. That means the silicon is pretty uniform.


----------



## yzonker

penguin1717 said:


> Hi guys. I think I found a way to see what bin your GPU is. You need to see what frequencies your GPU boosts to at stock. For example, my RTX 4090 Suprim X boosts to 2820 max at stock. Since it has a rated boost clock of 2625, that's 195 MHz over the spec boost frequency. If we divide 195 by 15, we see that's 13 increments over spec boost. Nvidia engraves the stock V/F curve and AIBs just add their OC. The greater the difference between rated boost clock and actual boost clock, the better your bin. Looking across the net, I think the worst bins are 13 increments over spec boost and the best ones are 17 increments. Please try it and post results.


There's a flaw in that theory. For example the Strix bios is 3 bins higher by default than the TUF OC bios. So unless you only compare on the same bios, the result will be skewed by different bios having different default VF curve offsets.


----------



## penguin1717

yzonker said:


> There's a flaw in that theory. For example the Strix bios is 3 bins higher by default than the TUF OC bios. So unless you only compare on the same bios, the result will be skewed by different bios having different default VF curve offsets.


But I said it: you must look at the boost clock in the specs for your card (say, via GPU-Z) and then compare it to your actual boost. Only the difference between your spec boost and actual boost at stock matters. Why isn't anyone willing to try this method? You will see that all of you fall in the 12-17 increment range, which is 75 MHz. Try it and you will see my theory is credible.
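The arithmetic behind this method is simple enough to sketch. A minimal illustration of the posts above (the helper function is hypothetical; the 15 MHz step is the bin size discussed in the thread):

```python
# Bin-counting idea from this thread: compare the max boost observed at
# stock against the boost clock on the spec sheet, in 15 MHz increments.
def boost_bins(observed_mhz: int, spec_boost_mhz: int, step_mhz: int = 15) -> int:
    """How many 15 MHz bins the card boosts above its rated clock."""
    return (observed_mhz - spec_boost_mhz) // step_mhz

# Example from the thread: Suprim X rated at 2625 MHz, boosting to 2820 at stock
print(boost_bins(2820, 2625))  # -> 13
```

Plugging in the other numbers from the thread: 2895 at stock gives 18 increments, 2955 gives 22.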


----------



## changboy

I drove 70 km to go pick up my Gigabyte Waterforce Extreme, and once there they couldn't find my card; 30 minutes later they realized they had given my card to a guy who bought a Gigabyte Gaming OC ***!

So I bought the Gigabyte 4090 Gaming OC and saved $400 on the price I was going to pay!
But the store lost $400, hahaha. I'm glad I got that one for less money, and I will be able to put a waterblock on it.

Who makes a waterblock for this Gigabyte? Just EK? Because it's twice the price of a Bykski??


----------



## penguin1717

RaMsiTo, have you tested my method?


----------



## mirkendargen

Yeah, 20 bins. Theory busted. Maybe the majority of cards are 12-17, but not all.


----------



## penguin1717

What is your GPU temp? OK, if you don't believe me, run 3DMark or Heaven and record your max GPU frequency and you will see. Why don't you want to check this via my method? I don't understand.


----------



## mirkendargen

changboy said:


> I drove 70 km to go pick up my Gigabyte Waterforce Extreme, and once there they couldn't find my card; 30 minutes later they realized they had given my card to a guy who bought a Gigabyte Gaming OC ***!
> 
> So I bought the Gigabyte 4090 Gaming OC and saved $400 on the price I was going to pay!
> But the store lost $400, hahaha. I'm glad I got that one for less money, and I will be able to put a waterblock on it.
> 
> Who makes a waterblock for this Gigabyte? Just EK? Because it's twice the price of a Bykski??


Only EK so far, and they aren't even shipping it for a month. Bykski might be making one by then.


----------



## mirkendargen

penguin1717 said:


> What is your GPU temp? Ok if you don't believe me run 3d Mark or Heaven and record your max GPU frequency and you will see. Why don't you want to check this via my method. I do not understand.


Why would my max GPU speed at load mean anything? I might be at -140C on ln2 under load, I might be 85C under load with garbage case cooling. That isn't a reliable test at all. What I'm telling you is the curve burned on the chip, which is what you say the determining factor is.


----------



## changboy

mirkendargen said:


> Only EK so far, and they aren't even shipping it for a month. Bykski might be making one by then.


Thanks. Then I'll install it on air, moving my reservoir a bit to fit this giant card, until I can buy a waterblock.


----------



## RaMsiTo

penguin1717 said:


> Ramsito have you tested my way?


I already checked and it's the same, 2955 MHz. I tried the Strix OC bios and 2955 MHz still appears; same with the Colorful bios.

1.05v: 2880

1.1v: 2955

5 Suprim X:

1 2955
2 2880
1 2850
1 2835


----------



## mirkendargen

RaMsiTo said:


> I already checked it and since it is the same, 2955 mhz, I have tried bios strix oc and 2955 mhz continues to appear, with bios colorful the same.
> 
> 5 suprim X
> 
> 1 2955
> 2 2880
> 1 2850
> 1 2835


Is that 1.05v or 1.1v?

Ok I don't feel as bad. I thought I was tied with the worst card you'd seen, I'm 2910 at 1.1v


----------



## penguin1717

mirkendargen said:


> Why would my max GPU speed at load mean anything? I might be at -140C on ln2 under load, I might be 85C under load with garbage case cooling. That isn't a reliable test at all. What I'm telling you is the curve burned on the chip, which is what you say the determining factor is.


Look man, I'm telling you, test this and you will see. Your GPU temp at stock will be between 60 and 70 degrees C. OK, I understand that you don't agree with me, but why don't you want to test this? This test needs only 2 minutes of your time. Try it and post results, and we'll see if my theory is credible. If not, I'll accept it.


----------



## GQNerd

Suprim Liquid X:

Stock Gaming vbios 530w

2910


----------



## penguin1717

RaMsiTo said:


> I already checked and it's the same, 2955 MHz. I tried the Strix OC bios and 2955 MHz still appears; same with the Colorful bios.
> 
> 1.05v: 2880
> 
> 1.1v: 2955
> 
> 5 Suprim X:
> 
> 1 2955
> 2 2880
> 1 2850
> 1 2835


You don't get it. My GPU sits on the V/F curve at 27 C at 1.1 volts at 2895 MHz. Max boost at stock is 2820 MHz. Your GPU sits at 2955 at 1.1 volts, so if you bench it at stock it will boost to 2880. Check and see if I am wrong. That is 17 increments, which is a top bin.


----------



## mirkendargen

penguin1717 said:


> Look man I'm telling you test this and you will see. Your Gpu temp at stock will be between 60 an 70 degrees C. Ok I understand that you don't agree with me but why don't you want to test this? This test needs only 2 minutes of your time. Try it and post results. And we will see if my theory is credible. If not I'll accept it.


But 60-70C is already dropping multiple bins due to temperature. If I blasted my case with a window AC unit and got it down to 50C I'd clock 1-3 bins faster changing nothing else. You have to control for temperature and you're not; it's an unreliable test. Just reading the V/F curve seems pretty damn reliable, since we have someone who's looked at 5 samples of the same card. I may well be +17 bins at load at 65C because I'm actually +20 bins but dropping 3 from temp, but no one can tell that with certainty. V/F curves can be compared with certainty.


----------



## bmagnien

KickAssCop said:


> Putting water on these cards is a complete waste of time and money. So is overclocking them.


lol. if you think +15% performance and silent operation is a waste, you're in the wrong forum kid.



penguin1717 said:


> Hi guys. I think I found a way how to see what bin is your gpu.


Lots of misguided advice and useless followups here trying to reinvent the wheel. There's already a great way to test your silicon quality: 3DMark.com - Share and compare scores from UL Solutions' benchmarks


----------



## penguin1717

mirkendargen said:


> But 60-70c is already dropping multiple bins due to temperature. If I blasted my case with a window AC unit and got it down to 50c I'd clock 1-3 bins faster changing nothing else. You have to control for temperature and you're not, it's an unreliable test. Just reading the v/f curve seems pretty damn reliable, since we have someone that's looked at 5 samples of the same card. I may well be +17 bins at load at 65c because I'm actually +20 bins but dropping 3 from temp, but no one can tell that with certainty. Looking at a v/f curve can be compared with certainty.


So we are saying the same thing. OK, look at your V/F curve at temps under 30 C: my 1.1 V frequency gives 2895 - 2625 = 270, and 270/15 = 18. RaMsiTo's is 2955 - 2625 = 330, which is 22 increments. So I'm willing to say the range is somewhere between 17 and 22 increments, meaning the difference between worst and best is 75 MHz at most.


----------



## mirkendargen

penguin1717 said:


> So we are saying the same thing. OK look at your V/F curve at temps under 30 so my 1.1 frequency is the 2895-2625 equals 270/15 equals 18. Ramsito is 2955-2625 which is 22. So I am willing to say that the range is somewhere between 17-22 increments so the difference between worst and best is 75 mhz at most.


He literally gave you a 120mhz difference in the data he showed 🤦‍♂️


----------



## penguin1717

mirkendargen said:


> He literally gave you a 120mhz difference in the data he showed 🤦‍♂️


He didn't say at which temps the readings were taken. Which model do you have? If you're on the stock bios, tell me your 1.1 V frequency and I'll tell you your bin. Believe it or not.


----------



## mirkendargen

penguin1717 said:


> He didn't say at which temps the readings were shown. Tell me which model do you have? If you have stock bios and tell me 1.1 v frequency and I'll tell you your bin. Believe it or not


The V/F curve in AB doesn't change with temperature; it's static (unless you modify it). What the card actually clocks at a given temperature is an offset from the requested bin above ~40C. Reading the curve will always say the same thing, which is what he did on 5 Suprims.


----------



## penguin1717

mirkendargen said:


> The v/f curve in AB doesn't change with temperature, it's static (unless you modify it). What the card actually clocks at at a given temperature is an offset of the requested bin above ~40c. Reading the curve will always say the same thing, which is what he did on 5 Suprim's.


No you are wrong. Preheat the card and see. Curve is constantly changing with temps.


----------



## mirkendargen

penguin1717 said:


> No you are wrong. Preheat the card and see. Curve is constantly changing with temps.


No, I'm not. Curve is the same 2835 at 61c. I'm dropping 2 bins due to temp for a clock of 2805.
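Put differently, the burned-in curve point stays fixed and the delivered clock is derated in whole 15 MHz bins as temperature rises. A minimal sketch of that arithmetic (the helper is hypothetical; the numbers are from the post above):

```python
# The V/F curve point is static; heat only subtracts whole 15 MHz bins
# from the clock the card actually delivers.
def delivered_clock(curve_mhz: int, bins_dropped_from_temp: int) -> int:
    return curve_mhz - 15 * bins_dropped_from_temp

print(delivered_clock(2835, 2))  # -> 2805, the clock observed at 61C
```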


----------



## yzonker

penguin1717 said:


> But I said it, you must look at your boost in specs for your card (let's say via GPUz) and then compare it to your actual boost. Only the difference between you boost in specs and actual boost at stock matters. Why isn't anyone willing to try this method? You will see that all of you will fall between 12-17 increments range. which is 75 mhz. Try and see and you will see my theory is credible.


OK, fair. But the Strix boost clock is 2640, and my TUF is 2565. That's 5 bins, but my curve only increased 3 bins when I flashed it. So now, by your method, my card is worse due to a bios flash.


----------



## penguin1717

mirkendargen said:


> No, I'm not. Curve is the same 2835 at 61c. I'm dropping 2 bins due to temp for a clock of 2805.
> 
> View attachment 2577254


OK, now cool your card under 30 C and post your V/F curve. You will see it changed.


----------



## mirkendargen

penguin1717 said:


> Ok now cool your card under 30 C and post V/f curve. You will see it changed.


Dude I literally already posted that. It's the same.


----------



## penguin1717

Yeah, I see. I must admit I was wrong, sorry. On Ampere there were fluctuations according to temps. And I must admit RaMsiTo's data spans a wide range.


----------



## mirkendargen

penguin1717 said:


> Yeah I see. I must admit I was wrong. Sorry. In Ampere there were fluctuations according to temps. Ok I must admit Ramsito data seems so wide.


It does seem like that 2955 card is the best of the best though, so I think it, or MAYBE 2970 are the best of the best golden samples. The real question is what the floor for bad silicon that is still deemed sellable is, and where the median is/what the distribution looks like.


----------



## yzonker

mirkendargen said:


> It does seem like that 2955 card is the best of the best though, so I think it, or MAYBE 2970 are the best of the best golden samples. The real question is what the floor for bad silicon that is still deemed sellable is, and where the median is/what the distribution looks like.


It's buried somewhere in this thread, but mine is only 2805! But somehow it seems to bench as well as most cards in this thread. RaMsiTo's is a little better, but most aren't. 

I have no clue why the cards are so much different. Maybe some binning done to ensure stock speeds are stable, but as usual that means nothing for max OC.


----------



## N19htmare666

sniperpowa said:


> I got the liquid x it’s 600w atleast mine is. 480w default +125% pl


Please can you upload the bios?


----------



## yzonker

I dug up my curve,


----------



## Mad Pistol

By the way, the RTX 4090 is slightly faster than an iPhone.


----------



## EconomyFishFinger

Has anyone had any difficulty getting the voltage slider to work on a Gigabyte Gaming OC with MSI Afterburner?

Tried deleting profiles, reinstalling, etc.
Also, this card is damn silent. I've got an EK block preordered, and if it wasn't for the fact that I want to push OCs as far as possible I'd totally cancel the order.


----------



## dk_mic

EconomyFishFinger said:


> Anyone had any difficulty getting the voltage slider to work on a Gigabyte Gaming OC with MSI afterburner?
> 
> Tried deleting profiles, reisntalling etc.
> Also, this card is damn silent, got an EK block preordered and if it wasnt for the fact I want to push oc's as far as possible id totally cancel the order.


you need the latest beta, 4.6.5 beta 2


----------



## EconomyFishFinger

dk_mic said:


> 4.6.5 beta 2


Thanks! Thought it may have been something simple like that. Appreciate it


----------



## Aldur

Has anyone tried to flash a 4090 Gigabyte Windforce with a 4090 Gaming OC BIOS? I have a Windforce coming in the mail and I'd love to up the power limit to 600W.


----------



## akgis

My Strix only boosts to 2740 at stock while still operating under 65º, what gives?

What do you guys use to force max power draw? I set voltage to 1.1, max power limit in the Afterburner beta, and no matter the core MHz offset I apply, the card doesn't want to pull more than 480-500 W, in both Quiet and Performance. Seems I'm limited to 500 W on a Strix that's 600 W by default, right?

My power supply is a Corsair HX1000


----------



## EconomyFishFinger

sblantipodi said:


> View attachment 2576868
> 
> 
> What is your score in 4k optimized?


I think somethings up with my Gigabyte Gaming OC?









I currently have +120 on the core and +1200 on the memory, but my score is way lower? 
DDR5 at 6600 MHz and 12900KF at 5.2 GHz

Not sure why it's scoring low; also noticed that Dying Light 2 and Cyberpunk were not giving the large improvements I was expecting


----------



## yzonker

EconomyFishFinger said:


> I think somethings up with my Gigabyte Gaming OC?
> View attachment 2577276
> 
> 
> I currently have +120 on the core and +1200 on the memory but my score is waaay lower?
> DDR5 at 6600mhz and 12900KF at 5.2Ghz
> 
> Not sure why its scoring low, also noticed that Dying light 2 and Cyberpunk was not giving the large improvements I was expecting


Your GPU utilization is really low. Do you have g-sync on or something else that might hold it back? Obviously your cpu is fast enough. It's getting bottlenecked somehow.


----------



## BigMack70

Got my 13700k up and running... It's a big improvement over the 9900k even before tweaking/overclocking it. 120fps now locked in elden ring and halo infinite at 4k. Big framerates boosts to those and several other games on my 240hz ultrawide. Almost locks to 120fps in A Plague Tale Requiem, but some drops down to about 100-110fps, which is fine.


----------



## EconomyFishFinger

yzonker said:


> Your GPU utilization is really low. Do you have g-sync on or something else that might hold it back? Obviously your cpu is fast enough. It's getting bottlenecked somehow.


Ahh, it was G-Sync. I don't know why I didn't even think of that!










Thanks buddy!


----------



## 8472

yt93900 said:


> RTX 4090 Aorus Xtreme w. 360mm AIO, OC'ed, fans at 100%:
> 
> I scored 33 983 in Time Spy (Intel Core i9-12900K, NVIDIA GeForce RTX 4090, 32768 MB, 64-bit Windows 10) www.3dmark.com
> 
> Compared to previous 3090 Strix, OC'ed:
> 
> I scored 20 667 in Time Spy (Intel Core i9-12900K, NVIDIA GeForce RTX 3090, 32768 MB, 64-bit Windows 10) www.3dmark.com
> 
> Shame Gigabyte did a big stupid with the radiator fans, they have a custom connector!! Both the fans and the 3-way fan split cable. I was already removing them and mounting 12x25 Chromaxes until I noticed the connector is completely different. Now waiting for someone to tear his 4090 Xtreme down to see if the splitter can be replaced with a normal PWM one.
> Regarding the temps, the stock curve won't ramp them up from the default 30% until somewhere around 56-58°C; ~58-60°C seems to be the default temp target.


Just to make sure, can the fans be removed from the splitter? If so, if I get this card I'll just tuck the loose cable out of the way somewhere. 

It's weird that almost all the other cards had professional reviews except for this one.


----------



## J7SC

penguin1717 said:


> Ok now cool your card under 30 C and post V/f curve. You will see it changed.


...this method has been around for a while, also as a pseudo ASIC % rating since GPU-Z stopped showing it (and I don't think I was the first with that either). I have yet to try it with my 4090, but with the 6900XT and 3090 Strix, what would shift the curve was not temps but a different vbios (i.e. one which used a different voltage spread).


----------



## AvengedRobix

RaMsiTo said:


> I'm not doing it wrong; I commented on it several pages back. I've seen 5 Suprim Xs,
> and I have only seen 2955 on my unit.


How do you see more than 3000 on the AB curve?


----------



## yt93900

8472 said:


> Just to make sure, can the fans be removed from the splitter? If so, if I get this card I'll just tuck the loose cable out of the way somewhere.
> 
> It's weird that almost all the other cards had professional reviews except for this one.


Yes, they can be removed. Look up the 3090Ti Waterforce Xtreme reviews, I think the cooler is the same.


----------



## sweepersc

I checked the Suprim Liquid X vbioses uploaded on TPU (95.02.18.40.4C) and it does say 530W. However, that is not the vbios my card came with; mine is 95.02.18.00.CD. Here are the Nvidia SMI screenshots:
Stock

Max PL
I also tried uploading to TPU, but it said that registration from certain countries is no longer accepted. I put it in GDrive, here is the link . Can someone upload this on TPU please?

Regards,
Mark


----------



## akgis

sweepersc said:


> I checked the Suprim X vbios uploaded on TPU (95.02.18.40.4C) and it does say 530W. However, that is not the vbios my card came with; mine is 95.02.18.00.CD. Here are the Nvidia SMI screenshots:
> Stock
> 
> Max PL
> 
> I also tried uploading to TPU, but it said that registration from certain countries is no longer accepted. I put it in GDrive, here is the link . Can someone upload this on TPU please?
> 
> Regards,
> Mark


Where did you get that program that gives you the limits? The command line?


----------



## arieldeboca

Hello guys. For STRIX 4090 owners: does it work well on "older" power supplies using the 3-wire power adapter?
Right now I have an FTW3 3090 that gives restart problems, maybe due to transient voltage spikes. Was this problem solved in the 4000 series?


----------



## sweepersc

akgis said:


> Where you got that program to give you the limits? The comand line


Got it from an older driver, 457.30. Newer driver installations don't include it, so you have to copy the Nvidia SMI folder to another location.
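For reference, the board power limits can also be read straight from the command line. These nvidia-smi flags are documented by NVIDIA, though field availability and output format vary by driver version:

```shell
# Full power-related readout (enforced, default, min and max power limits):
nvidia-smi -q -d POWER

# Or just the limits, machine-readable:
nvidia-smi --query-gpu=power.limit,power.default_limit,power.max_limit --format=csv
```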


----------



## Nd4spdvn

sweepersc said:


> I checked the Suprim X uploaded on TPU (95.02.18.40.4C) and it does say 530W. However, that is not the vbios my card came with, mine is 95.02.18.00.CD.


Is this the Suprim Liquid X or the air-cooled Suprim X? And thank you btw for providing a Suprim 600W BIOS!!


----------



## Benni231990

**** we need the 600 watt bios.
Only the Liquid has the "secret" 600 watt bios.

But now the big question is: why has MSI limited the "new" bios to 520/530?

----------



## RaMsiTo

AvengedRobix said:


> How to see more than 3000 on ab curve?


----------



## sweepersc

Nd4spdvn said:


> Is this Suprim Liquid X or Suprim X aircooled version of the Suprim? And thank you btw for providing a Suprim 600W BIOS!!


It's the Suprim Liquid X, I will edit the wording  Thanks


----------



## KedarWolf

akgis said:


> Where you got that program to give you the limits? The comand line


You can run it from C:\Windows\System32\


----------



## bottjeremy

BigMack70 said:


> Got my 13700k up and running... It's a big improvement over the 9900k even before tweaking/overclocking it. 120fps now locked in elden ring and halo infinite at 4k. Big framerates boosts to those and several other games on my 240hz ultrawide. Almost locks to 120fps in A Plague Tale Requiem, but some drops down to about 100-110fps, which is fine.


What's the OC look like? Almost nobody talking about it.


----------



## Nd4spdvn

Will the 600W Suprim Liquid X bios work fine on my Suprim X I wonder? Thinking of flashing it as the PCB is the same but I am worried a bit that it may not act right with the fans on the air-cooled Suprim x I have.


----------



## Mad Pistol

I had no idea Superposition even ran this fast...


----------



## Mad Pistol

And an overclock...
+210/+1500 core/mem
Voltage maxed, Fan at 90%

Not bad for a crappy little (well... big) Windforce 4090.


----------



## BigMack70

bottjeremy said:


> What's the OC look like? Almost nobody talking about it.


I am waiting on the lga 1700 retrofit kit for my h150i cooler before I start overclocking the CPU. Temps are ok on the lga 1151 standoffs but I don't want to push it. 

Probably won't know oc results for a week or two.


----------



## kaTus

4090 Strix + 13900K on Z790-E Strix

I scored 36 918 in Time Spy (Intel Core i9-13900K, NVIDIA GeForce RTX 4090, 32768 MB, 64-bit Windows 11) www.3dmark.com


----------



## sweepersc

Nd4spdvn said:


> Will the 600W Suprim Liquid X bios work fine on my Suprim X I wonder? Thinking of flashing it as the PCB is the same but I am worried a bit that it may not act right with the fans on the air-cooled Suprim x I have.


I cranked the fan speed to maximum; here are the RPMs. GPU Fan 1 is the fan speed on the radiator, while GPU Fan 2 is the fan speed on the GPU shroud. 

Hope this helps

Regards,
Mark


----------



## mirkendargen

Nd4spdvn said:


> Will the 600W Suprim Liquid X bios work fine on my Suprim X I wonder? Thinking of flashing it as the PCB is the same but I am worried a bit that it may not act right with the fans on the air-cooled Suprim x I have.


I'm sure it's fine electrically but the fan control might be wonky.


----------



## long2905

msky73 said:


> I didn't moved to water for OCing but for comfort having had the custom loop anyway. And Inno3D has only 14 power phases and is locked at 100% power limit so I run the stock BIOS.
> 
> The card runs 2820MHz +-15MHz. Under full load is 45-50C hot (I have 120+360+360 slim rads for GPU and 5950X). In games 35-50C strongly depending on load (I use 4K/144 Hz monitor and fps capped to 141).
> Some 3DMark numbers:
> Port Royal 25930
> Time Spy Extreme GPU 19606
> DLSS3 57:175
> 
> Game with DLSS2 (Shadow Warrior 3) - I made a few comparisons before replacing 3090:
> View attachment 2577155


Looking forward to the blocked version being released so I can swap it into my rig.


----------



## yzonker

Here's my TUF OC on the Strix OC bios. 3 bins higher. BTW, I also tried the Gigabyte OC bios. It was 2 bins higher (2835). All that bios did is mess up the fan curves for my card. None of them seem to be better than the stock bios as far as I can tell.


----------



## yzonker

One more, I'm a bios slut. MSI Suprim Liquid 530w bios. 2880 for this one. Highest of all I've tried.


----------



## schoolofmonkey

I'm curious: with this Corsair HX1000, could you use 3x 8-pin PCIe connectors and then use the extra piggyback cable on the 4th adapter connector? Would clear up one cable...lol


----------



## galsdgk

yzonker said:


> One more, I'm a bios slut. MSI Suprim Liquid 530w bios. 2880 for this one. Highest of all I've tried.
> 
> View attachment 2577323


Nice. Just curious, why don’t you add a positive core frequency offset? Like +100?


----------



## minittt

anyone noticed a review for this 4090 ?
AORUS GeForce RTX™ 4090 XTREME WATERFORCE 24G 
SKU: GV-N4090AORUSX W-24GD

I am trying to decide if I should get the ASUS Tuf 4090 OC or the Aorus 360 AIO one. Tia


----------



## 8472

schoolofmonkey said:


> I'm curious: with this Corsair HX1000, could you use 3x 8-pin PCIe connectors and then use the extra piggyback cable on the 4th adapter connector? Would clear up one cable...lol
> View attachment 2577331


FWIW, I have an AX850 and the TUF didn't like the extra connector. I couldn't get the system to post using two single cables and a two on one cable. Once I unplugged the piggyback connector on the cable everything booted up fine. 

It of course could be different for higher wattage PSUs, but in general I suspect that these cards want separate cables for the adapter. 

In the video below, MSI recommended not to do that. Video is timestamped.


----------



## J7SC

8472 said:


> FWIW, I have an AX850 and the TUF didn't like the extra connector. I couldn't get the system to post using two single cables and a two on one cable. Once I unplugged the piggyback connector on the cable everything booted up fine.
> 
> It of course could be different for higher wattage PSUs, but in general I suspect that these cards want separate cables for the adapter.
> 
> In the video below, MSI recommended not to do that. Video is timestamped.


I like the part about 'this might be a tight fit' in that case...


----------



## SilenMar

What does 12VHPWR mean? 12V High Power Wide Range?

These water-cooled GPUs are so afraid of the 12VHPWR exploding the PSU when it peaks at 1800W. So every BIOS is properly tuned down. Or they have worse VRMs than the air-cooled GPUs.


----------



## mirkendargen

SilenMar said:


> What does 12VHPWR mean? 12V High Power Wide Range?


12v high power.


----------



## SilenMar

mirkendargen said:


> 12v high power.


It's not an actual question.


----------



## arieldeboca

What PSUS are you using with the 4090 (model and wattage)?


----------



## galsdgk

bmagnien said:


> Where do you see that the PNY is reference? It’s not ok Alphacools own compatibility list


Hey bro. Here is the confirmation you wanted. It’s reference PCB.


----------



## MrTOOSHORT

arieldeboca said:


> What PSUS are you using with the 4090 (model and wattage)?


EVGA 1600 T2. Just bought a Cablemod 16pin to x4 8pin cable. Should be cleaner looking than the one that came in the 4090 box.


----------



## mirkendargen

arieldeboca said:


> What PSUS are you using with the 4090 (model and wattage)?


AX1600i. Using the Corsair 12VHPWR cable for now, I might switch to a custom sleeved one for looks and length at some point.


----------



## arieldeboca

MrTOOSHORT said:


> EVGA 1600 T2. Just bought a Cablemod 16pin to x4 8pin cable. Should be cleaner looking than the one that came in the 4090 box.


Will a Seasonic Titanium 1000 (the previous Seasonic model) be enough?


----------



## X79guy

My 4090 Gaming Trio (non-X) came in today. This is replacing my 4090 Giga OC, as the Giga does not fit in my ITX Meshy. I can confirm that the Trio, at 337mm long, fits in the Meshy, though just barely. I was worried the Trio would have coil whine, and as I suspected, it does. My Giga OC has no whine till about 400fps at full load. The Trio is not the worst I've heard, but it's definitely there.

Beyond the coil whine, though, I have noticed that this Trio consistently scores 150 points more than my Giga OC in the Heaven benchmark. It boosts to 2700MHz out of the box. It does run about 6C hotter than my Giga OC, but it is also quieter. Another thing to note is that the materials of the Trio are far superior to the Giga OC's. The Giga feels and looks like cheap thin plastic; the Trio's plastic is much thicker and has texture.

I am going to have to decide whether I can get over this coil whine, though.


----------



## schoolofmonkey

arieldeboca said:


> With seasonic titanium 1000 (previous model to onseasonic) will it be enough?


I'm running a Corsair HX1000 fine.


----------



## J7SC

arieldeboca said:


> What PSUS are you using with the 4090 (model and wattage)?


Giga-G-OC and 5950X draw on a relatively new Seasonic Prime Platinum PX1300


----------



## Arizor

For folks looking to undervolt to keep temps and wattage down whilst maintaining performance, I've been testing this for a few hours and it's rock solid stable on my TUF. Limited to 0.95v.


----------



## Nico67

penguin1717 said:


> Yeah I see. I must admit I was wrong. Sorry. In Ampere there were fluctuations according to temps. Ok I must admit Ramsito data seems so wide.


I would have thought the curve would have changed with temp too, but probably easier to tune if it doesn't, one less variable.



akgis said:


> My Strix only boosts to 2740 at stock while still operating under 65º, what gives?
> 
> What do you guys use to force max power draw? I set voltage to 1.1, max power limit in the Afterburner beta, and no matter the core MHz offset I apply, the card doesn't want to pull more than 480-500 W, in both Quiet and Performance. Seems I'm limited to 500 W on a Strix that's 600 W by default, right?
> 
> My power supply is a Corsair HX1000


Clock it up to 3GHz and run something demanding; I can get to 550w pretty easily.


----------



## EEE-RAY

People with the strix 4090 - what are your memory overclocks? 

I cranked the mem clock up by 250MHz at a time, running Port Royal to check that the score continued to rise (i.e. not error correcting). Got about +100 points in graphics score with every 250MHz, and stopped testing at +1750MHz. I dropped the clock back to +1500 as a safety margin for everyday use. No artefacts that I could see. Is this a sound way to determine a daily mem overclock?
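The stepping procedure described above can be sketched as a loop. This is only an illustration: `run_benchmark` is a placeholder for however you launch Port Royal and read back the graphics score, and the step/back-off values mirror the post, not any official tuning guidance:

```python
# Step the memory offset up until the benchmark score stops improving
# (a flat or falling score suggests error correction is eating the gains),
# then back off one step's worth of margin for daily use.
def find_daily_mem_offset(run_benchmark, step=250, max_offset=2000, margin=250):
    best_offset, best_score = 0, run_benchmark(0)
    for offset in range(step, max_offset + step, step):
        score = run_benchmark(offset)
        if score <= best_score:
            break
        best_offset, best_score = offset, score
    return max(best_offset - margin, 0)
```

With a card that gains roughly 100 points per +250 MHz up to +1750 and regresses beyond that, this returns 1500, the daily clock chosen above.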


----------



## Zogge

arieldeboca said:


> What PSUS are you using with the 4090 (model and wattage)?


AX1600i.


----------



## daniel_4795

Anyone know why the 4090 can't support 4K at 165Hz through HDMI 2.1 when the 3090 could? Driver 517.48 from Nvidia supported 165Hz, but the latest driver doesn't, and 517.48 isn't compatible with the 4090.


----------



## mirkendargen

daniel_4795 said:


> Anyone know why the 4090 can’t support 4K at 165hz through hdmi 2.1 when the 3090 could. Driver 517.48 from nvidia supported the 165hz but latest driver doesn’t and 517.48 not compatible with 4090


Do you have to drop the color to 8bit?
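A rough back-of-envelope for why dropping to 8-bit can matter at 4K 165 Hz: HDMI 2.1 FRL tops out at 48 Gbps raw, which is about 42.7 Gbps effective after 16b/18b coding, and 10-bit RGB at that resolution and refresh sits right around that ceiling while 8-bit fits. The ~6% blanking inflation below is an approximation of a reduced-blanking timing, not an exact CVT calculation:

```python
def link_gbps(h, v, hz, bpp, blank=1.06):
    """Very rough uncompressed bitrate: active pixels * refresh * bits per
    pixel, inflated ~6% for reduced-blanking overhead (approximation)."""
    return h * v * hz * bpp * blank / 1e9

HDMI21_EFFECTIVE = 48 * 16 / 18  # FRL 48 Gbps raw, 16b/18b coding ~= 42.7

print(link_gbps(3840, 2160, 165, 30) > HDMI21_EFFECTIVE)  # 10-bit: True, needs DSC
print(link_gbps(3840, 2160, 165, 24) > HDMI21_EFFECTIVE)  # 8-bit: False, fits
```

So if the new driver isn't negotiating DSC the same way the old one did, 10-bit 4K165 would fall off the mode list while 8-bit survives.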


----------



## Arizor

EEE-RAY said:


> People with the strix 4090 - what are your memory overclocks?
> 
> I cranked the mem clock up by 250mhz at a time running port royal to check that the score continues to rise (i.e. not error correcting). got about +100 points in graphics score with every 250mhz, and stopped testing at 1750mhz. I dropped the clock back to 1500 for safety margin for every day use. No artefacts that I could see - is this a sound way to determine a daily running mem overclock?


I’ve had a similar experience with my TUF - can go up to +1900 without issues but a bit scared to run too hot for too long, +1500 is a very solid 24gbps so I’m staying with that unless I’m doing some silly benchmarking.

I suspect we got lucky and they’ve used the 24gbps GDDR6X on quite a lot of boards.


----------



## TheNaitsyrk

Arizor said:


> I’ve had a similar experience with my TUF - can go up to +1900 without issues but a bit scared to run too hot for too long, +1500 is a very solid 24gbps so I’m staying with that unless I’m doing some silly benchmarking.
> 
> I suspect we got lucky and they’ve used the 24gbps GDDR6X on quite a lot of boards.


My one is the same, goes up and up and it's stable


----------



## daniel_4795

mirkendargen said:


> Do you have to drop the color to 8bit?


I will try this but didn’t on the 3090 card


----------



## EEE-RAY

Arizor said:


> I’ve had a similar experience with my TUF - can go up to +1900 without issues but a bit scared to run too hot for too long, +1500 is a very solid 24gbps so I’m staying with that unless I’m doing some silly benchmarking.
> 
> I suspect we got lucky and they’ve used the 24gbps GDDR6X on quite a lot of boards.


Ah ok. Is this an ASUS card thing? I was half asking because the mem clock seems to be higher than many other values I've seen in forums and reviews. 

I wonder if they have identical boards and heatsinks already set up for the 4090 and a future 4090 Ti and will literally just swap in the core depending on the model. I can't imagine ASUS using higher-binned GDDR6X for no particular reason.


----------



## RaMsiTo

arieldeboca said:


> What PSUS are you using with the 4090 (model and wattage)?


Evga 1200w p2 en platinum.


----------



## VideoGameLover

Sheyster said:


> The TUF Non-OC also comes with a 600w BIOS, same as the OC version. Only difference is the small core clock OC.


The Strix and OC version have more capacitors also.
What do more capacitors actually do? People are saying it doesn't improve frame rates at all. My question is: in theory, what is it supposed to improve?


----------



## Hulk1988

minittt said:


> anyone noticed a review for this 4090 ?
> AORUS GeForce RTX™ 4090 XTREME WATERFORCE 24G
> SKU: GV-N4090AORUSX W-24GD
> 
> I am trying to decide if I should get the ASUS Tuf 4090 OC or the Aorus 360 AIO one. Tia


I had the Waterforce card and I cannot recommend it. The temperatures are similar compared to my Strix and Suprim X and the memory temperature was even higher on the Waterforce card.


----------



## EEE-RAY

VideoGameLover said:


> The strix and oc version have more capacitors also.
> What does more capacitors actual do? In frame rates people are saying that it does not improve a thing. My question is in theory what is it exposed to improve?


My card doesn't seem to even hit its base 500W power budget. Is voltage controllable in afterburner?


----------



## Krzych04650

J7SC said:


> Nice ! FYI, I emailed Phanteks about a water block for the Giga-G-OC, but no real info back other than 'may be; soon'...


I also emailed them on launch day and they said that more info is going to be provided later in the week. It's been almost 2 weeks since then...

I really like their 4090 blocks though, especially the ones with ports on the side. I really don't like all of those short cassette-tape-style blocks that others do, so Phanteks seems like the only option for me. 4090 stock seems to be improving, but I am not going to order anything until I know for sure which models Phanteks will support. 

Also, from what I've read so far, MSI and Asus are notorious for coil whine while Gigabyte reports are a lot better, so I may go with the Giga-OC too. I don't like that it doesn't have the power connector recessed into the PCB like Asus or MSI, which is important for clearance and aesthetics since you cannot bend 12VHPWR so easily, but there are going to be some 90-degree adapters available, so maybe that's not so much of a problem.


----------



## daniel_4795

Anyone got any other ideas why latest driver doesn’t support 4K at 165hz for 4090 when old driver and 3090 did?


----------



## 8472

Hulk1988 said:


> I had the Waterforce card and I cannot recommend it. The temperatures are similar compared to my Strix and Suprim X and the memory temperature was even higher on the Waterforce card.


Between the Strix and the Suprim X which one is quieter? Also have you noticed any coil whine on either of them?


----------



## TheNaitsyrk

8472 said:


> Between the Strix and the Suprim X which one is quieter? Also have you noticed any coil whine on either of them?


Suprim X. The Suprim X is around 28.6 dB and the Strix around 30-31 dB, so almost no difference. No coil whine on my Suprim X.


EEE-RAY said:


> My card doesn't seem to even hit its base 500W power budget. Is voltage controllable in afterburner?


There is, download newest beta from Guru3D, link few pages behind.


----------



## ArcticZero

I'm tempted to just forego waiting for the Strix/TUF to go back on stock here and just get a Suprim X water. I'm only concerned about custom loop block availability for when I want to add it to my loop instead of relying on the AIO. That and the odds of getting the 600w BIOS assuming it isn't made available online anytime soon.

How's memory OC results so far on Suprim X cards? Because from what I've been seeing on Asus cards it seems like +1500 is usually perfectly fine.


----------



## bmagnien

UNIGINE Superposition benchmark score


UNIGINE Superposition detailed score page




benchmark.unigine.com





15k 8k superposition. full fans in a warm ambient room. these cards are bonkers and going to get better with blocks.


----------



## sniperpowa

N19htmare666 said:


> Please can you upload the bios?


It says bios reading not supported on this device lol.




----------



## 8472

TheNaitsyrk said:


> Suprim X. Suprim X is like 28.6DB and Strix around 30-31DB so almost no difference. No coil while on my Suprim X.
> 
> There is, download newest beta from Guru3D, link few pages behind.


Nice, is that with the quiet bios? It looks like it might be the warmest card when using its quiet fan profile.


----------



## Arizor

EEE-RAY said:


> Ah ok. Is this an ASUS card thing? I was half asking because the mem clock seems to be higher than many other values ive seen i forums and reviews.
> 
> I wonder if they have identical boards and heatsinks already set up for the 4090 and future 4090ti and will literally just swap the core depending on the load. I can't imagine ASUS using higher binned GDDR6X for no particular reason.


Nah, wouldn't think it's unique to ASUS. Probably a lot of AIBs are doing so; maybe there are more ASUS examples simply because OCers tend toward buying ASUS and posting on forums more than other brands' owners 😂


----------



## GQNerd

Suprim Liquid X, stock Gaming BIOS 530w
OC: VF Curve for Core, +1500 Mem

Top 25.. not bad. 

Still waiting for my new board and RAM.. might be able to squeeze a few more pts out


----------



## derthballs

AvengedRobix said:


> Yes.. on timespy and Port royal.. now i'm waiting for a waterblock


I got my 4th separate cable but I still only see max 495 W in benching; I can't get it above the Zotac bios. Either the cable they supply maxes out at 495 W (although not sure how) or I'm flashing it wrong - GPU-Z reports 600 W but it remains 495 W.


----------



## changboy

MrTOOSHORT said:


> EVGA 1600 T2. Just bought a Cablemod 16pin to x4 8pin cable. Should be cleaner looking than the one that came in the 4090 box.


Can you link me where you bought your cable ?


----------



## MrTOOSHORT

changboy said:


> Can you link me where you bought your cable ?


CableMod E-Series Pro ModFlex Sleeved 12VHPWR PCI-e Cable for EVGA G7 / G6 / G5 / G3 / G2 / P2 / T2 – CableMod Global Store

store.cablemod.com


----------



## AvengedRobix

derthballs said:


> I got my 4th seperate cable but i still only see max 495w in benching, cant get it above the zotak bios, either the cable they supply maxes out at 495w (although not sure how) or im flashing it wrong - gpuz reports 600w but it remains 495w.


it's normal, the bios of Zotac was only 495W. You need to flash another bios if you want more. Have you flashed the gigabyte bios?


----------



## dante`afk

So Bykski is recalling all blocks that they sold for the FE card. Turns out they used the wrong naming convention for the product as it’s supposed to be for the REF design


----------



## dk_mic

don't know if this is news, but alphacool approved MSI suprim and Palit Gamerock cards https://www.alphacool.com/download/compatibility list Nvidia.pdf

hope it fits on the gaming trio's as well..


----------



## BigMack70

arieldeboca said:


> What PSUS are you using with the 4090 (model and wattage)?


I'm using a corsair AX1000i. No issues.


----------



## bmagnien

dk_mic said:


> don't know if this is news, but alphacool approved MSI suprim and Palit Gamerock cards https://www.alphacool.com/download/compatibility list Nvidia.pdf
> 
> hope it fits on the gaming trio's as well..


They didn’t ‘approve’ those, they’re allegedly producing blocks for them, which are ‘coming soon’ on that list.


----------



## dk_mic

bmagnien said:


> They didn’t ‘approve’ those, they’re allegedly producing blocks for them, which are ‘coming soon’ on that list.


it literally says "compatibility approved"


----------



## KedarWolf

The #1 reason I ordered a Strix is because Optimus Water Cooling is coming out with a block for them and the main reason I NEVER ordered an FE is because the 3090 series you could only use the stock BIOS on them. OWC makes a block for them too.


----------



## bmagnien

dk_mic said:


> it literally says "compatibility approved"


Ok, fair enough. But they are coming soon, and there are 3 different blocks that are coming soon (Suprim, FE, Gamerock). So they're making new blocks for new cards. "Approved" to me sounds like approving compatibility for current cards, which is only their reference block, which has a comical dearth of compatible cards.


----------



## KedarWolf

bmagnien said:


> Ok fair enough. But they are coming soon and they are 3 different blocks that are coming soon (suprim, FE, gamerock). So they’re making new blocks for new cards. Approved to me sounds like approving compatibility for current cards. Which is only their reference block which has a comical dirth of compatible cards


Optimus blocks are sooooo good, gaming on my 3090 with one pump and one 360 rad, I get like 50C core and 55C memory and less on the hot spot. Port Royal or something it's still 55C core, 60C memory. They are engineered very well, expensive though.


----------



## Baasha

bmagnien said:


> UNIGINE Superposition benchmark score
> 
> 
> UNIGINE Superpsition detailed score page
> 
> 
> 
> 
> benchmark.unigine.com
> 
> 
> 
> 
> 
> 15k 8k superposition. full fans in a warm ambient room. these cards are bonkers and going to get better with blocks.


For a single GPU to be almost as good as Titan Xp 4-Way SLI (80% of the score) is phenomenal - it is 5.5 years ago but still...


----------



## LunaP

KedarWolf said:


> The #1 reason I ordered a Strix is because Optimus Water Cooling is coming out with a block for them and the main reason I NEVER ordered an FE is because the 3090 series you could only use the stock BIOS on them. OWC makes a block for them too.


Yeah they said on their twitter they'll have blocks for the FE and others as well so if released I'll be grabbing one too. Since FE can be flashed this round.


----------



## Pollomir

arieldeboca said:


> What PSUS are you using with the 4090 (model and wattage)?


EVGA 1000 P2. No problems either.


----------



## Baasha

LunaP said:


> Yeah they said on their twitter they'll have blocks for the FE and others as well so if released I'll be grabbing one too. Since FE can be flashed this round.


Were you able to get the FE or Strix yet? I'm still waiting...


----------



## KedarWolf

I ran Cyberpunk on my 3090 with settings maxed out but ultra instead of psycho, I get 50C core, 70C memory and hot spot on my 3090, but my ambient temps are very warm right now.

Still, for one pump at 2600 RPM and one 360 rad with high ambient temps, it's decent. 3090s run hot.


----------



## Mad Pistol

Baasha said:


> For a single GPU to be almost as good as Titan Xp 4-Way SLI (80% of the score) is phenomenal - it is 5.5 years ago but still...


We've come a long way in a few generations.


----------



## mirkendargen

Bykski 分体式水冷官方网站 (Bykski custom water cooling official site)

www.bykski.com





Come on Bykski, we need MSI/Gigabyte 4090 blocks, not 3090 Kingpin blocks lol. (The list is ordered by newest products).


----------



## Laithan

Subbing to the thread.. 141 pages already, daym.. have some homework to do. I scored an MSI 4090 Suprim liquid at MSRP from the MSI Store yesterday.

I barely even used my EVGA FTW3 Ultra 3090Ti but the 4K gains are too hard to pass up. Sadly I guess I wasted my money on the EK block for it that I never installed yet lol.


----------



## kaTus

Its limit of my 4090 Strix. Now gonna wait for EK block. 









www.3dmark.com


----------



## Hanks552

TheNaitsyrk said:


> I think Suprim X has most, 26(GPU)+4(MEMORY). It's kind of let down with 520W BIOS. But can be flashed so not a problem.


Got it
I think I’m going to return my MSI trio then, cancel my backorder asus and get a Suprim X, the only problem is that they don’t have waterblock for it yet, just asus, zotac and FE


----------



## Clukos

Pushing the limits of my FE, waiting on EK block as well 

I scored 55 503 in Fire Strike 
I scored 30 813 in Time Spy 
I scored 28 646 in Port Royal 
I scored 40 831 in Fire Strike Extreme 
I scored 26 426 in Fire Strike Ultra 
I scored 15 353 in Time Spy Extreme


----------



## akgis

Strix boosts out of the box to 2735, no more than that; can't get 3 GHz stable, and even 2940 is pushing it for benches, definitely not for gaming... very unlucky with the lottery?

It was a lot of money for a bad one, especially since from what I read here loads of cheaper boards do more. Kinda thinking of returning it and waiting for the 4090 Ti, but if I regret it I might not be able to get a 4090 soon since the stock is being drip-fed by the distributors.


----------



## yzonker

Bykski block should be here Wednesday for my TUF. I splurged for the DHL shipping. Like 3 or 4 days from China.


----------



## Panchovix

So the Gaming X Trio has only 3x8 pins adapter right? Which other cards have 3x8 adapters, so to avoid them?


----------



## LuckyImperial

I managed to cram one of these into a SFF Lian Li Q58. It needs cables, but the fit isn't so bad with a Liquid X. I know the 850 GM isn't ideal, but I don't plan on ever running beyond 100% power. The goal is actually to keep it below 100% power.


http://imgur.com/a/FLab4qp


Liquid X at 70% power slider:


----------



## Hanks552

Laithan said:


> Subbing to the thread.. 141 pages already, daym.. have some homework to do. I scored an MSI 4090 Suprim liquid at MSRP from the MSI Store yesterday.
> 
> I barely even used my EVGA FTW3 Ultra 3090Ti but the 4K gains are too hard to pass up. Sadly I guess I wasted my money on the EK block for it that I never installed yet lol.


How? What store? Because when I try to buy it, it just redirects me to other stores like Micro Center, Best Buy…


----------



## mirkendargen

Hanks552 said:


> How? What store? Because when I try to buy it, is just redirecting me to other stores like microcenter, BestBuy…








GeForce® RTX 40 Series - MSI-US Official Store







us-store.msi.com


----------



## Nizzen

New member in the family


----------



## Spiriva

arieldeboca said:


> What PSUS are you using with the 4090 (model and wattage)?


Corsair 1600i


----------



## Spiriva

Nizzen said:


> New member in the family
> View attachment 2577464


That white Apex z790 looks amazing too!  Did you buy it from a retail shop, or got it from Asus? I cant find any shop who got it yet in Europe.


----------



## Deve5tat0R

The latest ROG Strix member makes its way home. Running 3 GHz stable in benchmarks and games; occasionally drops to 2985 MHz depending on ambient temps. Temps @ 67C


----------



## Hanks552

mirkendargen said:


> GeForce® RTX 40 Series - MSI-US Official Store
> 
> 
> 
> 
> 
> 
> 
> us-store.msi.com


Did you receive an email? I signed up for notification, but I don't think I will act fast enough against bots. And I already have a NowInStock notification, but the last Suprim I saw was at launch.


----------



## mirkendargen

Hanks552 said:


> Did you receive a email? I signed up for notification but I don’t think I will act fast enough against bots. And I already have nowInstock notification but the last suprim I saw was on launch


No I didn't buy anything there, I just knew that's the store.

And FYI for other people that may not know Asus and Gigabyte have first party stores too:






Graphics Cards｜ASUS USA







www.asus.com









Shop - Components - GIGABYTE OFFICIAL STORE







store.gigabyte.us


----------



## sniperpowa

Time Spy run last night with a 13900K and Suprim Liquid X: I scored 37 875 in Time Spy.


----------



## EEE-RAY

What's everyone's technique to determine maximum overclock?

I used the auto-overclock thing and got 2850 MHz for the clock.
That was disappointing, so I then tested a manual overclock at the voltage and temp limits. I add +25 increments and so far got to +225 without crashing in PR and Time Spy Extreme. How do people here validate their overclocks' stability for 24/7?

Edit: wow, I fired up Heaven and it artifacts like crazy. Very minor artefacting at +175, and no artifacts at +150. That gives me 2970 MHz core. I can't believe that so many years after release, Heaven is still so good at unveiling instability.


----------



## derthballs

AvengedRobix said:


> it's normal, the bios of Zotac was only 495W. You need to flash another bios if you want more. Have you flashed the gigabyte bios?


Yes, I'm on the 600 W Gigabyte bios, but the power won't go past 495 in benchmarks.


----------



## EEE-RAY

My strix with 600W bios also hardly ever hits above 500W due to voltage limit.


----------



## yzonker

I'm finding more and more that overclocking the core yields very little performance. Using the Strix bios in CP2077, 

+1800 mem was good for 3.3%
+150 core ([email protected]) was only 1.7%

And I did the core OC run with the mem at +1800 to give it the maximum mem bandwidth. 

Waterblock will still have the advantages of size and low noise, but it will provide virtually no performance.


----------



## N19htmare666

sweepersc said:


> I checked the Suprim Liquid X vbioses uploaded on TPU (95.02.18.40.4C) and it does say 530W. However, that is not the vbios my card came with, mine is 95.02.18.00.CD. Here's the Nvidia SMI screenshots:
> Stock
> View attachment 2577279
> 
> 
> Max PL
> View attachment 2577280
> 
> I also tried uploading in TPU but it said that registration from certain countries are no longer accepted. I put it in GDrive, here is the link . Can someone upload this on TPU please?
> 
> Regards,
> Mark


Will this work if flashed to the Gigabyte Waterforce to unlock 600w?


----------



## Arizor

yzonker said:


> I'm finding more and more that overclocking the core yields very little performance. Using the Strix bios in CP2077,
> 
> +1800 mem was good for 3.3%
> +150 core ([email protected]) was only 1.7%
> 
> And I did the core OC run with the mem at +1800 to give it the maximum mem bandwidth.
> 
> Waterblock will still have the advantages of size and low noise, but it will provide virtually no performance.


Yep for games I’m undervolted 0.95v @ 2700mhz / +1500mhz.

Max 340watts draw, 64C temps, and still outperforming stock. Could go to 500watts for about 3-4% more frames but absolutely no need.


----------



## yzonker

Arizor said:


> Yep for games I’m undervolted 0.95v @ 2700mhz / +1500mhz.
> 
> Max 340watts draw, 64C temps, and still outperforming stock. Could go to 500watts for about 3-4% more frames but absolutely no need.


Yea I wondered how it would work out when it became clear memory bandwidth wasn't going to increase very much compared to 30 series. 3080ti/3090 was almost linear with core so it was more significant (still obviously not big gains though). 

I love the card for its raw performance, but it's going to be kinda boring from an OC'ing/extreme cooling/ benching standpoint.


----------



## BTK

Anyone not getting the gsync logo on lg oled on their 4090 only VRR


----------



## ArcticZero

Anyone have experience with the Bykski 4090 Strix block? Any gotchas? Will be needing something as a stopgap while waiting for the Optimus ones.


----------



## mattskiiau

Had a few days to play with my Trio; was able to get it stable in a variety of games at 3015 MHz @ 1.090-1.100 V.

My curve looks like this:

1.090 V = 3015 MHz
1.095 V = 3015 MHz
1.100 V = 3015 MHz

I no longer see a reason to worry about buying a 4x8-pin connector and BIOS flashing, since 99% of the time it's VREL at this clock.


----------



## Mad Pistol

BTK said:


> Anyone not getting the gsync logo on lg oled on their 4090 only VRR


Which LG OLED do you have? I've never seen a GSYNC logo on my CX.


----------



## changboy

Just tried my Gigabyte Gaming OC for around 10 minutes; seems all is fine. I began OCing a bit and didn't see any artifacts. I will try higher, maybe tomorrow:


----------



## BTK

Mad Pistol said:


> Which LG OLED do you have? I've never seen a GSYNC logo on my CX.


C2 with my 3090 I got one


----------



## Bilco

Anyone have a spare Corsair PCIE5 type 4 600w 12VHPWR connector I can snag off them? Been sharking the shop daily but am never awake at 2am-4am when they come in stock.


----------



## schoolofmonkey

Bilco said:


> Anyone have a spare Corsair PCIE5 type 4 600w 12VHPWR connector I can snag off them? Been sharking the shop daily but am never awake at 2am-4am when they come in stock.


I have a HX1000 which is compatible with the cable, but after reading Corsair's press release it seems only the 1200w PSU's are compatible with the 600w power draw, 1000w are limited to 450w.

"
The official CORSAIR 12VHPWR 600W PSU cable is available to order immediately from the CORSAIR webstore. Orders will ship beginning September 29th.
To check if your PSU is compatible with the CORSAIR 12VHPWR 600W PSU cable, please check https://www.corsair.com/12vhpwr
*600W load requires a 1200W rated CORSAIR PSU or higher. 450W load requires 1000W or higher. 300W load requires 750W or higher. 
"


https://www.corsair.com/newsroom/press-release/ready-to-go-beyond-fast-corsair-announces-compatibility-for-nvidia-40-series-graphics-cards
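Corsair's tiers quoted above reduce to a simple wattage lookup. A sketch of that mapping (the function name is mine; the thresholds are straight from the press release):

```python
def max_12vhpwr_load(psu_watts):
    """Max 12VHPWR cable load per Corsair's published tiers:
    600W load needs a 1200W+ PSU, 450W needs 1000W+, 300W needs 750W+."""
    if psu_watts >= 1200:
        return 600
    if psu_watts >= 1000:
        return 450
    if psu_watts >= 750:
        return 300
    return 0  # below Corsair's listed tiers; cable not rated
```

So an HX1000 lands in the 450 W tier, while an HX1200 or AX1600i gets the full 600 W rating.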


----------



## mirkendargen

BTK said:


> Anyone not getting the gsync logo on lg oled on their 4090 only VRR


I have an LG CX and G-Sync is working fine. I did have a weird issue where 3840x2160 was compressed and 4096x2160 looked normal...but after disabling the 4096 resolutions in CRU (which I normally do anyway) it was fine.


----------



## BTK

mirkendargen said:


> I have an LG CX and G-Sync is working fine. I did have a weird issue where 3840x2160 was compressed and 4096x2160 looked normal...but after disabling the 4096 resolutions in CRU (which I normally do anyway) it was fine.


Yes gsync is working on my c2 but in game optimizer it says VRR not gsync my 3090 showed gsync 

does yours say VRR or gsync

It doesn’t really matter what it says it’s still working properly


----------



## mirkendargen

BTK said:


> Yes gsync is working on my c2 but in game optimizer it says VRR not gsync my 3090 showed gsync
> 
> does yours say VRR or gsync
> 
> It doesn’t really matter what it says it’s still working properly


CX's don't have "game optimizer" so no idea. The procedure on a CX is to enable "instant game response" on the HDMI port to the computer, then the screen will show as G-sync-capable in NVCP.


----------



## BTK

Ok so this would be a question for a c1 then yes gsync is enabled in nvcp and VRR is working


----------



## J7SC

Finally had a chance to bench at less than 25 C ambient ! The Gigabyte Gaming OC likes the colder temps, but I still run into Hotspot temps at around 550+W....still managed to improve scores. In TimeSpy Extreme, I now have the somewhat dubious honour 🥴 to have the last surviving Ryzen 5K desktop in HoF. Overall, the card has a ton of potential, but step 1 is to get a decent water-block, and in a few months take a look at upgrading mobo / CPU / Ram as step 2...slippery slope and all that

















@BTK ...yes, on my C1 GSync in NVCP enabled and VRR working...


----------



## Arizor

Watching @J7SC making his purchase list


----------



## BTK

J7SC said:


> Finally had a chance to bench at less than 25 C ambient ! The Gigabyte Gaming OC likes the colder temps, but I still run into Hotspot temps at around 550+W....still managed to improve scores. In TimeSpy Extreme, I now have the somewhat dubious honour 🥴 to have the last surviving Ryzen 5K desktop in HoF. Overall, the card has a ton of potential, but step 1 is to get a decent water-block, and in a few months take a look at upgrading mobo / CPU / Ram as step 2...slippery slope and all that
> View attachment 2577594
> 
> View attachment 2577596
> 
> 
> @BTK ...yes, on my C1 GSync in NVCP enabled and VRR working...


I know VRR is working but does it say gsync when open VRR menu


----------



## J7SC

BTK said:


> I know VRR is working but does it say gsync when open VRR menu


Yes


----------



## BTK

Can you provide a pic? My 3090 said G-Sync but the 4090 only says VRR. Technically it's the same thing, but I think it must be a bug; it's been reported on the Nvidia forums and Reddit by multiple people.


----------



## Bilco

schoolofmonkey said:


> I have a HX1000 which is compatible with the cable, but after reading Corsair's press release it seems only the 1200w PSU's are compatible with the 600w power draw, 1000w are limited to 450w.
> 
> "
> The official CORSAIR 12VHPWR 600W PSU cable is available to order immediately from the CORSAIR webstore. Orders will ship beginning September 29th.
> To check if your PSU is compatible with the CORSAIR 12VHPWR 600W PSU cable, please check https://www.corsair.com/12vhpwr
> *600W load requires a 1200W rated CORSAIR PSU or higher. 450W load requires 1000W or higher. 300W load requires 750W or higher.
> "
> 
> 
> https://www.corsair.com/newsroom/press-release/ready-to-go-beyond-fast-corsair-announces-compatibility-for-nvidia-40-series-graphics-cards


I have a 1200hx so I should be fine with my understanding.


----------



## AKBrian

J7SC said:


> Finally had a chance to bench at less than 25 C ambient ! The Gigabyte Gaming OC likes the colder temps, but I still run into Hotspot temps at around 550+W....still managed to improve scores. In TimeSpy Extreme, I now have the somewhat dubious honour 🥴 to have the last surviving Ryzen 5K desktop in HoF. Overall, the card has a ton of potential, but step 1 is to get a decent water-block, and in a few months take a look at upgrading mobo / CPU / Ram as step 2...slippery slope and all that


Ah, darn. Mine's gone, too.

<cracks knuckles>
_ow_

Still there for 1x GPU TSE graphics score, though. Yay!


----------



## minittt

Hulk1988 said:


> I had the Waterforce card and I cannot recommend it. The temperatures are similar compared to my Strix and Suprim X and the memory temperature was even higher on the Waterforce card.


Was your Waterforce a 3090 by any chance, which uses VRAM on the other side of the PCB, which I think the 4090 doesn't have? I did find a few online unboxings of the Waterforce 4090 with temps showing at least 10C cooler. Gigabyte is not known for high quality, so that is a concern.


----------



## Nd4spdvn

BTK said:


> Anyone not getting the gsync logo on lg oled on their 4090 only VRR


Yes, I have reported this a few days ago here. On my CX when spamming the green button as it doesn't have a Game Optimizer menu it brings a small info window in one of the corners where it reports resolution, VRR mode etc. On 4090 my CX is only in VRR mode when Gsync engaged whereas when I had my 3080ti it reported Gsync just fine. It appears to work though from a variable refresh pov but not in proper Gsync mode. I hope it's just a driver issue and will be fixed.


----------



## J7SC

BTK said:


> Can you provide a pic my 3090 said gsync but the 4090 only says VRR technically it’s the same thing but I think it must be a bug it’s been reported on Nvidia forums and Reddit by multiple people


...I see what you mean re. the logo. Per pic below, on the very bottom is a recent pic from my 3090 Strix, while the rest refers to the 4090. Per quick test with MS 2020 and also 3DM (the warning they give you w/G-Sync on before you run the test) just now, it seems that G-Sync is definitely on, but w/o a NVidia driver update (or LG software update?), we just have to rely on VRR ON for now, as well as all the other confirmations shown below.


----------



## KillerBee33

J7SC said:


> ...I see what you mean re. the logo. Per pic below, on the very bottom is a recent pic from my 3090 Strix, while the rest refers to the 4090. Per quick test with MS 2020 and also 3DM (the warning they give you w/G-Sync on before you run the test) just now, it seems that G-Sync is definitely on, but w/o a NVidia driver update (or LG software update?), we just have to rely on VRR ON for now, as well as all the other confirmations shown below.
> View attachment 2577627


Same exact thing on my Samsung Q80T (2020): enabling G-Sync in NVCP enables VRR on the TV.








G-Sync vs G-Sync Compatible: What's the Difference?

techguided.com


----------



## Roldo

J7SC said:


> Finally had a chance to bench at less than 25 C ambient ! The Gigabyte Gaming OC likes the colder temps, but I still run into Hotspot temps at around 550+W....still managed to improve scores. In TimeSpy Extreme, I now have the somewhat dubious honour 🥴 to have the last surviving Ryzen 5K desktop in HoF. Overall, the card has a ton of potential, but step 1 is to get a decent water-block, and in a few months take a look at upgrading mobo / CPU / Ram as step 2...slippery slope and all that
> View attachment 2577594
> 
> View attachment 2577596
> 
> 
> @BTK ...yes, on my C1 GSync in NVCP enabled and VRR working...


30°C delta between GPU edge/hotspot is a lot...
What's it like at stock?


----------



## KedarWolf

Can't wait to get my Strix OC 4090.

I have the 7950x, ASROCK X670E Taichi, A-Die DDR5 with an unlocked PMIC, get the CPU mounting kit I need for them Tuesday, but the 4090 will be at least a few more weeks before they have stock to honour my preorder. 

Edit: I'm only going to update my build signature as the build progresses.

Still actually using everything in it right now.


----------



## J7SC

Roldo said:


> 30°C delta between GPU edge/hotspot is a lot...
> What's it like at stock?


12 C - 14 C delta stock, just under 20 C delta at full power

Actual hotspot temp is the issue when it gets over 530 W or so...I am tempted to re-paste the cooler, but want to take it apart only once, ie. for the water-block when I can get one. Currently, I can't run a combined max known stable OC (with VRAM stock) or max known efficient VRAM (with GPU stock)...I don't know for sure, but I figure the GPU's memory controllers get all flustered at that hotspot temp...


----------



## Azazil1190

For those who have issues on LG OLEDs with the NVIDIA logo in Game Optimiser, try something:
Power on your TV first and then power on your PC.

*But have the TV on the PC input before turning on the PC


----------



## Toony90

From what I've observed on the internet, some Suprim Liquid X owners have a 600W power limit, while some, like me, have a 530W BIOS, so it turns out that MSI did not deliver this time... If I flash the 600W BIOS in the future and the card breaks, is the warranty still valid or not? From what I've heard, MSI is rather troublesome in terms of warranty. It's a pity that I could not buy an NVIDIA FE.


----------



## dreckschmeck

My Palit GameRock non-OC is not so bad.
I scored 27 088 in Port Royal
I scored 14 450 in Time Spy Extreme

Only a 450 watt PL, so no major overclocking. But hey, the memory can be clocked to +2000  Also, no coil whine at all.
I will run it at 78% PL for 24/7 though.


----------



## yzonker

That's interesting. I don't think I've ever seen anything other than VRR on my LG C1 including using both my 3090 and 3080ti.


----------



## yzonker

schoolofmonkey said:


> I have a HX1000 which is compatible with the cable, but after reading Corsair's press release it seems only the 1200w PSU's are compatible with the 600w power draw, 1000w are limited to 450w.
> 
> "
> The official CORSAIR 12VHPWR 600W PSU cable is available to order immediately from the CORSAIR webstore. Orders will ship beginning September 29th.
> To check if your PSU is compatible with the CORSAIR 12VHPWR 600W PSU cable, please check https://www.corsair.com/12vhpwr
> *600W load requires a 1200W rated CORSAIR PSU or higher. 450W load requires 1000W or higher. 300W load requires 750W or higher.
> "
> 
> 
> https://www.corsair.com/newsroom/press-release/ready-to-go-beyond-fast-corsair-announces-compatibility-for-nvidia-40-series-graphics-cards


I'm using the cable with my RM1000x and Asus TUF. No issues so far. I think that's just a recommendation, not a requirement. I ran the RM1000x with my 3090 on the KP 1kw bios for a long time without issue as well. It would pull 600w+ running TSE.


----------



## sweepersc

N19htmare666 said:


> Will this work is flashed to the gigabyte waterforce to unlock 600w?


Hard to tell, I've also yet to reach above 510W in any 3dmark benchmark 😅


----------



## Krzych04650

I don't think the theory of low demand due to the mining crash and poor economic conditions was correct at all. Just yesterday one of the retailers here put up 6 units of the TUF card, which has an MSRP of 9699, for 13999, and the dumbos still bought them. And that's not just in the EU, where the 4090 is like 2000 EUR for starters instead of $1600, but in freaking Poland, where this is the equivalent of almost four average monthly net salaries. Interesting times; anything sells regardless of everything.


----------



## 8472

630W


__ https://twitter.com/i/web/status/1584142869094629382


----------



## Vasoka

Got the Gigabyte Gaming 4090. It can go all the way to +280 on core in all benchmarks (FurMark / 3DMark etc.), but it sporadically crashes in gaming and shows big artifacts in AC: Valhalla maxed out unless it's brought down to +220. Wonder how normal that is? Benchmarks don't show artifacts; I have run an 8h+ loop of Time Spy Extreme with no issues. Kinda weird.


----------



## ALSTER868

Vasoka said:


> Got the Gigabyte Gaming 4090. It can go all the way to +280 on core in all benchmarks (FurMark / 3DMark etc.), but it sporadically crashes in gaming and shows big artifacts in AC: Valhalla maxed out unless it's brought down to +220. Wonder how normal that is? Benchmarks don't show artifacts; I have run an 8h+ loop of Time Spy Extreme with no issues. Kinda weird.


I'd be happy if I were you with a stable +220, or even more. Nothing weird about this behaviour, especially as it's one of the cheapest and not most vigorous cards as far as electrical specs are concerned.
What about the VRAM, how far does it go?


----------



## Vasoka

ALSTER868 said:


> I'd be happy if I were you with a stable +220, or even more. Nothing weird about this behaviour, especially as it's one of the cheapest and not most vigorous cards as far as electrical specs are concerned.


+1500, what do you mean "cheapest and not most vigorous card"? Kinda confused here ngl.


----------



## ALSTER868

Vasoka said:


> +1500, what do you mean "cheapest and not most vigorous card"? Kinda confused here ngl.


The Gigabyte Gaming OC is one of the cheapest 4090s, and it has a simpler VRM than other cards do.


----------



## ALSTER868

Vasoka said:


> +1500




Seeing what the Gigabyte Gaming owners say here, seems I'm gonna get the same card for myself; it looks to be very good for its price...


----------



## Vasoka

ALSTER868 said:


> The Gigabyte Gaming OC is one of the cheapest 4090s, and it has a simpler VRM than other cards do.


Ah, I see. It was the only card which I could snatch on day 1 in Bulgaria, and I wanted it instantly, + the reviews for it were very good so I took it, seems I made the right choice. 



ALSTER868 said:


> Seeing what the Gigabyte Gaming owners say here, seems I'm gonna get the same card for myself; seems to be very good for its price...


It was still expensive af here, hope you can get a better price. So you're saying it's normal for benchmarks to pass, but games to misbehave / artifact?


----------



## bmagnien

Vasoka said:


> Ah, I see. It was the only card which I could snatch on day 1 in Bulgaria, and I wanted it instantly, + the reviews for it were very good so I took it, seems I made the right choice.
> 
> 
> 
> It was still expensive af here, hope you can get a better price. So you're saying it's normal for benchmarks to pass, but games to misbehave / artifact?


It’s normal and can go either way. On my 3090, an unstable core OC caused program crashes. An unstable mem OC caused artifacts. A super unstable mem OC caused a full system lock. I could OC higher in Port Royal than in any game.

Now with the 4090, a high core OC causes artifacts, and a mem OC causes program crashes. So the opposite. Also, I can go past my max Port Royal OC in all games I’ve tested so far, but it's an instant crash in PR, so again the opposite behavior of my 3090.


----------



## RaMsiTo

Resolve 93% Power Limit Problem with Afterburner, MSI Tool Summary

Suprim X before (gaming BIOS):

After:


----------



## ALSTER868

Vasoka said:


> So you're saying it's normal for benchmarks to pass, but games to misbehave / artifact?


You took it quite far to +280; not every card can do that, but if it's stable at +220 in every app and the memory can hold +1500, then it's good.
Some frequencies can be benchable but unstable in games; it's absolutely normal. So you can enjoy your +220, again


----------



## Xavier233

Was anyone able to snag the ATX3.0 PSU with a single cable between the 4090 and the PSU? Having 4 cables makes the case look like a mess


----------



## Avet

MSI SUPRIM LIQUID X in the list says 600W limit while in TechPowerup bios limit is set to 530W. Did i miss some custom bios?


----------



## Glottis

Xavier233 said:


> Was anyone able to snag the ATX3.0 PSU with a single cable between the 4090 and the PSU? Having 4 cables makes the case look like a mess


Why buy a new PSU? All you need is a $20 12VHPWR cable from your PSU maker to get the same clean look. Here's the cable for Corsair PSUs that I'm using https://www.corsair.com/12vhpwr


----------



## Toony90

Avet said:


> MSI SUPRIM LIQUID X in the list says 600W limit while in TechPowerup bios limit is set to 530W. Did i miss some custom bios?


No. Simply, MSI suddenly changed the BIOS from 600W to 530W on the Suprim Liquid X for unknown reasons; some people who were lucky got a 600W BIOS card, but most people got a 530W BIOS card.


----------



## Xavier233

Glottis said:


> Why buy a new PSU? All you need is a $20 12VHPWR cable from your PSU maker to get the same clean look. Here's the cable for Corsair PSUs that I'm using https://www.corsair.com/12vhpwr


Because using that cable on a 1000w PSU limits your GPU to only 450w. I currently have a Corsair RM1000x. Kind of useless to buy a 600w card only to be limited to 450w by the PSU cabling you are using.


----------



## BTK

Regarding the LG VRR gsync logo issues I don’t know if just saying VRR vs saying gsync has any performance differences they both appear to work the same hopefully it will get sorted.


----------



## Laithan

Glottis said:


> Why buy a new PSU? All you need is a $20 12VHPWR cable from your PSU maker to get the same clean look. Here's the cable for Corsair PSUs that I'm using https://www.corsair.com/12vhpwr


I cannot seem to confirm if the Corsair AX1500i supports this cable.. Anyone know?


----------



## mirkendargen

Xavier233 said:


> Because using that cable on a 1000w PSU is limiting your GPU to only 450w. I currently have a corsair RM1000x. Kind of useless to buy a 600w card only to be limited to a 450w by the PSU cabling you are using.


That isn't how the cable works, the sense pins on it will always be grounded to signal 600w. That's a warning from Corsair that your PSU may not be able to handle a 600w GPU.


----------



## N19htmare666

8472 said:


> 630W
> 
> 
> __ https://twitter.com/i/web/status/1584142869094629382


Wonder if this can be flashed to msi liquid suprim and or the gigabyte waterforce


----------



## mirkendargen

N19htmare666 said:


> Wonder if this can be flashed to msi liquid suprim and or the gigabyte waterforce


Really just depends how the fan curve is setup between the cards, which channel the pump is on, etc.


----------



## Toony90

N19htmare666 said:


> Wonder if this can be flashed to msi liquid suprim and or the gigabyte waterforce


Probably yes, but the rpm speed of the fans on the Neptune is unknown. I wish this card could be purchased in the UK.


----------



## N19htmare666

Toony90 said:


> Probably yes, but the rpm speed of the fans on the Neptune is unknown. I wish this card could be purchased in the UK.


Same, it does look awesome! Was thinking of getting a friend to send it over to the UK, but then thought the warranty would be a pain if it was ever needed.


----------



## Locozero10

Flumpenlicht said:


> Just did it. Flashed the MSI Suprim x liquid BIOS to my Trio.
> Power Limit is now at 530 W with the 3x 8 pin.


does everything still work ok? Can you still control all your fans?


----------



## N19htmare666

Toony90 said:


> Probably yes, but the rpm speed of the fans on the Neptune is unknown. I wish this card could be purchased in the UK.


If it works, this would almost turn the Waterforce into the Neptune for the non-Asian markets. It would probably be enough to convince me to buy it over the Liquid Suprim with the 600w BIOS flashed.


----------



## Glottis

Xavier233 said:


> Because using that cable on a 1000w PSU is limiting your GPU to only 450w. I currently have a corsair RM1000x. Kind of useless to buy a 600w card only to be limited to a 450w by the PSU cabling you are using.


Well that's just false. Getting the full 600W on my RM1000x. Power slider goes up to 133% and I've personally seen card pulling 550W in Quake 2 RTX.


----------



## N19htmare666

Toony90 said:


> Probably yes, but the rpm speed of the fans on the Neptune is unknown. I wish this card could be purchased in the UK.


There's a fan speed curve here https://quasarzone.com/bbs/qc_qsz/views/1399250


----------



## LuckyImperial

My Liquid X is paired with a 12600k and I think it might be bottlenecking it, but the CPU core usage doesn't seem to indicate that. I'm not even seeing 4.9GHz effective.









I turned GSync off, which did help boost utilization. 

On another note...GPUZ says my Liquid X has a 480W/530W default/max power config. It also has zero coil whine for the record.


----------



## Xavier233

Glottis said:


> Well that's just false. Getting the full 600W on my RM1000x. Power slider goes up to 133% and I've personally seen card pulling 550W in Quake 2 RTX.


Are you using the cable from Corsair, or are you using 4 PSU cables? That's what makes the biggest difference.

If you are using 4 cables directly on the PSU, even if you have a 1000w PSU, of course you are getting the full 600w.


----------



## Xavier233

mirkendargen said:


> That isn't how the cable works, the sense pins on it will always be grounded to signal 600w. That's a warning from Corsair that your PSU may not be able to handle a 600w GPU.


That's not what's written on their website. They clearly say that if you have a 1000w PSU, the cable will output 450w, while if you have a 1200w+ PSU, the cable will output 600w.


----------



## Toony90

N19htmare666 said:


> If it works, this would almost turn the waterforce into the Neptune for the none Asian markets. Would probably be enough to convince me to buy over the liquid suprim with the 600w BIOS flashed


I have the Suprim Liquid X and honestly, if I could buy the FE 4090, or at least the Aorus Waterforce, I wouldn't buy the Suprim. The card is of course good, but because the radiator is only 240mm, the temperatures are lower, yet not much lower than on air-cooled cards, and the memory temperatures are higher than on air-cooled cards, plus the 530W BIOS. The only plus of the Suprim is its size compared to air-cooled cards.


----------



## yzonker

Xavier233 said:


> That's not what's written on their website. They clearly say that if you have a 1000w PSU, the cable will output 450w, while if you have a 1200w+ PSU, the cable will output 600w.


It's just misleading. They are just looking at what total system power might be and recommending a size. And I'd say it's correct. Although I haven't had any issues with my RM1000x, I would be very close to OCP if I ran something like yCruncher and Kombustor at the same time since my 12900k will pull 300w+. But since I never run anything that loads both the CPU and GPU fully at the same time, it's a non-issue. But if I were buying a new PSU, I would definitely get 1200w minimum, probably more just to be safe.


----------



## mirkendargen

Xavier233 said:


> That's not what's written on their website. They clearly say that if you have a 1000w PSU, the cable will output 450w, while if you have a 1200w+ PSU, the cable will output 600w.


I have the cable, it's extremely clear how the sense pins are wired. They are warning you out of an abundance of caution not to use a 600w GPU on a 1000w PSU.


----------



## Xavier233

So you guys are saying that if I get that cable, technically, I can still pull 600w through it on a 1000w PSU? If yes, their description is misleading, giving us the impression we need to get a bigger PSU.

Anyhow, my 12900K at full load, with everything else in the system, pulls about 350w (with the GPU idle). In the worst-case scenario, the GPU can draw 600w, so 600w + 350w = 950w. Would you guys consider a bigger PSU (at which point I would just get a 1350w PSU, or another ATX3.0 PSU which won't have a cable mess)? Or just get the 12VHPWR cable and call it a day?
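That headroom arithmetic is simple enough to sanity-check in a few lines (all wattages below are the rough figures from this post, not measurements):

```python
def psu_headroom(psu_watts: int, system_watts: int, gpu_watts: int) -> int:
    """Return the remaining PSU headroom in watts (negative means over budget)."""
    return psu_watts - (system_watts + gpu_watts)

# Hypothetical worst case from the post: 350 W CPU + rest of system, 600 W GPU.
print(psu_headroom(1000, 350, 600))  # 50 -> only 50 W of margin on a 1000 W unit
print(psu_headroom(1350, 350, 600))  # 400 -> comfortable margin on a 1350 W unit
```

With only 50 W of margin, transient spikes could still trip OCP, which is the practical argument for the bigger PSU.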


----------



## mirkendargen

Xavier233 said:


> So you guys are saying that if I get that cable, technically, I can still pull 600w through that cable on a 1000w PSU? If yes, their description is misleading, giving us the impression we need to get a bigger PSU.
> 
> Anyhow, my 12900K on full load, with everything else in the system pulls about 350w (with the GPU being idle). In worse case scenarios, the GPU can withdraw 600w + 350w = 950w. Would you guys consider a bigger PSU (and at which point I would just get a 1350w PSU or another ATX3.0 PSU which wont have a cable mess)? Or just get the 12v cable and call it a day?


The cable isn't "smart", it doesn't know what PSU it's plugged into. It just maps +12v and ground wires from the PSU to the GPU. The sense pins are hardwired to ground on the cable. Now Corsair may be misleading you for good reason because whether you *can* use a 600w GPU with a 1000w PSU is different than whether you *should*. If Corsair said "this will work but we don't recommend it" people will still try when Corsair may feel they shouldn't.


----------



## Clukos

LuckyImperial said:


> My Liquid X is paired with a 12600k and I think it might be bottlenecking it, but the CPU core usage doesn't seem to indicate that. I'm not even seeing 4.9GHz effective.
> View attachment 2577716
> 
> 
> I turned GSync off, which did help boost utilization.
> 
> On another note...GPUZ says my Liquid X has a 480W/530W default/max power config. It also has zero coil whine for the record.


Likely CPU limit, this is what I got with a 5800X3D and I think even this is CPU limited


----------



## Xavier233

mirkendargen said:


> The cable isn't "smart", it doesn't know what PSU it's plugged into. It just maps +12v and ground wires from the PSU to the GPU. Now Corsair may be misleading you for good reason because whether you *csn* use a 600w GPU with a 100w PSU is different than whether you *should*. If Corsair said "this will work but we don't recommend it" people will still try when Corsair may feel they shouldn't.


That's what I initially thought as well; since these PSUs were released well before the cables, physically they have no way of enforcing that 450w vs 600w limitation.

I see 2 advantages for the ATX3.0 PSUs though: much cleaner cable management, and a higher wattage capacity (1200w, 1350w, 1500w, etc.) for a bit more headroom.


----------



## KedarWolf

Xavier233 said:


> Because using that cable on a 1000w PSU is limiting your GPU to only 450w. I currently have a corsair RM1000x. Kind of useless to buy a 600w card only to be limited to a 450w by the PSU cabling you are using.


That PSU cable's two extensions plug right into your PSU, not into 8-pin PCI-e cables. And according to Corsair specs, each cable from the PSU can pull 324W.

If your PSU uses Corsair Type 4 cables, that is.

To clarify, each attached extension can pull up to 324W, so 2x324W is fine.


----------



## yzonker

Clukos said:


> Likely CPU limit, this is what I got with a 5800X3D and I think even this is CPU limited


No, the 12600k should be plenty for running Superposition. A faster CPU/mem might help the score slightly, but nowhere near enough to bring LuckyImperial's score up to where it should be. There's something else not set up correctly causing that low score.


----------



## yzonker

KedarWolf said:


> That PSU cable, the two extensions plug right into your PSU, not to 8-pin pci-e cables. And according to Corsair specs each cable from the PSU can pull 324W.
> 
> If your PSU uses Corsair Type 4 cables that is.


Yea, the Corsair cable installation is very clean. Really no different than an ATX 3.0 cable other than 2 plugs at the PSU instead of 1.


----------



## Xavier233

Do you guys think 1000w should be enough for a 4090 + 12900K?

Was anyone able to get that cable in Canada? It's out of stock everywhere.


----------



## KedarWolf

Xavier233 said:


> Do you guys think 1000w should be enough for a 4090 + 12900K?
> 
> was anyone able to get that Cable in Canada? Its out of stock everywhere


I have to use the 4x PCI-e cable adapter that will come with my Strix OC until Corsair has stock of that cable. Can't find it anywhere.

But I can actually do that, as I have an AX1600i.


----------



## Sheyster

BTK said:


> C2 with my 3090 I got one


I have a C2 and a CX. The game menu is very different on the C2. I don't recall seeing G-sync in the CX menu, only VRR.


----------



## AvengedRobix

Azazil1190 said:


> For those that they have issues on lg oleds and nvidia logo in game optimiser.Try something.
> Power on your tvs first and then power on the pcs
> 
> *But power on the tv at pc input before turn on the pc


I used this method with a 3090 and it works... and Windows sees the HDR certification.
Now with a 4090 it falls back to VRR and the certification is not available.


----------



## N19htmare666

LuckyImperial said:


> My Liquid X is paired with a 12600k and I think it might be bottlenecking it, but the CPU core usage doesn't seem to indicate that. I'm not even seeing 4.9GHz effective.
> View attachment 2577716
> 
> 
> I turned GSync off, which did help boost utilization.
> 
> On another note...GPUZ says my Liquid X has a 480W/530W default/max power config. It also has zero coil whine for the record.


why not use the bios linked below? 



sweepersc said:


> I checked the Suprim Liquid X vbioses uploaded on TPU (95.02.18.40.4C) and it does say 530W. However, that is not the vbios my card came with, mine is 95.02.18.00.CD. Here's the Nvidia SMI screenshots:
> Stock
> View attachment 2577279
> 
> 
> Max PL
> View attachment 2577280
> 
> I also tried uploading in TPU but it said that registration from certain countries are no longer accepted. I put it in GDrive, here is the link . Can someone upload this on TPU please?
> 
> Regards,
> Mark


----------



## N19htmare666

Clukos said:


> Likely CPU limit, this is what I got with a 5800X3D and I think even this is CPU limited
> View attachment 2577723


Please can someone run this benchmark who owns a 5950x using PBO (i.e. not an all core OC)?


----------



## Alemancio

So after 2900+ posts and 146 pages... *what are the best 4090s? divided by tiers?*


----------



## dreckschmeck

LuckyImperial said:


> My Liquid X is paired with a 12600k and I think it might be bottlenecking it, but the CPU core usage doesn't seem to indicate that. I'm not even seeing 4.9GHz effective.
> View attachment 2577716
> 
> 
> I turned GSync off, which did help boost utilization.
> 
> On another note...GPUZ says my Liquid X has a 480W/530W default/max power config. It also has zero coil whine for the record.


This is bad, I guess. I get 35000 with only a Ryzen 5800X3D.


----------



## mirkendargen

Alemancio said:


> So after 2900+ posts and 146 pages... *what are the best 4090s? divided by tiers?*


Tier 1: The one you can find in stock at MSRP and put in your computer
Tier 2: The rest


----------



## Panchovix

Alemancio said:


> So after 2900+ posts and 146 pages... *what are the best 4090s? divided by tiers?*


Try with cards that have 4x8 pin adapters, since you can use 600W. 3x8 is limited to 450W if I'm not wrong


----------



## ZealotKi11er




----------



## Alemancio

mirkendargen said:


> Tier 1: The one you can find in stock at MSRP and put in your computer
> Tier 2: The rest


Nah, I can get a Zotac at MSRP, but the Trinity is limited to roughly 700A, and Zotac has repeatedly shown us they do the bare minimum to comply with the Nvidia reference design. They won't get my money.



Panchovix said:


> Try with cards that have 4x8 pin adapters, since you can use 600W. 3x8 is limited to 450W if I'm not wrong


Thats a fair point.


----------



## N19htmare666

Alemancio said:


> So after 2900+ posts and 146 pages... *what are the best 4090s? divided by tiers?*


Any with a 600+ BIOS


----------



## 8472

Alemancio said:


> So after 2900+ posts and 146 pages... *what are the best 4090s? divided by tiers?*


Best for what? Overclocking? Noise? Case compatibility? Included accessories? Waterblock selection?


----------



## a_Criminai

If anyone is using the $1599 Zotac 4090 trinity I can confirm the zotac 4090 extreme vbios works fine on it. 

Also side note, apparently oc scanner is capped at +165 MHz. Nvidia should change that.


----------



## yzonker

ZealotKi11er said:


>


Now will they do the VRAM artifact bug? LOL Otherwise they'll just look like they suck.


----------



## yzonker

Gotta know the tricks.


----------



## J7SC

8472 said:


> Best for what? Overclocking? Noise? Case compatibility? Included accessories? Waterblock selection?


...to draw impartial and intelligent 'rankings', one would have to have all available (!) cards in for testing by the same person in the same system setup before one could even divide them into tiers and rankings. With this generation, the 'spread' in performance seems narrower than before. Apart from availability and price, there is also size - actually being able to fit these behemoths into your particular setup - and finally your current PSU etc.

I had researched the Gigabyte Gaming OC and Asus TUF/OC just enough to be ready to jump on them when either came along within roughly the same price range. Under different circumstances, I would have opted for a 4090 Strix as it seems to be the 'KingPin' of this gen (once water-cooled)...but 4090 Tis are out sooner or later, and AMD may yet spring a surprise with a top-end mGPU (rumours swirl w/ up to 24K+ cores) in the winter / spring. I find the 4090 AIO water-cooled versions intriguing re. space-saving, but am disappointed a bit re. their VRM / VRAM cooling numbers and somewhat puny rad sizes / materials / pumps.

The 4090 Gigabyte Gaming OC ('GGOC') appeared here at only a US$20 premium over the FE...if the Asus TUF / OC would have been available for a similar price (or at all), I would have considered that one as well. Anyhow, the GGOC turned out to be a gem... 20+4 VRM (which I would call 'decent'), and the post-temp-bin-drop effective speed is still in the low 3000 range for most of the tests after starting much higher pre-temp-bin-drop. It also has no discernable coil-whine and comes with a dual vbios and a 600W rating. To get close to that though and sustain it with most cards of this genre, you will need sizable custom water-cooling, IMO, for benching...for gaming, most models from a quality vendor should do at around 450W max...


----------



## fitnessgrampacertest

Is anybody else having issues controlling the RGB Lights around the end of their ROG Strix RTX 4090? I can get the logo RGB's to be set/controlled just fine, but the RGB's around the end of the card are completely uncontrollable/unresponsive.


----------



## jcde7ago

Toony90 said:


> I have Suprim Liquid X and honestly if I could buy FE 4090, or at least Aorus Waterforce I wouldn't buy Suprim. The card is of course good, but due to the fact that it is only 240mm, the temperatures are lower but not much lower than air-cooled cards, and the memory temperatures are higher than in air-cooled cards, plus 530W bios. The only plus of Suprim is its size compared to air-cooled cards.


I'm seeing maybe 3-5c higher memory temps on my own Suprim Liquid X than a lot of the better air coolers out there...completely inconsequential given how high memory temps are rated to be fine at. Like, not even worth worrying about at all (this is nothing like how it was with the poor design of the initial 3090 FEs and mem temps skyrocketing past 90c).

I still don't understand why people are going crazy over the 600w Suprim Liquid X bios though; I have the 530w bios and I find there to be 0 incentive to flash to the 600w one. We're talking maybe 3-5% gains to draw another 150w over the default max.

It was already posted in this thread, but the play with these 4090s is to undervolt to a comfortable, stable voltage and get essentially the same as stock performance while almost halving the power draw. My Suprim Liquid X is running at 2760mhz w/ a +1000 mem OC, completely stable at 0.950v (max default factory OC clock was 2820mhz). Winter isn't here yet in Cali and my card "struggles" to get past 57c @100% load no matter what I throw at it.
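For a rough feel of why the undervolt barely costs performance: dynamic power scales roughly with f·V², so the voltage drop alone buys most of the savings, and the rest comes from the card no longer boosting into its power limit. A sketch with illustrative numbers only (0.95 V / 2760 MHz vs an assumed 1.05 V / 2820 MHz stock point, leakage ignored):

```python
def relative_power(v_new: float, v_old: float, f_new: float, f_old: float) -> float:
    """Approximate dynamic power ratio using the CMOS rule of thumb P ~ f * V^2."""
    return (f_new / f_old) * (v_new / v_old) ** 2

ratio = relative_power(0.95, 1.05, 2760, 2820)
print(f"{ratio:.2f}")  # ~0.80 -> roughly 20% less power for about 2% less clock
```

The real-world savings people report are larger than this first-order estimate, since stock cards also burn power chasing their boost/power limit.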


----------



## Glottis

Laithan said:


> I cannot seem to confirm if the Corsair AX1500i supports this cable.. Anyone know?


Did you take a look at compatibility table? https://www.corsair.com/eu/en/psu-cable-compatibility
All AXi models are supported (except AX1600i)


----------



## yzonker

J7SC said:


> ...to draw impartial and intelligent 'rankings', one would have to have all available (!) cards in for testing by the same person in the same system setup before one could even divide them into tiers and rankings. With this generation, the 'spread' in performance seems narrower than before. Apart from availability and price, there is also size - actually being able to fit these behemoths into your particular setup - and finally your current PSU etc.
> 
> I had researched the Gigabyte Gaming OC and Asus TUF/OC just enough to be ready to jump on them when either came along within roughly the same price range. Under different circumstances, I would have opted for a 4090 Strix as it seems to be the 'KingPin' of this gen (once water-cooled)...but 4090 Tis are out sooner or later, and AMD may yet spring a surprise with a top-end mGPU (rumours swirl w/ up to 24K+ cores) in the winter / spring. I find the 4090 AIO water-cooled versions intriguing re. space-saving, but am disappointed a bit re. their VRM / VRAM cooling numbers and somewhat puny rad sizes / materials / pumps.
> 
> The 4090 Gigabyte Gaming OC ('GGOC') appeared here at only a US$20 premium over the FE...if the Asus TUF / OC would have been available for a similar price (or at all), I would have considered that one as well. Anyhow, the GGOC turned out to be a gem... 20+4 VRM (which I would call 'decent'), and the post-temp-bin-drop effective speed is still in the low 3000 range for most of the tests after starting much higher pre-temp-bin-drop. It also has no discernable coil-whine and comes with a dual vbios and a 600W rating. To get close to that though and sustain it with most cards of this genre, you will need sizable custom water-cooling, IMO, for benching...for gaming, most models from a quality vendor should do at around 450W max...


Yea, my advice is to just buy the one that fits in the case, looks good in the case (if they care), etc... PL isn't an issue really now that we can flash. FE can't be flashed, but has 600w out of the box. This will only be an issue if an XOC bios comes along AND you volt mod. Only other reason to flash would be to fix some incompatibility with the stock bios. I guess I wouldn't buy an FE for this reason alone since other cards are just as good.

Seems like the MSI's are getting the most positive comments in regards to coil whine. TUF seems to have a lot of reports of whine. Mine has some, although I only notice it when I benchmark with low fans (of course I run max fans when I'm really trying). Games have enough sound that I don't hear it. I have a Phanteks Enthoo Pro 2 which is mesh top/front/bottom, so not a particularly quiet case and it's sitting 2-3 feet from me on my desk. It'll be noticeable once I block the card and don't have a lot of fan running during benching.


----------



## jcde7ago

yzonker said:


> Seems like the MSI's are getting the most positive comments in regards to coil whine.


I don't have any coil whine with my Suprim Liquid X, at least none that I can discern over my 10x case fans running at 1k rpm...I did get a couple people on Reddit though that were saying they could hear some coil whine from their Liquid X's...maybe it all just depends on how sensitive we are to it.


----------



## BeZol

yzonker said:


> Now will they do the VRAM artifact bug? LOL Otherwise they'll just look like they suck.


They had the VRAM artifact bug, but the bench crashed at the end (around 2 hours 55 minutes into the stream, maybe).
The same thing happened to me in Time Spy Extreme Graphics Test 1, and I had a +700 graphics score at the end (like +10% fps across the whole of GT1).


----------



## N19htmare666

jcde7ago said:


> I'm seeing maybe 3-5c higher memory temps on my own Suprim Liquid X than a lot of the better air coolers out there...completely inconsequential given how high memory temps are rated to be fine at. Like, not even worth worrying about at all (this is nothing like how it was with the poor design of the initial 3090 FEs and mem temps skyrocketing past 90c).
> 
> I still don't understand why people are going crazy over the 600w Suprim Liquid X bios though; I have the 530w bios and I find there to be 0 incentive to flash to the 600w one. We're talking maybe 3-5% gains to draw another 150w over the default max.
> 
> It was already posted in this thread but the play with these 4090s is to undervolt to a comfortable, stable voltage and get essentially the same as stock performance while almost halving the power draw. My Suprim Liquid X is running at 2760mhz w/ a +1000 mem OC, completely stable at 0.950V (max default factory OC clock was 2820mhz). Winter isn't here yet in Cali and my card "struggles" to get past 57c @100% load no matter what I throw at it.
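The undervolting numbers quoted above line up with simple dynamic-power scaling (P ≈ C·V²·f). A rough sketch, assuming a ~1.05V stock voltage (my assumption, not from the post; the 0.95V / 2760MHz figures are from the quote):

```python
# Rough dynamic-power scaling estimate for an undervolt: P ~ C * V^2 * f.
# The ~1.05 V stock voltage is an assumption; leakage/static draw is ignored.
def relative_power(v_new, f_new, v_old, f_old):
    """Return new dynamic power as a fraction of old."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

ratio = relative_power(0.950, 2760, 1.05, 2820)
print(f"~{(1 - ratio) * 100:.0f}% lower dynamic power")  # ~20% lower dynamic power
```

The dynamic term alone only explains about a 20% drop; the near-halving people report in practice presumably also comes from the card shedding its highest boost bins and power-hungry transients.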


----------



## yzonker

BeZol said:


> They had the VRAM artifact bug, but bench crashed at the end. (around 2 hours 55minutes of the stream maybe)
> Same stuff happened to me at Time Spy Extreme Graphics Test 1 and had +700 Graphics Score at the end (like +10% fps whole GT1).


The real trick is to hit the sweet spot and get it to complete. The mem OC that gets it there moves around a little too, probably due to temperature and maybe other factors. It's easier in PR since it's shorter. I managed to do it a couple of times. I haven't bothered in any of the others as it's just a PITA and not worth it IMO.


----------



## KedarWolf

Glottis said:


> Did you take a look at compatibility table? https://www.corsair.com/eu/en/psu-cable-compatibility
> All AXi models are supported (except AX1600i)


It's a Type 4 cable, and all the cables work on Corsair PSUs; the only one that is pinned differently is the 24-pin power cable, the rest are interchangeable.

From the Corsair website.

Disclaimer: The only difference between Type 3 and Type 4 cables is the pinout of the 24-pin ATX cable; all other cables (SATA, PCIe, etc) are the same.


----------



## mirkendargen

KedarWolf said:


> It's a Type 4 cable and all the cables work on Corsair PSUs, the only one that is pinned different is the 24-pin power cable, the rest are interchangeable.
> 
> From the Corsair website.
> 
> Disclaimer: The only difference between Type 3 and Type 4 cables is the pinout of the 24-pin ATX cable; all other cables (SATA, PCIe, etc) are the same.


Triple verify the pinout ahead of time, but I think Evga PSU PCIE cables are also the same pinout. Others may be as well.


----------



## J7SC

..fyi, just checked all the major computer retail chains and their branches in this general metro area, and all 4090s are still sold out (for now). I am sure that more shipments are on the way, though - still happy that I went for it when I found out (in this thread...) that one of the cards on my shortlist was available locally then (and for the best price - so far).


----------



## yzonker

They hit it! 30k+. Lol.


----------



## J7SC

yzonker said:


> They hit it! 30k+. Lol.


3360MHz, -35 C is _all_ it took  ...just goes to show how much potential is in those cards if you can cool them


Spoiler


----------



## BigMack70

Alemancio said:


> So after 2900+ posts and 146 pages... *what are the best 4090s? divided by tiers?*


My Windforce 4090 is the best 4090 because it's my 4090. It's the top tier GPU in my house, definitely a full tier above my 3090.


----------



## bmagnien

yzonker said:


> They hit it! 30k+. Lol.


It’s 100% the artifact bug; they gained 1k points with almost no change to core or mem frequency. I was benching alongside them and got a few artifact runs. At the top-down outside section where the fps are highest, I went from a 205.6 max fps to over 220 fps during an artifacting run. Not gonna opine about whether it’s legit or not (3DMark will decide that), but that’s 100% what is happening and what happened on the Gamers Nexus stream.


----------



## yzonker

J7SC said:


> 3360MHz, -35 C is '''all''' it took  ...just goes to show how much potential is in those cards if you can cool them
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2577770


They did say they were going to verify with the 3dmark folks that it's valid, although maybe all they'll do is upload it. It was 302xx. Highest score so far. 

This supports what I've been saying though. Gains from cooling are going to be much less than 30 series due to the memory bottleneck. I doubt I can do +1800 on the mem with the chiller unless I can keep it warmer. 

An XOC bios would be handy for that since they always keep the mem running at full speed. Not sure how that compares to just locking the core at 1100mv which keeps it at full speed too (or maybe just setting "prefer max performance" in NVCP).


----------



## yzonker

bmagnien said:


> It’s 100% the artifact bug; they gained 1k points with almost no change to core or mem frequency. I was benching alongside them and got a few artifact runs. At the top-down outside section where the fps are highest, I went from a 205.6 max fps to over 220 fps during an artifacting run. Not gonna opine about whether it’s legit or not (3DMark will decide that), but that’s 100% what is happening and what happened on the Gamers Nexus stream.


Yea, that's what I meant by hitting "it". Should have been clearer, probably. Interesting though: I did it twice and only got 500pts. That explains why I'm at 290xx while others are mid-29k.

Maybe this will give some visibility to the issue and at least get an official position from the 3dmark folks.


----------



## Hanks552

Can somebody link some info about GPU and VRAM stages? I see that the MSI Suprim has the highest numbers, but I'm wondering what that means.
I have an MSI Trio, which has the lowest number; wondering if I should sell it and try to get a Suprim, but I want to understand these stages first.


----------



## bmagnien

yzonker said:


> Yea that's what I meant by hitting "it". Should have been more clear probably. Interesting though, I did it twice and only got 500pts. Explains why I'm 290xx and other are mid 29k.
> 
> Maybe this will give some visibility to the issue and at least get an official position from the 3dmark folks.


I got a 28.7 run with just a cold room, and my artifacting run was looking like a mid 29 and crashed right at the end. Trying for a complete artifact run now lol just for kicks.


----------



## ZealotKi11er

J7SC said:


> 3360MHz, -35 C is '''all''' it took  ...just goes to show how much potential is in those cards if you can cool them
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2577770


That's pathetic for LN2. These cards don't have much left to OC. RDNA3, if it's anything like the rumors, will be better. Even Ampere scaled much better with LN2. We have cards on air doing 3100MHz.


----------



## J7SC

yzonker said:


> They did say they were going to verify with the 3dmark folks that it's valid, although maybe all they'll do is upload it. It was 302xx. Highest score so far.
> 
> This supports what I've been saying though. Gains from cooling are going to be much less than 30 series due to the memory bottleneck. I doubt I can do +1800 on the mem with the chiller unless I can keep it warmer.
> 
> An XOC bios would be handy for that since they always keep the mem running at full speed. Not sure how that compares to just locking the core at 1100mv which keeps it at full speed too (or maybe just setting "prefer max performance" in NVCP).


...I was actually surprised at their relatively low VRAM speed in that run. As posted before, my biggest challenge is to keep hotspot below 90C (general GPU at around 72C, depending on ambient conditions). Also, 3DM issued a patch / update (yesterday?), so maybe that addresses the artifact issue.

...it's the final 100W-150W for benching which are the issue (though psycho RT in CP2077 can also do it at lower PL). That's nothing that couldn't be fixed with a decent water-block, a good paste and a good mount. That said, I obviously don't know how a 4nm chip with this many additional transistors will behave compared, for example, to my 8nm 3090 w/ 520W in a big loop - time will tell.


----------



## yzonker

J7SC said:


> ...I was actually surprised at their relatively low VRAM speed in that run. As posted before, my biggest challenge is to keep hotspot below 90C (general GPU at around 72C then depending on ambient conditions). Also, 3DM issued a patch / update (yesterday?) so may be that looks at the artifact issue.
> 
> ...it's the final 100W-150W for benching which are the issue (though psycho RT in CP2077 can also do it at lower PL). That's nothing that couldn't be fixed with a decent water-block, a good paste and good mount. That said, I don't obviously know how a 4nm w/ this many additional transistors will behave like compared for example to my 8nm 3090 w/520W in a big loop - time will tell.


Like I mentioned before, when I was cooling my case with my chiller and rads, I lost about 100 on the mem clock. I didn't notice how low the mem temp got at idle. Probably in the 20-30C range is my guess. I dropped 10C or a little more by doing that. This is the issue the LN2 guys hit with the 3090 Ti: the 2GB modules don't like the cold. JayzTwoCents hit it in the video I posted a while back using his portable A/C unit too.

Well my block will be here Wednesday. Haven't decided if I'm going to install the card right away though because of this. I'd really like to do a cool/cold weather test first just to see what the mem does. But it's warm again here right now. I can see the possibility of using cheap pads though just to keep the mem temp up.


----------



## gooface

So I am picking up the Giga OC on Tuesday, is it considered a solid card still? 

I think I will use the quiet bios on it since I don't really care about overclocking it, and on the quiet bios it still seems to cool better than the FE, so it should be great to run 24/7.

Only other card I wish I could get my hands on is the FE, but that seems like a near impossible battle at this point.


----------



## bmagnien

gooface said:


> So I am picking up the Giga OC on Tuesday, is it considered a solid card still?
> 
> I think I will use the quiet bios on it since I dont really care about overclocking it, and the quiet bios seems to cool better than the FE so it should be great to run 24/7.
> 
> Only other card I wish I could get my hands on is the FE, but that seems like a near impossible battle at this point.


I’m completely satisfied with my giga oc thus far. Coming from evga I was a bit concerned but so far so good. Will just get the ek block next month and call it a day as I’ve had good results with them thus far despite others’ opinions.


----------



## J7SC

gooface said:


> So I am picking up the Giga OC on Tuesday, is it considered a solid card still?
> 
> I think I will use the quiet bios on it since I dont really care about overclocking it, and the quiet bios seems to cool better than the FE so it should be great to run 24/7.
> 
> Only other card I wish I could get my hands on is the FE, but that seems like a near impossible battle at this point.


I really like the Giga Gaming OC and would recommend it (I typically buy either Asus or Gigabyte GPUs). On my sample at least, there is no coil whine and it is very quiet in gaming - until you go beyond ~450-500W, then it gets toasty and loud, like most of these air-cooled 4090s.


----------



## ZealotKi11er

J7SC said:


> ...I was actually surprised at their relatively low VRAM speed in that run. As posted before, my biggest challenge is to keep hotspot below 90C (general GPU at around 72C then depending on ambient conditions). Also, 3DM issued a patch / update (yesterday?) so may be that looks at the artifact issue.
> 
> ...it's the final 100W-150W for benching which are the issue (though psycho RT in CP2077 can also do it at lower PL). That's nothing that couldn't be fixed with a decent water-block, a good paste and good mount. That said, I don't obviously know how a 4nm w/ this many additional transistors will behave like compared for example to my 8nm 3090 w/520W in a big loop - time will tell.


It's 5nm, not 4nm.


----------



## yzonker

Hanks552 said:


> Can somebody link some info about GPU and VRAM stages? I see the the msi suprim have the highest numbers but I’m wondering what does that mean.
> I have a msi trio that have the lowest number, wondering if I should sell it and try to get a suprim but i want to understand this stages first


I'm going to go against what others have been saying and state that it doesn't matter at all unless you're going to shunt and volt mod the card. All of them have enough VRM to handle 600w. 

The only thing you get is a small reduction in VRM heat at those high power levels, since it's theoretically running at a more efficient point. It's similar to PSU efficiency: best efficiency is somewhere in the middle of the load range. But these cards don't even pull 500w in most games, so who really cares.
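The "more phases = less VRM heat" point can be sketched with a toy conduction-loss model: total current splits across phases, and the I²R loss per phase falls faster than the phase count grows. All numbers below (per-phase resistance, total current) are illustrative assumptions, not measurements from any card:

```python
# Toy VRM conduction-loss model: total current splits evenly across phases,
# loss per phase ~ I^2 * R, so more phases -> lower total conduction loss.
# r_ohms and the 600 A figure are illustrative assumptions only.
def vrm_conduction_loss(total_current_a, phases, r_ohms=0.002):
    per_phase = total_current_a / phases
    return phases * per_phase ** 2 * r_ohms

i = 600 / 1.0  # ~600 W at ~1.0 V core -> very roughly 600 A
loss_18 = vrm_conduction_loss(i, 18)
loss_24 = vrm_conduction_loss(i, 24)
print(f"18 phases: {loss_18:.0f} W, 24 phases: {loss_24:.0f} W")  # 18 phases: 40 W, 24 phases: 30 W
```

So going from 18 to 24 phases only trims a handful of watts of waste heat at full tilt, which fits the point above: it matters for shunt/volt-mod territory, not for stock gaming loads.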


----------



## jcde7ago

Hanks552 said:


> Can somebody link some info about GPU and VRAM stages? I see the the msi suprim have the highest numbers but I’m wondering what does that mean.
> I have a msi trio that have the lowest number, wondering if I should sell it and try to get a suprim but i want to understand this stages first


They're referencing the actual hardware in the cards; MSI Suprim line has more physical VRM phases than any other 4090 available right now.

Chart here, credit to u/kirkle8


----------



## EarlZ

I am very surprised that there is no VRM information yet on the Aorus Master 4090, which has been at retail for a while.


----------



## J7SC

ZealotKi11er said:


> Its 5nm not 4nm.


?? 
Multiple sources have it at 4nm !


----------



## schoolofmonkey

jcde7ago said:


> They're referencing the actual hardware in the cards; MSI Suprim line has more physical VRM phases than any other 4090 available right now.


Do better/higher VRM stages mean a higher and more stable overclock?
I've got the Galax, which can hit 3GHz on the GPU, but the VRAM doesn't go as high as on some of the other cards. From what I see on the front page it has 4×50A (200A) VRAM stages and 18×50A (900A) GPU core stages, still running a 450-510W BIOS.


----------



## mirkendargen

ZealotKi11er said:


> That pathetic with LN2. These cards dont have much to OC. RDNA3 if anything like RDNA3 will be better. Even Ampere was much better with LN2. We have cards on air doing 3100MHz.


Call me crazy, but cards performing close to their best possible in real-world, everyday-achievable conditions is a win to me, not something to complain about lol.




J7SC said:


> ??
> Multiple sources have it at 4nm !


The process node is called "4N", which I think people mistake to mean 4nm (maybe intentionally on TSMC's part), but I also recall reading it's actually 5nm.








NVIDIA Ada Lovelace GPUs To Have A Node Advantage Over AMD RDNA 3, Rumored To Utilize TSMC 4N Process (wccftech.com)
NVIDIA Ada Lovelace GPUs for next-gen GeForce RTX 40 Gaming graphics cards will have a node advantage over AMD RDNA 3 in the form of TSMC 4N.

Nvidia clarifies: The TSMC 4N used by the RTX 40 GPU is a 5nm process - TechGoing (www.techgoing.com)
According to Hong Kong media HKEPC, Nvidia clarified today that the RTX 40 GPU uses TSMC’s 4N 5nm process, not the 4nm process, due to a large number of media writing errors. According to reports, the Nvidia RTX 40 GPU is based on the 5nm TSMC 4N process; the N in “4N” should stand for […]


----------



## fitnessgrampacertest

ZealotKi11er said:


> Its 5nm not 4nm.


Absolutely false. It's 100% a 4nm process, NOT 5nm. No question about it. No debate. You wanna see the GPU-Z readout?


----------



## mirkendargen

fitnessgrampacertest said:


> Absolutely false. Its 100% a 4nm process, NOT a 5nm. No question about it. No debate. You wanna see the GPU-Z readout?


Ummm....GPU-Z isn't magically scanning the lithography of the chip. It's just spitting out whatever its database says, which could easily be wrong given the confusion over the node name.

I looked up the Hopper white paper, and Nvidia is being deliberately deceptive in it. I'm pretty sure they'd say 4nm here if it were 4nm, but they know it's 5nm, so they don't and let people assume. This is comparing the Ampere compute cards to the Hopper compute cards. The ratio of the transistor counts is also much closer to 5/7 than 4/7, another clue.








NVIDIA Hopper Architecture In-Depth | NVIDIA Technical Blog (developer.nvidia.com)
Everything you want to know about the new H100 GPU.

At the end of the day who really cares though...? The card performs the way it does, and the way it performs is good. Published node size is really just marketing propaganda at this point.
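For what it's worth, the transistor-count ratio argument above can be sanity-checked against published die figures. A quick sketch, using the commonly cited GA100 (~54.2B transistors, ~826mm²) and GH100 (~80B, ~814mm²) numbers; treat them as approximate:

```python
# Transistor-density ratio between Hopper (GH100) and Ampere (GA100)
# compute dies, using approximate publicly cited figures.
ga100 = 54.2e9 / 826   # transistors per mm^2, A100 (7nm)
gh100 = 80.0e9 / 814   # transistors per mm^2, H100
ratio = gh100 / ga100
print(f"density ratio ~{ratio:.2f} (7/5 = {7/5:.2f}, 7/4 = {7/4:.2f})")
# density ratio ~1.50 (7/5 = 1.40, 7/4 = 1.75)
```

A ~1.5x density jump sits much closer to a 7nm-to-5nm-class step than a 7nm-to-4nm one, which is the clue being pointed at above.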


----------



## J7SC

fitnessgrampacertest said:


> Absolutely false. Its 100% a 4nm process, NOT a 5nm. No question about it. No debate. You wanna see the GPU-Z readout?





mirkendargen said:


> Ummm....GPU-Z isn't magically scanning the lithography of the chip. It's just spitting out whatever it's database says, which could easily be wrong in the confusion with the node name.


Now I'm going for a late lunch...


----------



## mirkendargen

J7SC said:


> Now I'm going for a late lunch...
> View attachment 2577786
> 
> 
> View attachment 2577787


N4 != 4N. I think that's the mass confusion.


----------



## Benjit

mirkendargen said:


> Ummm....GPU-Z isn't magically scanning the lithography of the chip. It's just spitting out whatever its database says, which could easily be wrong given the confusion over the node name.
> 
> I looked up the Hopper white paper, and Nvidia is being deliberately deceptive in it. I'm pretty sure they'd say 4nm here if it were 4nm, but they know it's 5nm, so they don't and let people assume. This is comparing the Ampere compute cards to the Hopper compute cards:
> 
> 
> NVIDIA Hopper Architecture In-Depth | NVIDIA Technical Blog (developer.nvidia.com)
> Everything you want to know about the new H100 GPU.
> 
> View attachment 2577785
> 
> At the end of the day who really cares though...? The card performs the way it does, and the way it performs is good. Published node size is really just marketing propaganda at this point.


----------



## Arizor

Lads, this is a dead end. For any claim of 4nm from the media, I can find one stating categorically it's 5nm. We won't know for sure until there's a direct quote from TSMC or Jensen is caught for an interview whilst airing out his leather.


----------



## ZealotKi11er

Thank you, good sir, for proving me right. Even the 5nm is probably not "real" 5nm. They are just marketing names at this point. The only thing we know will use 4nm is the Phoenix (PHX) APU next year.


----------



## bmagnien

Arizor said:


> Lads, this is a dead end. For any claim of 4nm from the media, I can find one stating categorically it's 5nm. We won't know for sure until there's a direct quote from TSMC or Jensen is caught for an interview whilst airing out his leather.
> 
> View attachment 2577791


This sounds correct:
“The NVIDIA Ada Lovelace RTX 40 series uses a special TSMC 4N process, a version modified for NVIDIA, which actually belongs to the TSMC 5nm process family.
The TSMC 5nm family is actually very rich, including the N5 standard version, N5P, N4, N4P, N4X, and NVIDIA's performance-enhanced 4N.”


----------



## yzonker

That score is up from the GN stream. +1500 mem isn't bad, although he was using a heater to keep it warmer. 









I scored 30 289 in Port Royal


AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 31866 MB, 64-bit Windows 11




www.3dmark.com


----------



## Alelau18

Foundries just like to do these things intentionally, and it is indeed misleading: TSMC 4N != N4 (which is what is known as 4nm); 4N (what Ada uses) is an optimized N5 (which is 5nm). The naming of process nodes is a mess, which is why I liked the change Intel made, trying to be consistent: similar transistor density = same number.


----------



## KedarWolf

yzonker said:


> That score is up from the GN stream. +1500 mem isn't bad, although he was using a heater to keep it warmer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 30 289 in Port Royal
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 31866 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


He uses the motherboard I bought, X670E Taichi.

I get my waterblock mounting kit Tuesday, but the 4090 will be a few weeks or more. 

3DMark reports the 7950X as 6nm; the compute dies are actually 5nm, though (only the I/O die is 6nm).


----------



## mirkendargen

Alelau18 said:


> Foundries just like to intentionally do those things and is indeed misleading, TSMC 4N != N4 (which is what is known as 4nm), 4N (what Ada uses) is an optimized N5 (which is 5nm). Naming of process nodes is mess, which is why I liked the change Intel did and try to be consistent with similar transistor density = same number.


Yup. I think the most conclusive evidence is that, given node size is a marketing spec these days, Nvidia would be screaming from the rooftops that Ada/Hopper are 4nm if they were. Can anyone find Nvidia saying anywhere that they are? All they ever say is the "4N process" and let people assume, because they know that does the marketing for them without false advertising. As proven by the general assumption in this thread and all the media articles people have linked, the plan is working as intended.

I still don't care one bit about which the card actually is. If it was 1um (yes micrometer) and performed the way it does, awesome, I'll take it. We should all want it to be on the bigger node, because that leaves more room for improvement next time around on TSMC 2-3nm.


----------



## ZealotKi11er

Any case recommendations for running these 4090s on air? I want to build a second rig and keep my water-cooled O11 XL with the 6900 XT until the 7900 XT, and water-cool that.


----------



## J7SC

ZealotKi11er said:


> Thank you good sir for proving me right. Even 5nm probably not "real" 5nm. They are just marketing names at this point. Only thing we know that will use 4nm is PHX APU next year.


...proving you right? Good grief.
At best, it seems somewhat ambiguous; at worst, it was a waste of time. At the end of the day, it was just about cooling over 70 billion transistors on a shrunken node that most literature and reference materials refer to as a 4nm process...


----------



## yzonker

J7SC said:


> ...proving you right ? Good grief.
> At best, it seems somewhat ambiguous; at worst, it was a waste of time. At the end of the day, it was just about cooling over 70 billion transistors on a shrunken node that most literature and reference materials refer to as a 4nm process...


I personally don't care what it is as long as it's fast and there's one in my machine.


----------



## BigMack70

ZealotKi11er said:


> Any case that is good for these 4090 to run them on air? Want to build a second rig and keep my water cooled O11 XL with 6900 XT until 7900 XT and WC that.


The O11 XL is just generally excellent. Really comfy running this card on air in mine. I assume any case of that quality tier will have no issues.


----------



## N19htmare666

ZealotKi11er said:


> Its 5nm not 4nm.


5nm, ha... Have you got a tape measure out and measured it? From what I'm seeing, it's f..ing massive.


----------



## Arizor

J7SC said:


> ...proving you right ? Good grief.
> At best, it seems somewhat ambiguous, at worst, it was a waste of time. At the end of he day, it was just about cooling over 70 billion transistors in a shrunken node most literature and reference materials refer to as a 4 nm process...


Exactly, who cares folks, let's move on! 

Really torn on watercooling this. I'm _very_ suspicious we're going to see the Ti within 6 months, at which point I'll want to sell my 4090 and get the Ti for futureproofing, and we all know watercooling can complicate the selling.


----------



## Mad Pistol

I STILL have people telling me that a 5800x3D will bottleneck an RTX 4090.

I mean... really?!?!?!

I just played a 128-player Battlefield 2042 match, and holy hell was it enjoyable! Framerates are stratospheric if you turn down the details enough, and even if you leave resolution at 4K with details maxed + DLSS Quality, I'm still bumping up against the artificial 116 FPS limit I set for my LG CX OLED (4K120).

Have people lost their minds? 5800x3D + RTX 4090 is effing epic!!!



Arizor said:


> Really torn on watercooling this. I'm _very_ suspicious we're going to see the Ti within 6 months, at which point I'll want to sell my 4090 and get the Ti for futureproofing, and we all know watercooling can complicate the selling.


It's painfully obvious that Nvidia locked voltage on the RTX 4090 to keep it from performing too well. These giant coolers are completely overbuilt for the cards they sit on. The only viable conclusion here is that Nvidia is keeping the binned GPU dies that can take 1.15 or 1.2 volts, and allowing them to be clocked higher for the RTX 4090 Ti.

The 4090 is artificially locked; water cooling is, IMO, a waste for this GPU.


----------



## Kaltenbrunner

So is there next to no stock of these available? How bad is the scalping, and what do miners think?

I can't wait to see what AMD has for competition. I haven't even looked at the regular 4090 gaming benchmarks yet; if the 4090 really is way ahead, I wonder what AMD will price their stuff at.


----------



## Xavier233

On the Giga 4090 OC, does anyone know the difference between Quiet and Performance mode - is it the max watts, fan speeds, or something else?


----------



## Arizor

Xavier233 said:


> On the Giga 4090 OC, anyone knows whats the diff between Quiet and performance mode - Is it how many watts max, fan speeds or else?


Fan speed, and in some, max watts.


----------



## BigMack70

Mad Pistol said:


> It's painfully obvious that Nvidia locked voltage on the RTX 4090 to keep it from performing too good. These giant coolers are completely overbuilt for the cards they sit on. The only viable conclusion here is that Nvidia is keeping the binned GPU dies that can take 1.15 or 1.2 volts, and then allowing them to be clocked higher for the RTX 4090 Ti.


Nah. 









NVIDIA GeForce RTX 4090 - Where the misconception about 600 watts really comes from and why the cards are so huge | igor'sLAB


Yesterday we were allowed to publish the first own pictures of NVIDIA's upcoming graphics card generation, but we still have to keep quiet about further details up to the performance until the set…




www.igorslab.de


----------



## mirkendargen

Mad Pistol said:


> It's painfully obvious that Nvidia locked voltage on the RTX 4090 to keep it from performing too good. These giant coolers are completely overbuilt for the cards they sit on. The only viable conclusion here is that Nvidia is keeping the binned GPU dies that can take 1.15 or 1.2 volts, and then allowing them to be clocked higher for the RTX 4090 Ti.
> 
> The 4090 is artificially locked; water cooling is, IMO, a waste for this GPU.


1.2v would actually be the 800-900w card everyone was scared of, for 5% performance gains on top of 10% for the full die. I don't think they're going to up the voltage any (unless AMD comes out of left field with a killer and it's the only way for Nvidia to retain the performance crown).

I honestly think the supply diverted to compute card rumors make more sense. Selling compute cards to China before the ban kicks in might be the new "mining", especially when they can sell them for 5x or more what they sell gaming cards for. Yeah Hopper is a slightly different die so that isn't swappable, but throwing together a full die Quadro card and selling it for $6k+ could be done "overnight".


----------



## BigMack70

OK, I gotta say... DLSS 3 is the real deal in the scenarios where you can take advantage of it. Running A Plague Tale Requiem at 5120x1440 at 180-200fps is... impressive.


----------



## drstancpa

bottjeremy said:


> My entry level Gigabyte Windforce 4090 on a 5900x is doing quite nice. No coil whine, card is quiet, overclocks like a champ. Been very stable at +220 +1500 106% power target. 1.1 volts. 3045mhz clock speed in games. Very happy with the 4090.
> 
> 
> 
> 
> 
> 
> Also, have a few of the top spots for 5900x and 4090 on 3Dmark with this card.


Does the Windforce support a 133% power limit, up to 600W? Glad to hear this card is performing admirably!


----------



## BigMack70

drstancpa said:


> Does the Windforce support 133% power limit, to 600W? Glad to hear this card is performing admirably!


No. 106%.
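Worth remembering when comparing these answers: the power slider is a percentage of each BIOS's default limit, so the same percentage means different watts on different cards. A trivial converter (the 450W default below is the reference-spec figure; individual BIOSes vary):

```python
# Convert a power-limit slider percentage into watts, given the BIOS
# default limit. 450 W is the reference 4090 default (an assumption for
# any specific card - check your own BIOS default).
def slider_to_watts(default_w, percent):
    return default_w * percent / 100

print(slider_to_watts(450, 106))  # 477.0
print(slider_to_watts(450, 133))  # 598.5, i.e. ~600 W
```

So a 106%-capped card with a 450W default tops out around ~477W, versus ~600W (the single-connector ceiling) for the 133% cards.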


----------



## drstancpa

Carillo said:


> Hey. I got the Gigabyte Windforce model yesterday here in Norway, only card i was able to purchase. All gone in seconds. I know the Gaming OC is 600 watt, but not sure about the windforce. I will receive the card probably next week, since the nearest retailer is 7 hours away  Anyone else got the Windforce ?


Did you ever find out if the Windforce can go to 600W? Hope you got your card!


----------



## drstancpa

BigMack70 said:


> No. 106%.


Thanks!


----------



## drstancpa

Vasoka said:


> Got gigabyte gaming 4090, can go all the way to +280 on core in all benchmarks (furmark / 3rd mark etc), but sporadically crashes in gaming + shows big artifacts in AC: Valhalla maxed out unless it's brought down to +220. Wonder how normal that is? Benchmarks don't show artifacts, have ran a 8h+ loop of Timespy extreme with no issues. Kinda weird.


Was this on the regular Gaming, or the Gaming OC?


----------



## drstancpa

bmagnien said:


> MSI’s lower end doesn’t have 600w bios out of the box. While flashing bios is almost certainly going to be possible in the future, it’s currently not and technically no guarantee that it will. Just in case that matters. Gigabytes 2nd to lowest and Asus’s lowest both do support 600w. Not sure about the wind force


By second to lowest, do you mean the non-OC 4090 Gaming from Gigabyte? I'm trying to confirm that card will in fact allow +33% to 600W.


----------



## Hanks552

jcde7ago said:


> They're referencing the actual hardware in the cards; MSI Suprim line has more physical VRM phases than any other 4090 available right now.
> 
> Chart here, credit to u/kirkle8
> View attachment 2577775


But is more better? Just for overclocking? I'm planning to run it under water, but there is no MSI waterblock yet. I don't want to do a shunt mod anymore; a BIOS flash and water cooling should be enough for me.


----------



## Sayenah

Hello,

I am somewhat new to overclocking and trying my hand at manual overclocking with GPU Tweak 3. I have a Strix 4090 mated to a 12900K with DDR5-6400 C32.

I can pass the Speed Way 20-loop stress test at 3075MHz and a +1725 offset on the memory. Seems decent, right? No artifacts, and Port Royal does fine. The funny thing is, if I move the needle to +5 on the GPU OR +5 on the memory, Port Royal doesn't even start. I'd expect mid-test crashes or artifacts, but not an abrupt "drop-off". Weird.

Also, why is my Port Royal score only 27.4k? I should be able to crack 28k, no?









Thank you.


----------



## pajonk

Hello,
my TUF OC 4090 is crashing with a black screen and Windows freeze at +1000MHz memory clock.
My suspicion is that the thermal pads are not making proper contact with every single RAM module.
Does anyone know how to copper-mod the TUF?
I don't know how thick the copper and the thermal pads have to be.
Don't want to disassemble my €2250 card at the moment...
Thanks


----------



## Arizor

pajonk said:


> Hello,
> my TUF OC 4090 is crashing with a black screen and Windows freeze at +1000MHz memory clock.
> My suspicion is that the thermal pads are not making proper contact with every single RAM module.
> Does anyone know how to copper-mod the TUF?
> I don't know how thick the copper and the thermal pads have to be.
> Don't want to disassemble my €2250 card at the moment...
> Thanks


You can see a dude taking his apart here; it looks like 2 or 3 mm from my eyeballing, but some of the more astute members here might have a better guess for you.


----------



## mirkendargen

pajonk said:


> Hello,
> my TUF OC 4090 is crashing with a black screen and Windows freeze at +1000 MHz memory clock.
> My suspicion is that the thermal pads are not making proper contact with every single RAM module.
> Does anyone know how to copper-mod the TUF?
> I don't know how thick the copper and the thermal pads have to be.
> I don't want to disassemble my €2,250 card at the moment...
> Thanks


In HWiNFO, is the memory temperature actually high? It's an easy theory to check before taking the card apart.


----------



## pajonk

The temp is OK, but the temp sensor is not measuring every single RAM module. Maybe it's reading a module with good contact while another module that isn't in full contact is causing the problems.


----------



## mirkendargen

pajonk said:


> The temp is OK, but the temp sensor is not measuring every single RAM module. Maybe it's reading a module with good contact while another module that isn't in full contact is causing the problems.


I recall from the 3090s, when this became a thing, that the sensor output is the hottest module. I could be remembering wrong, or it may have changed, though.


----------



## Alberto_It

RaMsiTo said:


> Resolve 93% Power Limit Problem with Afterburner, MSI Tool Summary
> 
> View attachment 2577681
> 
> 
> 
> suprim x before, gaming bios
> 
> 
> View attachment 2577695
> 
> 
> 
> 
> after
> 
> View attachment 2577696


Can you please explain the differences between the two BIOS versions to me? The default power limit is 480 W either way.


----------



## J7SC

Arizor said:


> You can see a dude taking his apart here, looks like 2 or 3mm from me eyeballing, but some of the more astute members here might have a better guess for you.


Interesting to watch someone else running Port Royal and confirming that in the low 60s C, there's a bin drop (15 MHz)...I know it well . It could also be triggered by hotspot, but in any case, I look forward to water-cooling my 4090. The loop is already done and waiting, and I see that Alphacool is talking about a block for my Giga-G-OC (in addition to the one announced by EK). I am still also hoping for Bykski and Phanteks. Most of those also will offer TUF blocks, afaik.


----------



## RaMsiTo

Alberto_It said:


> Can you please explain the differences between the two BIOS versions to me? The default power limit is 480 W either way.


It's a bug in Afterburner: it defaults to 93% instead of 100%. Nothing really important.


----------



## pajonk

J7SC said:


> Interesting to watch someone else running Port Royal and confirming that in the low 60s C, there's a bin drop (15 MHz)...I know it well . It could also be triggered by hotspot, but in any case, I look forward to water-cooling my 4090. The loop is already done and waiting, and I see that Alphacool is talking about a block for my Giga-G-OC (in addition to the one announced by EK). I am still also hoping for Bykski and Phanteks. Most of those also will offer TUF blocks, afaik.


Yes, a 15 MHz drop is really, really a big deal; no game will run smooth any longer.


----------



## Toony90

jcde7ago said:


> I'm seeing maybe 3-5c higher memory temps on my own Suprim Liquid X than a lot of the better air coolers out there...completely inconsequential given how high memory temps are rated to be fine at. Like, not even worth worrying about at all (this is nothing like how it was with the poor design of the initial 3090 FEs and mem temps skyrocketing past 90c).
> 
> I still don't understand why people are going crazy over the 600w Suprim Liquid X bios though; I have the 530w bios and I find there to be 0 incentive to flash to the 600w one. We're talking maybe 3-5% gains to draw another 150w over the default max.
> 
> It was already posted in this thread but the play with these 4090s is to undervolt to a comfortable, stable voltage and get essentially the same as stock performance while almost halving the power draw. My Suprim Liquid X is running at 2760mhz w/ a +1000 mem OC, completely stable at 0.950mv (max default factory OC clock was 2820mhz). Winter isn't here yet in Cali and my card "struggles" to get past 57c @100% load no matter what I throw at it.


The point is that everyone should have a choice; no one is forcing anyone to use 600 W. A few people got the card with the 600 W BIOS, and the others, with the 530 W one, have to flash a BIOS, which in theory voids the warranty.

I also have the Suprim Liquid X, so the temps are fine, but from what I've seen the Waterforce runs its memory at 3 GHz at 51-52 °C, the air-cooled Gigabyte Gaming at 61-62 °C, etc. The Suprim Liquid is a good card, but in my opinion it is too big and doesn't stand out from the competition. Unfortunately, I was unable to buy the Waterforce or the Nvidia FE that I hunted.
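As a rough sanity check on the undervolting numbers quoted above: dynamic power scales roughly with frequency times voltage squared (P ≈ C·f·V²), so dropping from stock voltage to 0.950 V at a slightly lower clock cuts power substantially. A minimal sketch; the stock wattage and voltage figures below are illustrative assumptions, not measurements of any particular card:

```python
def estimate_power(p_stock_w, f_stock_mhz, v_stock, f_uv_mhz, v_uv):
    """Estimate undervolted board power from the CMOS dynamic-power
    approximation P ~ f * V^2 (ignores static leakage, so it is optimistic)."""
    scale = (f_uv_mhz / f_stock_mhz) * (v_uv / v_stock) ** 2
    return p_stock_w * scale

# Illustrative numbers: assume ~450 W at 2820 MHz / 1.05 V stock,
# undervolted to 2760 MHz at 0.950 V.
print(round(estimate_power(450, 2820, 1.05, 2760, 0.950)))  # → 361
```

So even this crude model predicts a drop of roughly 90 W for a ~2% clock loss, which is in the same ballpark as what people here are reporting.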


----------



## AvengedRobix

pajonk said:


> Hello,
> my TUF OC 4090 is crashing with black screen and Windows Freeze at +1000Mhz Memory Clock.
> My suggestion is that the Thermalpads are not correctly in contact with every single ram module.
> Does anyone knows how to coppermod the Tuf?
> I don't know how thick the copper and ther thermalpads has to be.
> Dont want to disamble my 2250 € card at the moment...
> Thanks


It could simply be a bad memory bin, not related to temp or contact.


----------



## J7SC

pajonk said:


> Yes, a 15 MHz drop is really, really a big deal; no game will run smooth any longer.


...at full power, I typically go through 3x15 MHz bins. In any case, I watercool every CPU and GPU anyway...the disappearance of bin speed drops is a pleasant side effect, as is the sound of silence  - and with this 4090, I'll be able to actually fit it in my dual-mobo single-case setup once the behemoth air cooler comes off
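The temperature binning being described can be modeled as a simple step function: NVIDIA boost steps the clock down one 15 MHz bin each time the core crosses a temperature threshold. A quick sketch; the threshold values below are illustrative placeholders, since the real boost table isn't public:

```python
def binned_clock(base_mhz, temp_c, thresholds=(45, 52, 60), step_mhz=15):
    """Return the effective boost clock after temperature binning.

    Each threshold crossed costs one `step_mhz` bin. The threshold
    temperatures here are guesses for illustration only.
    """
    bins_lost = sum(1 for t in thresholds if temp_c >= t)
    return base_mhz - bins_lost * step_mhz

print(binned_clock(3105, 40))  # cool card, full clock → 3105
print(binned_clock(3105, 63))  # three bins lost → 3060
```

This is why water cooling makes the drops "disappear": the core simply never crosses the later thresholds during a run.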


----------



## Arizor

J7SC said:


> ...at full power, I typically go through 3x15 MHz bins. In any case, I watercool every CPU and GPU anyway...the disappearance of bin speed drops is a pleasant side effect, as is the sound of silence  - and with this 4090, I'll be able to actually fit it in my dual-mobo single-case setup once the behemoth air cooler comes off


Dare I ask what happens to all your old waterblocked cards? Do you find it easier to sell them on in Canada (bit tricky here in Oz)? Or do you have a dungeon of old waterblocked cards hidden from the missus?


----------



## J7SC

Arizor said:


> Dare I ask what happens to all your old waterblocked cards? Do you find it easier to sell them on in Canada (bit tricky here in Oz)? Or do you have a dungeon of old waterblocked cards hidden from the missus?


...I have never sold any used PC parts, waterblocked or otherwise; the 'old soldiers' just march on and get used in a software firm dev environment.


----------



## Arizor

J7SC said:


> ...I have never sold any used PC parts, waterblocked or otherwise; the 'old soldiers' just march on and get used in a software firm dev environment.


Pics? I want to see this waterblock 'terracotta' army!


----------



## Alberto_It

RaMsiTo said:


> a bug in afterburner, default 93% instead of 100%, nothing really important.


I'm not sure; my power limit slider is at 100%, but the minimum power draw, I think, is 10 W instead of 150 W 🤔


----------



## J7SC

Arizor said:


> Pics? I want to see this waterblock 'terracotta' army!


No pics from the office...check your cell phones at the door


----------



## Arizor

Dammit!


----------



## Carillo

drstancpa said:


> Did you ever find out if the Windforce can go to 600W? Hope you got your card!


It's 480 watt max. That card is returned. Ended up with the Strix


----------



## Carillo

yzonker said:


> That score is up from the GN stream. +1500 mem isn't bad, although he was using a heater to keep it warmer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 30 289 in Port Royal
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 31866 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


That score should not be valid, in my opinion anyway. He suddenly gained 1,000 points, going from 29K to 30K, in his last run without changing core or memory clocks. The scenes were not rendering properly: just a lot of artifacts and extremely high FPS in some scenes. The memory seemed to be in a cold-bug state.


----------



## Arizor

Temps after running maxed-out PL, voltage, etc. at game-stable settings for a while. I'm happy with the core and memory temps, but the hotspot temperature is a bit high. What are we thinking, folks, in terms of maximum allowable temps?

I can easily run it undervolted at 0.95 V for about 4-5% less performance (though still above stock perf).


----------



## EEE-RAY

So I thought I blew up my card. I was running Heaven at +1750 MHz mem, so I thought what the hell and set it to +200 MHz.

Catastrophic failure. The screen went rainbow and hard-locked on the desktop. I had to power off manually at the wall.

After I restarted, every time I even tried to set it to +1750 MHz mem (my old setting), it hard-locked immediately on the desktop. Why would it suddenly become so unstable at my old clock if there wasn't some sort of permanent damage?

Anyway, after a few hard-lock/restart cycles, +1750 is suddenly OK again. What is the reason for this behaviour? This "deterioration" and recovery?

Edit: also, there are 55,000 events of the same nvlddmkm error in Event Viewer, lol.
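For what it's worth, those nvlddmkm entries pile up one per display-driver reset (TDR), so counting them in a log exported from Event Viewer is a quick way to gauge how often the card actually fell over. A minimal sketch; the sample log lines below are made up for illustration:

```python
def count_driver_resets(log_text, marker="nvlddmkm"):
    """Count lines mentioning the NVIDIA display driver (nvlddmkm),
    which logs one event per TDR / driver reset."""
    return sum(1 for line in log_text.splitlines() if marker in line)

sample = """\
Error    nvlddmkm    Timeout detection and recovery
Information    Service Control Manager    started
Error    nvlddmkm    Timeout detection and recovery
"""
print(count_driver_resets(sample))  # → 2
```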


----------



## jootn2kx

Mad Pistol said:


> I STILL have people telling me that a 5800x3D will bottleneck an RTX 4090.
> 
> I mean... really?!?!?!
> 
> I just played a 128-player Battlefield 2042 match, and holy hell was it enjoyable! Framerates are stratospheric if you turn down the details enough, and even if you leave resolution at 4K with details maxed + DLSS Quality, I'm still bumping up against the artificial 116 FPS limit I set for my LG CX OLED (4K120).
> 
> Have people lost their minds? 5800x3D + RTX 4090 is effing epic!!!


Yes, sadly there are huge CPU bottlenecks in A Plague Tale: Requiem at 4K with my 5800X3D + 3080 Ti, with DLSS on Quality or Balanced. In the worst parts GPU usage dropped to the 60s/70s, especially in crowded city sections and parts with a lot of rats.


----------



## Alberto_It

@RaMsiTo you have got a private message


----------



## J7SC

Arizor said:


> View attachment 2577884
> 
> 
> Temps after running maxxed out PL, Voltage etc. in game stable settings for a while. I'm happy with the core temp and memory, but hot spot temperature is a bit high. What are we thinking folks in terms of maximum temps allowable?
> 
> I can easily run it undervolted at 0.95v for about 4-5% less performance (though still above stock perf).


I would try to keep the hotspot under 90 C. Check the GPU-Z BIOS tab for the allowable temp range (it may say something like 68 C to 88 C, though that may be the general temp). Below is the Giga-G-OC from a Port Royal run with less crazy ambient than before; temps look to be similar, with the exception of VRAM, which generally seems to be the case in YT tests of this card. The starting clock was 3105.


----------



## Blameless

Mad Pistol said:


> I STILL have people telling me that a 5800x3D will bottleneck an RTX 4090.
> 
> I mean... really?!?!?!
> 
> I just played a 128-player Battlefield 2042 match, and holy hell was it enjoyable! Framerates are stratospheric if you turn down the details enough, and even if you leave resolution at 4K with details maxed + DLSS Quality, I'm still bumping up against the artificial 116 FPS limit I set for my LG CX OLED (4K120).
> 
> Have people lost their minds? 5800x3D + RTX 4090 is effing epic!!!


It can be 'epic' and bottlenecked at the same time.

If you uncap your frame rate entirely, does the GPU ever fall below ~98% utilization? If it does, chances are the CPU is a bottleneck at the settings you're using. If not, then any bottleneck present is probably pretty small...but could still potentially be evident in peak 1% frame times.
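The check described here is easy to automate: log GPU utilization with the frame cap off and see how often it dips. A minimal sketch; the 98% cutoff, the 10% dip fraction, and the sample numbers are arbitrary choices for illustration, and on a real system you'd feed it samples from something like `nvidia-smi --query-gpu=utilization.gpu --format=csv`:

```python
def cpu_bottlenecked(util_samples, threshold=98.0, dip_fraction=0.10):
    """Flag a likely CPU bottleneck if GPU utilization falls below
    `threshold` percent in more than `dip_fraction` of the samples.
    Assumes the frame-rate cap was disabled while sampling."""
    dips = sum(1 for u in util_samples if u < threshold)
    return dips / len(util_samples) > dip_fraction

print(cpu_bottlenecked([99, 98, 99, 99, 99, 99]))  # pegged GPU → False
print(cpu_bottlenecked([99, 72, 65, 99, 80, 99]))  # frequent dips → True
```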



pajonk said:


> The temp is OK, but the temp sensor is not measuring every single RAM module.


It should be polling on die sensors on every GDDR6X IC and reporting the highest temp.



mirkendargen said:


> I recall from the 3090s, when this became a thing, that the sensor output is the hottest module. I could be remembering wrong, or it may have changed, though.


Would be a completely useless reading if it was anything other than the highest one.


----------



## atilcan06

Anyone had any info on GALAX RTX 4090 Serious Gaming?


----------



## yzonker

Interesting, they did toss out that 30.2k run from the live stream. 









3DMark Port Royal Hall of Fame


The 3DMark.com Overclocking Hall of Fame is the official home of 3DMark world record scores.




www.3dmark.com


----------



## Arizor

J7SC said:


> I would try to keep the hotspot under 90 C. Check GPUz bios tab for allowable temp range (it may say s.th. like 68 C to 88 C, though that may be general temp). Below is the Giga-G-OC from a Port Royal run with less crazy ambient than before here, temps look to be similar with the exception of VRAM which generally seems to be the case in YT tests of this card. Starting clock was 3105


Yep, that's my plan - GPU-Z says 88 C max, but as you say, I think that's the GPU core, which for me maxed around 77.

ASUS are very proud that their capacitors are rated for 20,000 hours at 105 C, so hopefully keeping it below 90 gives us some longevity. Then again, I'm just hanging around until the Ti comes along so I can put it under water.


----------



## Zogge

Mad Pistol said:


> I STILL have people telling me that a 5800x3D will bottleneck an RTX 4090.
> 
> I mean... really?!?!?!
> 
> I just played a 128-player Battlefield 2042 match, and holy hell was it enjoyable! Framerates are stratospheric if you turn down the details enough, and even if you leave resolution at 4K with details maxed + DLSS Quality, I'm still bumping up against the artificial 116 FPS limit I set for my LG CX OLED (4K120).
> 
> Have people lost their minds? 5800x3D + RTX 4090 is effing epic!!!
> 
> 
> 
> It's painfully obvious that Nvidia locked voltage on the RTX 4090 to keep it from performing too good. These giant coolers are completely overbuilt for the cards they sit on. The only viable conclusion here is that Nvidia is keeping the binned GPU dies that can take 1.15 or 1.2 volts, and then allowing them to be clocked higher for the RTX 4090 Ti.
> 
> The 4090 is artificially locked; water cooling is, IMO, a waste for this GPU.


Same here with my 5950X PBO OC and 4090. 116 FPS stable in BF2042, both in raster mode ultra and DLSS Quality ultra, on my CX 48".


----------



## zware62

Mad Pistol said:


> And an overclock...
> +210/+1500 core/mem
> Voltage maxed, Fan at 90%
> 
> Not bad for a crappy little (well... big) Windforce 4090.
> 
> View attachment 2577297





Mad Pistol said:


> Be careful. The Windforce has a much lower VRM setup than the STRIX, so you might risk destroying the card.


I have just flashed the Windforce with the Gaming OC BIOS over the stock Silent BIOS... it might be measurement error, but I got about 1,000 points more in 3DMark Time Spy with +210 clock and +1500 memory, compared with the same setup on the original OC BIOS. GPU-Z still can't read the BIOS the way it did before... I have a 5800X3D and 3200 DDR4.

The only issue was Nvidia Surround resetting (I use 3x1440 monitors), as Windows 10 behaved like I had changed the GPU!


----------



## AvengedRobix

Found this on Reddit... be careful not to bend the cable too much 😉


----------



## JedixJarf

The guy had barely any bend at all.















AvengedRobix said:


> Found this on Reddit... be careful not to bend the cable too much 😉
> View attachment 2577919
> 
> View attachment 2577918


----------



## Xavier233

My cable is way more bent downwards than the pic....


----------



## Carillo

AvengedRobix said:


> Found this on Reddit... be careful not to bend the cable too much 😉
> View attachment 2577919
> 
> View attachment 2577918


To me that looks like a plug that wasn't properly inserted into the connector... that doesn't just happen with a proper connection.


----------



## bottjeremy

drstancpa said:


> Does the Windforce support 133% power limit, to 600W? Glad to hear this card is performing admirably!


106%. It's honestly good enough for the 4090, since I can be over 3 GHz effective clock, but if you are trying for a top spot in 3DMark then this is not the card for you. I do have three top spots with the 5900X and 4090, though.
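For reference, the power-limit slider is just a percentage of the board-power default, so the watt ceilings people keep quoting work out like this. The 450 W base below is the reference 4090 spec; individual cards' defaults vary (the Windforce has been reported at 480 W max), so treat the numbers as illustrative:

```python
def max_board_power(default_w, limit_pct):
    """Convert a power-limit slider percentage into a watt ceiling."""
    return default_w * limit_pct / 100

print(max_board_power(450, 106))  # Windforce-style cap → 477.0 W
print(max_board_power(450, 133))  # a 133% slider → 598.5 W, i.e. ~600 W
```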


----------



## Xavier233

Carillo said:


> To me that looks like a plug that wasn't properly inserted into the connector... that doesn't just happen with a proper connection.


----------



## bmagnien

J7SC said:


> I see that Alphacool is talking about a block for my Giga-G-OC (in addition to the one announced by EK). I am still also hoping for Bykski and Phanteks. Most of those also will offer TUF blocks, afaik.


where’d you see the alphacool gigaoc blocks?


----------



## bmagnien

Unofficial 3DMark response on the artifacting scores:


----------



## KedarWolf

I'm on my way to pick up my ASUS Strix OC RTX 4090!!!


----------



## R1fast

Was finally able to score a 4090 FE through Best Buy via a 'local store' notification on the Fixit Discord (one cancelled order sitting at my local BB store). Also FYI: I found that OCUK has several models of the EK Quantum Vector2 4090 blocks cheaper than the EK store (e.g. a 4090 AIB full-coverage block for £250, plus tax and ~£36 shipping via DHL). Just hoping my order doesn't get killed by customs/tax.

PS heya @KedarWolf - good to see ya (big fan of your BIOS mods!)


----------



## dreckschmeck

AvengedRobix said:


> Found this in reddit.. attention to not bend too much the cable 😉
> View attachment 2577919
> 
> View attachment 2577918


Good thing I saw that. My cables were bent exactly to the right side as shown in his build pic. I've now changed them to go more upwards, using the CPU AIO tubes as a cable rest 
You could also refer to this CableMod disclaimer on how to bend the cables correctly:


----------



## 8472

dreckschmeck said:


> Good thing I saw that. My cables were bent exactly to the right side as shown in his build pic. I've now changed them to go more upwards, using the CPU AIO tubes as a cable rest
> You could also refer to this CableMod disclaimer on how to bend the cables correctly:
> View attachment 2577964


A 35 mm straight run is going to be unrealistic for most people with horizontally mounted 4090s. That'd mean 175 mm or more would be needed.

I don't understand why the AIBs couldn't do what CableMod is doing and include a 90-degree adapter.


----------



## BigMack70

8472 said:


> 35mm is going to be unrealistic for most people with horizontally mounted 4090s. That'd mean 175mm or more would be needed.
> 
> I don't understand why the AIBs couldn't do what cablemod is doing and include a 90 degree adapter.


Yup, I just don't have room for that kind of clearance in my O11 XL. I just tried to be gentle with it, and hopefully it won't melt on me. I also wonder whether these things are melting under normal 400-450 W loads or only when pushed to 600 W.


----------



## Sheyster

Has anyone been able to run an MSI Trio with a 4-pin adapter or cable, along with a 600w ASUS or GB BIOS flashed?


----------



## mirkendargen

8472 said:


> 35mm is going to be unrealistic for most people with horizontally mounted 4090s. That'd mean 175mm or more would be needed.
> 
> I don't understand why the AIBs couldn't do what cablemod is doing and include a 90 degree adapter.


Yeah it boggles my mind that zero AIBs thought of this and just put the connector on the end of the card instead of the top (although plenty of people have length issues too I guess...). It seems like Nvidia might have mandated something about the adapters since they're all Nvidia-provided I think.


----------



## drstancpa

bottjeremy said:


> 106%. It's honestly good enough for the 4090, since I can be over 3 GHz effective clock, but if you are trying for a top spot in 3DMark then this is not the card for you. I do have three top spots with the 5900X and 4090, though.


Nice! Thanks.


----------



## bmagnien

mirkendargen said:


> although plenty of people have length issues too I guess...


speak for yourself


----------



## Xavier233

Well, if something happens to the card or connector within 30 days, it's going back to the store. If it's safe for 30 days and you don't re-adjust or bend the cable from how it currently sits, I don't think you will have issues. I bent mine heavily to form a 90-degree angle, but I also did it as far back from the plastic connector as I could, and so far no issues. Otherwise, I wouldn't even be able to close the case door.


----------



## J7SC

bmagnien said:


> where’d you see the alphacool gigaoc blocks?


Reference is here (per Alphacool forum). I understand that there are still supply-chain issues in China (re. firm delivery dates)


----------



## mirkendargen

bmagnien said:


> speak for yourself


Some of us are packing a Strix, some of us are packing an Inno3d. Don't let the ladies find out


----------



## AcidWeb

Both TUF and Strix got VBIOS update.
"Update VBIOS to improve compatibility"


----------



## J7SC

Arizor said:


> Pics? I want to see this waterblock 'terracotta' army!


...'terracotta army' boot camp ... to be clear, this is for fun only and limited to a light 3D load (see temps; ambient was 20 C), but I let it run for a while, per GPU-Z below. It hasn't crashed yet, so I might try to go higher. Also, it didn't melt any power connectors...


----------



## Xavier233

General question on the 4090s: I noticed that the GPU temp reported in many tools (MSI Afterburner, for example) is actually the GPU temp itself, not the GPU hotspot. I also noticed that there is about a 10 C difference between the two: if the GPU is at 70 C, the hotspot is at 80 C.

My question is, when Nvidia advertises the max GPU temp as 90 C, does that refer to the GPU temp or the hotspot temp? If it's the GPU temp, that means hotspot temps of 100 C.


----------



## Xavier233

Second question: what GPU temps would you feel comfortable running at daily, max? I'm personally going with a 90 C hotspot temp, which means an 80 C GPU temp.


----------



## jomama22

So this is testing a Suprim Liquid X; no shunting or LN2 or anything:









I won't be uploading because it's the same thing GN had happen (and I'm assuming a whole lot of the other scores on the HOF).

Basically, the HOF should not be used as a reference.

Once hwbot figures out what it wants to do with 4090 submissions is when I would start considering actually using these types of leaderboards as a comparison chart.


----------



## mirkendargen

jomama22 said:


> So this is testing a Suprim Liquid X; no shunting or LN2 or anything:
> View attachment 2577989
> 
> 
> I won't be uploading because it's the same thing GN had happen (and I'm assuming a whole lot of the other scores on the HOF).
> 
> Basically, the HOF should not be used as a reference.
> 
> Once hwbot figures out what it wants to do with 4090 submissions is when I would start considering actually using these types of leaderboards as a comparison chart.


It's still slightly useful because you can see what kinds of clocks people are able to get, and just ignore the inflated scores.


----------



## jomama22

mirkendargen said:


> It's still slightly useful because you can see what kinds of clocks people are able to get, and just ignore the inflated scores.


Clocks will possibly be inflated because of it too, so it's something to watch out for. Anyone getting 29,000+ is more or less guaranteed to be submitting a borked score, but I have also had high 28,000s with the same issue.

Shunt and volt mods are really the only way to get 29,000+ at the moment.


----------



## mirkendargen

jomama22 said:


> Clocks will possibly be inflated because of it too, so it's something to watch out for. Anyone getting 29,000+ is more or less guaranteed to be submitting a borked score, but I have also had high 28,000s with the same issue.
> 
> Shunt and volt mods are really the only way to get 29,000+ at the moment.


I'd correct this to say any 29k+ room-temperature score is probably a borked score. The LN2 scores (and honestly water scores too, if waterblocks were more readily available; someone putting an AIO rad in an ice-water bucket could probably also put up monster numbers) are legit above 29k... but not above 30k.


----------



## N19htmare666

Zogge said:


> Same here with my 5950X PBO OC and 4090. 116 FPS stable in BF2042, both in raster mode ultra and DLSS Quality ultra, on my CX 48".


This is exactly the same setup (5950X PBO OC) I will have when I get mine. Please can you do me a huge favour and run Superposition 4K and post a screenshot of the results? I'm very interested to see the delta against the 5800X3D and Intel scores posted before.


----------



## jim2point0

R1fast said:


> Was finally able to score a 4090 FE through BestBuy via a 'local store' notification on the Fixit Discord (1 cancelled order sitting at local BB store).


I wish I'd known that was a thing. I found that Discord and saw that my local Best Buy had an FE last Tuesday that I could have snagged.

When you see that notification pop up, do you call them to confirm they have it? Do you just show up and ask around? Or can you order it online and go pick it up?


----------



## J7SC

mirkendargen said:


> It's still slightly useful because you can see what kinds of clocks people are able to get, and just ignore the inflated scores.





mirkendargen said:


> I'd correct this to say any 29k+ room temperature score is probably a borked score. The LN2 (and honestly water if waterblocks were more readily available. Someone putting an AIO rad in an icewater bucket could probably also put up monster numbers) scores are legit above 29k.....but not above 30k.


...I would add that the best way to check and compare scores on 3DMark is to look at the comparison data 3DM provides on how a run stacks up against similar systems (i.e. 13900K vs 3600X, DDR5 vs DDR4) on the left screen with the arc display (or by doing an advanced search comparison); otherwise it is like the 24 Hours of Le Mans and its different classes: a modified GT car vs a factory prototype whooshing by. In addition to checking the 'class of system' on 3DM for a given bench, I also always check the indicated temps; it speaks volumes whether someone is on air, water, a chiller, or full sub-zero.

...speaking of temps, I still haven't quite wrapped my head around the most efficient VRAM clock on my card. It can reach +1600 artifact-free with a light OC on the core, but at full core power and PL, the most efficient VRAM speed drops to between +1418 and +1500. I suspect it relates at least in part to hotspot temps (the efficient-VRAM drop gets worse the more the hotspot rises above 70 C); water-cooling at ambient will at least help to provide some clues. The 4090 also has 72 MB of cache (vs 6 MB on the 3090); surely that might have something to do with it as well at really high internal temps.


----------



## mirkendargen

J7SC said:


> ...I would add that the best way to check and compare scores at 3DMark is to look at the comp data 3DM provides re. how it stacks up to similar systems (ie. 13900K vs 3600X, DDR5 vs DDR4) on the left screen with the arc display (or by doing and advanced search comp), otherwise it is like the 24 hours of Le Mans and its different classes --- a modified GT car vs a factory prototype whooshing by. In addition to checking the 'class of system' at 3DM at a given bench, I also always check the indicated temps as well - it speaks volumes whether someone is on air, water, chiller, or full sub-zero.
> 
> ...speaking of temps, I still haven't quite wrapped my head around the most efficient VRAM with my card. It can reach 1600+ artifact-free with a light OC on the core, but at full core power and PL, most efficient VRAM speed drops to between 1418 and 1500. I suspect it relates at least in part to hotspot temps (the efficient-VRAM drop gets worse the more hotspot rises above 70 C)...water-cooling at ambient will at least help to provide some clues. The 4090 also has 72 MB of cache (vs 6 MB of the 3090); surely that might have s.th. to do with it as well at really high internal temps.


I've given up on trying to figure out what the limits are on air. It's too hard to dial in something that's both stable at 25C at the start of a run, and 60C+ at the end of a run. I'll try harder when I have a block.


----------



## GQNerd

J7SC said:


> ...I would add that the best way to check and compare scores at 3DMark is to look at the comp data 3DM provides re. how it stacks up to similar systems (ie. 13900K vs 3600X, DDR5 vs DDR4) on the left screen with the arc display (or by doing and advanced search comp), otherwise it is like the 24 hours of Le Mans and its different classes --- a modified GT car vs a factory prototype whooshing by. In addition to checking the 'class of system' at 3DM at a given bench, I also always check the indicated temps as well - it speaks volumes whether someone is on air, water, chiller, or full sub-zero.


Not sure how 3DMark will address this (hopefully with a one-time purging of the leaderboard), but there are a TON of scores on the HOF that are not valid... and the only way you can see they are not valid is by adding them to "Compare".

Check out the screenshot. I kept their names out, but it's easy to see who's doing fishy stuff / glitching high scores:


----------



## z390e

Just a heads-up for those who saw the /r/Nvidia post about the melted cable...

The user who had that issue plays Path of Exile. We saw 30-series cards go over 600 W with unlocked BIOSes in Path of Exile with global illumination on.

I see all the YT streamers are making videos over that one Reddit post. I'm almost 100% sure this user just played the one game that is almost as hard on GPUs as New World was, except PoE's behaviour is by design.

@Falkentyne first pointed out Path of Exile's huge power draw a while back, in the 30-series KPE thread I think. It's the entire reason I don't put the 1000 W 3080 Ti BIOS on my card: because I play PoE.

Here is his post about it

https://www.overclock.net/threads/official-nvidia-rtx-3090-owners-club.1753930/post-28708810

And you can even see people in that thread saying TDP is being surpassed, quote:

*My 390 watt BIOS flashed card touched at least 405 watts with the global illumination on. It's the only game I've played, Cyberpunk 2077 included, that made it meaningfully overshoot the max TDP.*

We saw people posting about this on the PoE forums as well, even on AMD cards.

https://www.pathofexile.com/forum/view-thread/3272678

I am 100% certain that the user with the melted cable plays Path of Exile. His last comments in his comment history on Reddit prove it.


----------



## N19htmare666

z390e said:


> Just a heads' up for those who saw the /r/Nvidia post about melted cable...
> 
> The user who had that issue plays Path of Exile, we saw cards in the 3 series go over 600w with unlocked BIOS on Path of Exile with global illumination on.
> 
> I see all the YT streamers are making videos over that one reddit post. I'm almost 100% sure this user just played the one game that is almost as bad as what New World did to GPU's, but POE's code is by design.
> 
> @Falkentyne first pointed out the path of exile huge power draw a while back in the 3 series KPE thread I think. Its the entire reason I dont put the 1000w 3080ti BIOS on my card, because I play POE.
> 
> Here is his post about it
> 
> https://www.overclock.net/threads/official-nvidia-rtx-3090-owners-club.1753930/post-28708810
> 
> And you can even see people in that thread saying TDP is being surpassed quote:
> 
> *My 390 watt BIOS flashed card touched at least 405 watts with the global illumination on. It's the only game I've played, Cyperpunk 2077 included, that made it meaningfully overshoot the max TDP. *
> 
> We saw people posting about this on the POE forums as well even on AMD cards.
> 
> https://www.pathofexile.com/forum/view-thread/3272678


Even if it's just one game, if it could happen to everyone and it's not an isolated incident then it's a major problem. 

I'm surprised that it could not be replicated using the Kombustor 5200MB donut test... I wonder what other games it could affect.


----------



## Panchovix

It seems that it also happened to a TUF 4090 user, while playing Black Desert Online


https://www.reddit.com/r/nvidia/comments/yc6g3u/_/itmjt9d










Also to a Gigabyte OC user, not much info though


https://www.reddit.com/r/nvidia/comments/yc6g3u/_/itmneun


----------



## z390e

N19htmare666 said:


> Even if it's just one game, if it could happen to everyone and it's not an isolated incident then it's a major problem.
> 
> I'm surprised that it could not be replicated using kombustor donut 5200mb... I wonder what other games it could effect.


POE is F2P. Go into Path of Exile with your 4090 and turn global illumination on in the graphics settings yourself, but be ready for a black screen like Falkentyne got on his 3 series card. I imagine an unlocked BIOS on a 4090 running POE would be the new champion for power draw in a video game.

It's been this way since 2012, I think. Good luck getting them to "fix" it. I don't know any other games with this issue except Quake RTX.


----------



## KedarWolf

My Strix OC is a really good sample.

Got the below on air. Look at the clocks it'll run, zero artifacts.

I scored 28,640 in Port Royal (AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090, 32 GB RAM, 64-bit Windows 11): www.3dmark.com


----------



## KedarWolf

Both BIOSes on my Strix were the 450W ones. I had to flash the 600W one.


----------



## Xavier233

KedarWolf said:


> My Strix OC is a really good Sample.
> 
> Got the below on air. Look at the clocks it'll run, zero artifacts.
> 
> 
> I scored 28,640 in Port Royal (AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090, 32 GB RAM, 64-bit Windows 11): www.3dmark.com
> 
> 
> 
> 
> 
> View attachment 2578024
> 
> 
> View attachment 2578025


What was your max recorded GPU temp, and at what max power draw?


----------



## BigMack70

z390e said:


> Just a heads' up for those who saw the /r/Nvidia post about melted cable...
> 
> The user who had that issue plays Path of Exile, we saw cards in the 3 series go over 600w with unlocked BIOS on Path of Exile with global illumination on.
> 
> I see all the YT streamers are making videos over that one reddit post. I'm almost 100% sure this user just played the one game that is almost as bad as what New World did to GPU's, but POE's code is by design.
> 
> @Falkentyne first pointed out the path of exile huge power draw a while back in the 3 series KPE thread I think. Its the entire reason I dont put the 1000w 3080ti BIOS on my card, because I play POE.
> 
> Here is his post about it
> 
> https://www.overclock.net/threads/official-nvidia-rtx-3090-owners-club.1753930/post-28708810
> 
> And you can even see people in that thread saying TDP is being surpassed quote:
> 
> *My 390 watt BIOS flashed card touched at least 405 watts with the global illumination on. It's the only game I've played, Cyperpunk 2077 included, that made it meaningfully overshoot the max TDP. *
> 
> We saw people posting about this on the POE forums as well even on AMD cards.
> 
> https://www.pathofexile.com/forum/view-thread/3272678
> 
> 
> 
> *I refuse to have an account on reddit or youtube, so I am posting it here, to educate people.* I am 100% certain that the user with the melted cable plays path of exile. His last comments in his comment history on reddit prove it.


I'm currently boycotting Path of Exile because they removed ultrawide monitor support in the most recent patch, but even before that, whenever I played that game I always lowered my power limiter to below my card's stock power rating for these very reasons. It's a power virus of a game that will fry GPUs.


----------



## yzonker

z390e said:


> Just a heads' up for those who saw the /r/Nvidia post about melted cable...
> 
> The user who had that issue plays Path of Exile, we saw cards in the 3 series go over 600w with unlocked BIOS on Path of Exile with global illumination on.
> 
> I see all the YT streamers are making videos over that one reddit post. I'm almost 100% sure this user just played the one game that is almost as bad as what New World did to GPU's, but POE's code is by design.
> 
> @Falkentyne first pointed out the path of exile huge power draw a while back in the 3 series KPE thread I think. Its the entire reason I dont put the 1000w 3080ti BIOS on my card, because I play POE.
> 
> Here is his post about it
> 
> https://www.overclock.net/threads/official-nvidia-rtx-3090-owners-club.1753930/post-28708810
> 
> And you can even see people in that thread saying TDP is being surpassed quote:
> 
> *My 390 watt BIOS flashed card touched at least 405 watts with the global illumination on. It's the only game I've played, Cyperpunk 2077 included, that made it meaningfully overshoot the max TDP. *
> 
> We saw people posting about this on the POE forums as well even on AMD cards.
> 
> https://www.pathofexile.com/forum/view-thread/3272678
> 
> 
> 
> *I refuse to have an account on reddit or youtube, so I am posting it here, to educate people.* I am 100% certain that the user with the melted cable plays path of exile. His last comments in his comment history on reddit prove it.


My TUF will only go a little over 500w with that game. I posted a screenshot a few days ago. Interested to know if any other models actually go higher.


----------



## jomama22

z390e said:


> Just a heads' up for those who saw the /r/Nvidia post about melted cable...
> 
> The user who had that issue plays Path of Exile, we saw cards in the 3 series go over 600w with unlocked BIOS on Path of Exile with global illumination on.
> 
> I see all the YT streamers are making videos over that one reddit post. I'm almost 100% sure this user just played the one game that is almost as bad as what New World did to GPU's, but POE's code is by design.
> 
> @Falkentyne first pointed out the path of exile huge power draw a while back in the 3 series KPE thread I think. Its the entire reason I dont put the 1000w 3080ti BIOS on my card, because I play POE.
> 
> Here is his post about it
> 
> https://www.overclock.net/threads/official-nvidia-rtx-3090-owners-club.1753930/post-28708810
> 
> And you can even see people in that thread saying TDP is being surpassed quote:
> 
> *My 390 watt BIOS flashed card touched at least 405 watts with the global illumination on. It's the only game I've played, Cyperpunk 2077 included, that made it meaningfully overshoot the max TDP. *
> 
> We saw people posting about this on the POE forums as well even on AMD cards.
> 
> https://www.pathofexile.com/forum/view-thread/3272678
> 
> 
> 
> *I refuse to have an account on reddit or youtube, so I am posting it here, to educate people.* I am 100% certain that the user with the melted cable plays path of exile. His last comments in his comment history on reddit prove it.


I mean, unless the card is shunt-modded (which these clearly aren't), there is no excuse for it. The plug is rated at 600W; the power limiters should keep it there, or at the very least only allow incredibly short bursts over it. Tolerances are much lower because of the pin size.

Going to this single connector was dumb, plain and simple.

The fact that users even have to worry about how much the cable bends near the connector is very short-sighted.
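To put the tolerance point in numbers, here's a back-of-the-envelope per-pin current sketch. The 12 V rail and six current-carrying pin figures are the commonly cited 12VHPWR layout, not official spec text, and even load sharing is an idealization:

```python
# Back-of-the-envelope per-pin current for the 12VHPWR connector.
# Assumptions (commonly cited figures, not official spec text):
# 12 V rail, 6 current-carrying 12V pins, load shared evenly.
RAIL_VOLTS = 12.0
POWER_PINS = 6

def amps_per_pin(watts: float) -> float:
    """Total current split evenly across the power pins."""
    return watts / RAIL_VOLTS / POWER_PINS

print(round(amps_per_pin(600), 2))  # 8.33 A/pin at the 600 W rating
print(round(amps_per_pin(660), 2))  # 9.17 A/pin at a 10% overshoot
```

If one pin seats badly, the remaining pins carry more than this, which is why a brief excursion over 600W leaves little margin on pins this small.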


----------



## mirkendargen

z390e said:


> Just a heads' up for those who saw the /r/Nvidia post about melted cable...
> 
> The user who had that issue plays Path of Exile, we saw cards in the 3 series go over 600w with unlocked BIOS on Path of Exile with global illumination on.
> 
> I see all the YT streamers are making videos over that one reddit post. I'm almost 100% sure this user just played the one game that is almost as bad as what New World did to GPU's, but POE's code is by design.
> 
> @Falkentyne first pointed out the path of exile huge power draw a while back in the 3 series KPE thread I think. Its the entire reason I dont put the 1000w 3080ti BIOS on my card, because I play POE.
> 
> Here is his post about it
> 
> https://www.overclock.net/threads/official-nvidia-rtx-3090-owners-club.1753930/post-28708810
> 
> And you can even see people in that thread saying TDP is being surpassed quote:
> 
> *My 390 watt BIOS flashed card touched at least 405 watts with the global illumination on. It's the only game I've played, Cyperpunk 2077 included, that made it meaningfully overshoot the max TDP. *
> 
> We saw people posting about this on the POE forums as well even on AMD cards.
> 
> https://www.pathofexile.com/forum/view-thread/3272678
> 
> 
> 
> *I refuse to have an account on reddit or youtube, so I am posting it here, to educate people.* I am 100% certain that the user with the melted cable plays path of exile. His last comments in his comment history on reddit prove it.


I mean, even if POE is bananas and pulls 10% above the BIOS power limit, 10% above the BIOS power limit is too close for comfort when we're talking about retail BIOSes, not some XOC not-for-the-public stuff. Some percentage of adapters will be crappy, and some percentage of users will manage to stick it in wrong (giggity). This might be, like, the only time I think an extra connection is a good idea for people with space issues, along with getting a 90-degree adapter. Once I can get a block on, I'll probably vertical-mount mine so it won't be an issue.

The problem isn't the connector, it's the size of the cards ramming it up against the side of cases.


----------



## z390e

I don't think POE has any limits, which is the issue in general, and my theory (unproven, just a theory) is that this could have contributed to that user's issues.


----------



## Arizor

Yep first thing it made me think of was the Amazon game fiasco.


----------



## Nizzen

jomama22 said:


> So this is testing a supreme x liquid, no shunting or ln2 or anything:
> View attachment 2577989
> 
> 
> I won't be uploading because it's the same thing GN had happen (and I'm assuming a whole lot of the other scores on the HOF).
> 
> Basically, the HOF should not be used as a reference.
> 
> Once hwbot figures out what it wants to do with 4090 submissions is when I would start considering actually using these types of leaderboards as a comparison chart.


I have a bugged 31k Port Royal score LOL.


----------



## GRABibus

ASUS Strix OC ordered today


----------



## GRABibus

Nizzen said:


> I have 31k port royal bugged score LOL.


Link ?


----------



## Xavier233

Xavier233 said:


> General question on the 4090s: I noticed that the GPU temps reported in many tools (MSI Afterburner, for example) are actually the GPU temp itself, not the GPU hotspot. I also noticed that there is about a 10C difference between the two: if it's 70C on the GPU, it's 80C on the hotspot.
> 
> My question is, when Nvidia advertises the max GPU temp as 90C, does that refer to the GPU temp or the hotspot temp? If it's the GPU temp, it means the hotspot temps are 100C


Anyone?


----------



## Arizor

Xavier233 said:


> Anyone?


No idea sadly, had this convo a few pages back. Everything absolutely pushed to max my hot spot hits 92C, which concerns me slightly (I'm running an undervolt now). I suspect GPU core max is 90C, so hot spot would be near 100C, but honestly don't know.


----------



## LunaP

Baasha said:


> Were you able to get the FE or Strix yet? I'm still waiting...


Nope, been checking daily, religiously even lol, the wait sucks. Part of me feels like the FE might be a short-lived thing with BB, but I'm trying to stay positive. It's been 2 weeks now and still nothing...


----------



## N19htmare666

mirkendargen said:


> I mean, even if POE is bananas and uses 10% above the BIOS power limit, 10% above the BIOS power limit is too close to comfort when we're talking about retail BIOSes, not some XOC not-for-the-public stuff. Some percentage of adapters will be crappy, some percentage of users will manage to stick it in wrong (giggity). This might be like....the only time I think an extra connection is a good idea for people with space issues and to get a 90deg adapter. Once I can get a block on I'll probably vertical mount mine so it won't be an issue.
> 
> The problem isn't the connector, it's the size of the cards ramming it up against the side of cases.


Don't forget the power draw from the PCIe slot (75W) is in addition to the 600W from the rated cable, so anything under 675W should be safe. The Neptune is rated at 630W out of the box.
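As a quick sanity check on that budget (using the 75W slot allowance and 600W cable rating quoted above as given, not measured values):

```python
# Quick sanity check on the total board power budget.
# Assumptions: 75 W PCIe slot allowance and 600 W 12VHPWR cable
# rating, both as quoted above; not measured values.
SLOT_W = 75
CABLE_W = 600

ceiling = SLOT_W + CABLE_W
print(ceiling)         # theoretical delivery ceiling in watts
print(630 <= ceiling)  # a 630 W stock limit fits under it
```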


----------



## jim2point0

jim2point0 said:


> I wish I knew that was a thing. I found that discord and saw that my local best buy had an FE last Tuesday I could have snagged.
> 
> When you see that notification pop up, do you call them to confirm they have it? Do you just show up and ask around? Or can you order it online and go pick it up?


Replying to myself because I actually just managed to pick one up following the exact same method, lol. 

I joined this discord: Join the Fixitfixitfixit Drops & Tech Discord Server!

You can get roles for USA drops, which I think is amazon, newegg, bestbuy, etc. And also bestbuy local roles.










I signed up for Best Buy notifications for my state, which ping you when there's a canceled order sitting on the shelves at customer service. Within a few hours of joining, I got a notification for my local Best Buy. So I drove down and talked to the guy standing by the door as soon as you walk in. He looked it up in their system and said they had 1 GPU. He went over and grabbed it and I paid for it. Also chatted up another guy who was picking up his online order from the stock drop last Wednesday. I told him about the discord since his friends are also trying to find them.

Anyways. Woo.


----------



## Xavier233

Arizor said:


> No idea sadly, had this convo a few pages back. Everything absolutely pushed to max my hot spot hits 92C, which concerns me slightly (I'm running an undervolt now). I suspect GPU core max is 90C, so hot spot would be near 100C, but honestly don't know.


Is it running stable with the undervolt? How much undervolt did you do?


----------



## LunaP

jim2point0 said:


> Replying to myself because I actually just managed to pick one up following the exact same method, lol.
> 
> I joined this discord: Join the Fixitfixitfixit Drops & Tech Discord Server!
> 
> You can get roles for USA drops, which I think is amazon, newegg, bestbuy, etc. And also bestbuy local roles.
> 
> View attachment 2578040
> 
> 
> I signed up for bestbuy notifications for my state, which ping you for when there's a canceled order sitting on the shelves at customer service. Not a few hours after joining, I got a notification for my local best buy. So I drove down and talked to the guy standing by the door as soon as you walk in. He looked it up on their system and said that they had 1 GPU. He went over and grabbed it and I paid for it. Also chatted up another guy who was picking up his online order from the stock drop last Wednesday. I told him about the discord since his friends are also trying to find them.
> 
> Anyways. Woo.
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2578043



This is actually positive news, so thanks for this. I've been on the discord, but I'm starting to feel Phoenix isn't such a popular place anymore lol. They never had stock on day 1, and you'd think at least 1 BB did, but this gives me hope at least. So thank you for this. +1


----------



## alasdairvfr

N19htmare666 said:


> This is exactly the same setup (5950x Pbo oc) I will have when I get mine. Please can do me a huge favour and run superposition 4k and post a screenshot of the results. I'm very interested to see the Delta with the 58003d and Intel scores posted before.


So not me you were asking but I have the 5950x on PBO, here is my score:









EDIT: just did another run changing pretty much nothing and got better results











Not super impressive compared to some though. Any suggestions? Does this look like I'm CPU constrained?


----------



## 8472

jim2point0 said:


> Replying to myself because I actually just managed to pick one up following the exact same method, lol.
> 
> I joined this discord: Join the Fixitfixitfixit Drops & Tech Discord Server!
> 
> You can get roles for USA drops, which I think is amazon, newegg, bestbuy, etc. And also bestbuy local roles.
> 
> View attachment 2578040
> 
> 
> I signed up for bestbuy notifications for my state, which ping you for when there's a canceled order sitting on the shelves at customer service. Not a few hours after joining, I got a notification for my local best buy. So I drove down and talked to the guy standing by the door as soon as you walk in. He looked it up on their system and said that they had 1 GPU. He went over and grabbed it and I paid for it. Also chatted up another guy who was picking up his online order from the stock drop last Wednesday. I told him about the discord since his friends are also trying to find them.
> 
> Anyways. Woo.
> 
> View attachment 2578043


One thing to note is that your discord account must be at least 30 days old to join. I tried to join it, but was rejected due to my account being created today.


----------



## Arizor

Xavier233 said:


> Is it running stable with the undervolt? How much undervolt did you do?


Runs rock solid.


----------



## N19htmare666

Clukos said:


> Likely CPU limit, this is what I got with a 5800X3D and I think even this is CPU limited
> View attachment 2577723





alasdairvfr said:


> So not me you were asking but I have the 5950x on PBO, here is my score:
> View attachment 2578049
> 
> 
> Not super impressive compared to some though. Any suggestions? This looks like I'm CPU constrained?


You're a star, thank you. So it looks like a 14% (10% after the second run) gain by switching to a 5800X3D. It's giving me food for thought: a nice boost without a new motherboard or RAM. Was that with an OC on the 4090?


yzonker has the superposition tips




yzonker said:


> Gotta know the tricks.
> 
> 
> View attachment 2577745


----------



## Baasha

That Discord is asking for a phone number to be able to see the channels, post etc. (aka verified)? huh? Did you actually provide some rando with your personal info??



jim2point0 said:


> Replying to myself because I actually just managed to pick one up following the exact same method, lol.
> 
> I joined this discord: Join the Fixitfixitfixit Drops & Tech Discord Server!
> 
> You can get roles for USA drops, which I think is amazon, newegg, bestbuy, etc. And also bestbuy local roles.
> 
> View attachment 2578040
> 
> 
> I signed up for bestbuy notifications for my state, which ping you for when there's a canceled order sitting on the shelves at customer service. Not a few hours after joining, I got a notification for my local best buy. So I drove down and talked to the guy standing by the door as soon as you walk in. He looked it up on their system and said that they had 1 GPU. He went over and grabbed it and I paid for it. Also chatted up another guy who was picking up his online order from the stock drop last Wednesday. I told him about the discord since his friends are also trying to find them.
> 
> Anyways. Woo.
> 
> View attachment 2578043


----------



## jomama22

Baasha said:


> That Discord is asking for a phone number to be able to see the channels, post etc. (aka verified)? huh? Did you actually provide some rando with your personal info??


It's because those discords breed scammers and bots. Requiring a phone number at least helps cut that down.


----------



## jomama22

LunaP said:


> This is actually positive news so thanks for this I've been on the discord but starting to feel Phoenix isn't such a popular place anymore lol They never had stock on day 1 and u'd think at least 1 BB did but this gives me hope at least. So thank you for this. +1


Stock is going to get better rapidly. Scalpers are starting to stop buying cards as there just isn't a margin for it (around me, marketplace has many listed under $2K already).


----------



## yzonker

N19htmare666 said:


> You're a star, thank you. So it looks like 14% gain by switching to a 5800x3d. It's giving me food for thought. A nice boost without a new mb or ram. Was that with a oc on the 4090?
> 
> 
> yzonker has the superposition tips


No way you'll gain 14% going from a 5950x to a 5800x3d. That benchmark isn't very CPU limited. It can make a difference, but nothing like 14%. With my 3090, my 5800x3D actually scored worse than my 5800x in Superposition. Not a 4090 but it scored WORSE probably due to the lower clock speed. I didn't find a single benchmark that the 5800x3D scored better in. Much faster in games but not in less CPU intensive graphics benchmarks. 

We'll see for sure pretty soon. I just popped a 13900k in my machine about an hour ago. I'll run Superposition again when I get the OC figured out on the new CPU. 

I'm not sure why you're scoring low though. What core/mem offsets did you run? Did you have everything else closed out? G-sync off, etc...?


----------



## changboy

I don't see DLSS 3.0 in the settings of any of my games. Do I need to do something to activate it, or is there something I missed?


----------



## Arizor

changboy said:


> I dont see any of my game have dlss 3.0 in the setting, do i need do something to activate this or something i missed ?


What games? Only Plague Tale and Spiderman have it enabled currently. If it's missing from those games, enable Hardware-accelerated GPU Scheduling in Windows graphics settings.


----------



## BigMack70

changboy said:


> I dont see any of my game have dlss 3.0 in the setting, do i need do something to activate this or something i missed ?


Only a couple games have DLSS Frame Generation currently. To use it, you do need to have "Hardware-accelerated GPU scheduling" enabled in Windows graphics settings.


----------



## N19htmare666

yzonker said:


> No way you'll gain 14% going from a 5950x to a 5800x3d. That benchmark isn't very CPU limited. It can make a difference, but nothing like 14%. With my 3090, my 5800x3D actually scored worse than my 5800x in Superposition. Not a 4090 but it scored WORSE probably due to the lower clock speed. I didn't find a single benchmark that the 5800x3D scored better in. Much faster in games but not in less CPU intensive graphics benchmarks.
> 
> We'll see for sure pretty soon. I just popped a 13900k in my machine about an hour ago. I'll run Superposition again when I get the OC figured out on the new CPU.
> 
> I'm not sure why you're scoring low though. What core/mem offsets did you run? Did you have everything else closed out? G-sync off, etc...?


Thank you. It's making me wonder whether I should hold off on the CPU upgrade for now. I did see the lows in those screenshots go from 120 to 200fps from the 5950X (in both screenshots) to the 5800X3D. With lows being much more important, it's still making me think. Did you see significant gains in the lows when switching to the 5800X3D, even though the average didn't improve?


----------



## changboy

OK, because I read that many games support it, but it seems it's not there yet. I don't have VRR or G-Sync with my OLED C8.

I haven't tried Plague Tale yet lol.


----------



## alasdairvfr

N19htmare666 said:


> You're a star, thank you. So it looks like 14%(10% after the second run) gain by switching to a 5800x3d. It's giving me food for thought. A nice boost without a new mb or ram. Was that with a oc on the 4090?
> 
> 
> yzonker has the superposition tips


I know the X3D does have some single-core gains, but I'm not sure if something else is at play.

I did this run with core + 255 & memory +1250 so that got about 3060-3090 core clock and 2950-ish memory which seems pretty good. Bumping core higher the bench fails.
Card is a Gigabyte Gaming OC



yzonker said:


> No way you'll gain 14% going from a 5950x to a 5800x3d. That benchmark isn't very CPU limited. It can make a difference, but nothing like 14%. With my 3090, my 5800x3D actually scored worse than my 5800x in Superposition. Not a 4090 but it scored WORSE probably due to the lower clock speed. I didn't find a single benchmark that the 5800x3D scored better in. Much faster in games but not in less CPU intensive graphics benchmarks.
> 
> We'll see for sure pretty soon. I just popped a 13900k in my machine about an hour ago. I'll run Superposition again when I get the OC figured out on the new CPU.
> 
> I'm not sure why you're scoring low though. What core/mem offsets did you run? Did you have everything else closed out? G-sync off, etc...?


I did update my BIOS today since I was on the C6H beta BIOS that introduced Ryzen 5000 support to the board. Updating it actually fixed some performance issues I was having, adding a few thousand points to the score.

My offsets are in the -15 to -25 range depending on the core. My PPT/TDC/EDC are 225/150/190, and playing with them doesn't seem to help much. It could be an issue with the board.
My single-core speed has never been great on this CPU/board, and while multicore is okay, I feel Superposition is hitting a lightly-threaded bottleneck somewhere.

Gsync is off for this.

It could be other services running in the background, maybe; I'm not sure how scheduling priority is handled with this bench. Cinebench I set to high or realtime to mitigate that issue.


----------



## changboy

I just tried Plague Tale. OMG, I had like 60 fps, then I activated frame generation and jumped to 180 fps. It's crazy.


----------



## jim2point0

Baasha said:


> That Discord is asking for a phone number to be able to see the channels, post etc. (aka verified)? huh? Did you actually provide some rando with your personal info??


What? I didn't have to give my phone number.

This is what I see when I join.









After clicking that button, I see this:









After clicking that checkbox, I can see the channels. At no point did I enter my phone number anywhere. If I had to do that, I wouldn't have joined.


----------



## Nico67

Arizor said:


> View attachment 2577884
> 
> 
> Temps after running maxxed out PL, Voltage etc. in game stable settings for a while. I'm happy with the core temp and memory, but hot spot temperature is a bit high. What are we thinking folks in terms of maximum temps allowable?
> 
> I can easily run it undervolted at 0.95v for about 4-5% less performance (though still above stock perf).


GPU rail power is a little concerning. I believe this is where card quality will start to show itself; power excursions could start to become a problem at high core clocks, 1.1V+, and max PL. Probably also evidenced by the higher hotspot temp. Watercooling should definitely help tame this a bit though, keeping the VRMs cooler.



8472 said:


> 35mm is going to be unrealistic for most people with horizontally mounted 4090s. That'd mean 175mm or more would be needed.
> 
> I don't understand why the AIBs couldn't do what cablemod is doing and include a 90 degree adapter.


I want to see cables with 90-degree 16-pin connectors on them; surely that's got to be better for connector and bend-radius issues. Too many adapters cause connector losses and potential issues: 4x 8-pin into a 16-pin adapter into a right-angle adapter is not good. One right-angle 16-pin to 16-pin PSU cable at 16AWG is going to be the safest and cleanest, and I am quite surprised by the lack of support prior to launch, given the ATX 3.0 spec is a few years old now.


----------



## tooandrew

hey everyone. i have a ZOTAC RTX4090 AMP EXTREME AIRO and the ****ing power limit stops at 110% with all 4 8-pins connected. the voltage slider is disabled too. i paid an extra $100 for a ****ing gimped card. anything selling for $100 over msrp should have at least all the features the founders edition has. stay far away.

because nvidia's embargo for aib cards didn't lift until 9am launch day - you know, when microcenter was opening? i was forced to take a guess and hope i didn't get ****ed. i bought one of the ones over msrp because i thought it would be crazy if the card that costs $100 more was worse than the base model. but nope. they ****ed me right in the ass without dinner or anything.

has anyone ****ed around with flashing vbios on these yet?


----------



## tooandrew

jim2point0 said:


> What? I didn't have to give my phone number.
> 
> This is what I see when I join.
> View attachment 2578056
> 
> 
> After clicking that button, I see this:
> View attachment 2578057
> 
> 
> After clicking that checkbox, I can see the channels. At no point did I enter my phone number anywhere. If I had to do that, I wouldn't have joined.


that's discord asking for his phone number, not the server, so it can classify him as a verified user. the server is set to verified users only. you already linked your phone number to your discord account so it didn't give you a hard time


----------



## Arizor

Nico67 said:


> GPU Rails Powers is a little concerning, I believe this is where the card quality will start to show itself, power excursions could start to become a problem at high core clk, 1.1v+ and max PL. Probably also evidenced by the higher hot spot temp. Watercooling should definately help tame this a bit though, keeping vrm's cooler.
> 
> 
> 
> I want to see cables with 90 degree 16pin connectors on them, surely that's got to be better for connectors and bend radius issues. To many adapters causes connector losses and potential issues, 4 x 8pins in 16pin adaptor into right angle adapter is not good. One 16pin right angle to 16pin psu at 16awg is going to be the safest and cleanest, and I am quite surprised by the lack of supportprior to launch, given this ATX3.0 spec is a few years old now.


Just to add a contrast and update, here's my temps with my .95 undervolt after a few hours of GPU load. "Max" column arrowed.


----------



## BTK

I have an MSI Gaming Trio 4090. The adapter is only 3x 8-pin, and I was able to fill out a form on the Seasonic site for a free 16-pin cable for my PSU (GX-1000W). Is it OK to use this cable? It's rated 600W, even though the adapter for the GPU is only 450W.


----------



## jomama22

tooandrew said:


> hey everyone. i have a ZOTAC RTX4090 AMP EXTREME AIRO and the ****ing power limit stops at 110 with all 4 8pins connected. the voltage slider is disabled too. i paid an extra $100 for a ****ing gimped card. anything selling for $100 over msrp should have at least all the features the founders edition has. stay far away.
> 
> because nvidia's embargo for aib cards didnt lift until 9am launch day - you know, when microcenter was opening? i was forced to take a guess and hope i didnt get ****ed. i bought one of the ones over msrp because i thought, it would be crazy if the card that costs $100 more was worse than the base model. but nope. they ****ed me right in the ass without dinner or anything.
> 
> has anyone ****ed around with flashing vbios on these yet?


Yes, it is easy enough with the newest nvflash and done the same way as it has been for ages now.


----------



## LuckyImperial

BTK said:


> I have an msi gaming trio 4090 the adapter is only 3x8 pin and I was able to fill out a form on the seasonic site for a free 16 pin cable for my PSU (GX-1000w) is it ok to use this cable it’s 600w even though the adapter for the gpu is only 450w


Yeah, that's fine. The graphics card will now have access to more power if you ever choose to overclock. You can now move the power slider beyond 100%.


----------



## BTK

LuckyImperial said:


> Yeah, that's fine. The graphics card will now have access to more power if you ever choose to overclock. You can now move the power slider beyond 100%.


Is the bios locked to 450w though


----------



## alitayyab

tooandrew said:


> hey everyone. i have a ZOTAC RTX4090 AMP EXTREME AIRO and the ****ing power limit stops at 110 with all 4 8pins connected. the voltage slider is disabled too. i paid an extra $100 for a ****ing gimped card. anything selling for $100 over msrp should have at least all the features the founders edition has. stay far away.
> 
> because nvidia's embargo for aib cards didnt lift until 9am launch day - you know, when microcenter was opening? i was forced to take a guess and hope i didnt get ****ed. i bought one of the ones over msrp because i thought, it would be crazy if the card that costs $100 more was worse than the base model. but nope. they ****ed me right in the ass without dinner or anything.
> 
> has anyone ****ed around with flashing vbios on these yet?


With the latest FireStorm utility you can raise the voltage to 1.1V from 1.05V.
As for the max power limit, I don't think going to 600W adds much, since from what I understand the card is mostly voltage-bound.


----------



## WilliamLeGod

Anyone have data on the performance gain from a 10900K to a 13900K using a 4090 at 4K max settings in games? Thanks!


----------



## morph.

Woot Cablemod to the rescue, finally got rid of that unsightly adapter. Now just to wait for the 90 deg adapter and GPU water block!


----------



## schoolofmonkey

If anyone is interested in the Thermaltake Toughpower GF3 1200w 12VHPWR cable looks like:


----------



## LuckyImperial

BTK said:


> Is the bios locked to 450w though


No. If you use an adapter with 4 plugs, or a dedicated 12VHPWR cable, the card will have access to more than 450W.


----------



## LukeOverHere

Clearly i have issues …… i thought, oh, a 4090 will be a kool upgrade, so i bought one…. Then @zhrooms suggested selling my EVGA T2 1600W PSU and buying a compatible PSU for the 4090 to save using dodgy cables…. I thought… hmm thats a great idea and wont cost anything as the EVGA T2 Sale will cover the cost…… next minute…. 13900K + Kingston 6000Mhz DDR5, Noctua Mounting Kit + Asus Prime Z790 Board checked out …. Its been a very cheap upgrade, thank you all for your motivational speeches


----------



## gooface

So regarding the cards eating more than 600W and ruining the connectors: if I use only 3 of the 4 plugs on my card (Giga OC) when it comes tomorrow, will that keep it from getting anywhere close to the 600W+ the cards are pulling?


----------



## LuckyImperial

gooface said:


> so regarding the cards eating more than 600w and ruining the connectors, if I use only 3 out of the 4 plugs on my card (giga OC) when it comes tomorrow, will that limit it from not getting even close to the 600+ w current the cards are pulling?



Yes, or you could plug in all 4 and not touch the power slider. Both should have similar results.


----------



## dante`afk

damage control


----------



## J7SC

dante`afk said:


> damage control
> 
> View attachment 2578098


...did you manage to see the pre-censored post ? I wonder if it was just descriptive, or not...


----------



## mirkendargen

LukeOverHere said:


> Clearly i have issues …… i thought, oh, a 4090 will be a kool upgrade, so i bought one…. Then @zhrooms suggested selling my EVGA T2 1600W PSU and buying a compatible PSU for the 4090 to save using dodgy cables…. I thought… hmm thats a great idea and wont cost anything as the EVGA T2 Sale will cover the cost…… next minute…. 13900K + Kingston 6000Mhz DDR5, Noctua Mounting Kit + Asus Prime Z790 Board checked out …. Its been a very cheap upgrade, thank you all for your motivational speeches


That T2 1600 was probably higher quality than whatever you replaced it with


----------



## heptilion

Should I be concerned? This on HAF700 Evo


----------



## Paynal

J7SC said:


> ...did you manage to see the pre-censored post ? I wonder if it was just descriptive, or not...


Lurker since the Northwood era, first time poster.

I grabbed it via reveddit.



> While playing Overwatch 2 the card stopped working, restarted the pc and had no leds on the card and screen artifacting. Gigabyte OC, luckily mine didn’t catch fire


There were two photos attached.


----------



## morph.

heptilion said:


> Should I be concerned? This on HAF700 Evo
> View attachment 2578105


FYI, that bend is inside the recommended 3-3.5 cm of straight cable you're supposed to leave before any bend.


----------



## mirkendargen

heptilion said:


> Should I be concerned? This on HAF700 Evo
> View attachment 2578105


It looks like it's torqueing the actual connector so I'd say yes.


----------



## heptilion

morph. said:


> That bending exceeds the recommended parameters of 3-3.5cm before bending fyi.


Phew, hopefully this will hold till I can get a Corsair cable. No place to buy in Australia


----------



## Arizor

Just to calm nerves slightly, this is a solid quick video on the (probable) cause, which is bending very close to the pins. So just make sure not to aggressively bend and twist the cable as you’re installing and it _should_ be ok.


----------



## LunaP

jomama22 said:


> Stock is going to get better rapidly. Scalpers are starting to stop buying cards as there just isn't a margin for it (around me, marketplace has many listed under $2K already).


Yeah, I'm staying in the BB channel till it pops up. One did, but it was gone as I clicked the link, and it was 15 miles away (MSI) anyway, so it's giving me hope. What's 72 hours of no sleep anyway, I'm a gamer!


----------



## LukeOverHere

mirkendargen said:


> That T2 1600 was probably higher quality than whatever you replaced it with


Yes, I have to admit the EVGA T2 is an absolute beast of a product even though it is aging... quality-wise my new PSU is likely a slight downgrade. That said, I have never had a PSU fail. At the same time, I wanted a new ATX 3.0 / PCIe 5.0 compatible PSU, so I replaced it with a Thermaltake GF3 1350W.


----------



## J7SC

Paynal said:


> Lurker since the Northwood era, first time poster.
> 
> I grabbed it via reveddit.
> 
> 
> 
> There were two photos attached.
> 
> View attachment 2578106
> View attachment 2578107


Thanks !


----------



## tooandrew

alitayyab said:


> With the latest firestorm utility you can up the voltage to 1.1V from 1.05.
> For max power limit, i dont think going to 600 Watt adds much as from what i understand the card is mostly voltage bound.





https://www.zotac.com/us/firestorm


the one from here? because the one from here isnt letting me change voltage either.

edit: i found the right one. nevermind


----------



## EastCoast

heptilion said:


> Should I be concerned? This on HAF700 Evo
> View attachment 2578105


Yes. There is a reason why Intel, who helped design this, doesn't use it.


----------



## sebastiankhoa

WilliamLeGod said:


> Any1 has data of performance gain from 10900k to 13900k using 4090 at 4k max settings games? Thanks!


Here's a 4K in-game benchmark comparison going from an i9-10850K to a 5800X3D, and the performance gain is substantial.

Game: *Horizon Zero Dawn, max settings, 4K.* First the i9-10850K:

And the 5800X3D:

Almost a 30 FPS gain.

Shadow of the Tomb Raider, i9:

R7:

A 12 FPS gain, and you can see the i9 is GPU-bound only 46% of the time versus 99% on the AMD. 3DMark Time Spy:


----------



## J7SC

I am not too thrilled about this new power connector either. I suppose that if they had gone with a traditional 4x 8-pin PCIe layout, the PCBs would have been wider by at least an inch; besides, it is clear they wanted to move the power delivery as close to the actual die as possible.

Still, I noticed before that on my 3090 with 3x 8-pin PCIe and a 520W vBIOS, each of the PCIe cables (the stock ones that come with the Seasonic Prime Platinum PX-1300) would get warm during heavy GPU loads (not hot, but warm). At the same time, with a full waterblock, thermal putty and all that, the 3090's VRM didn't exceed the mid-30s C (HWInfo has sensors for that on the 3090 Strix).

With these 4-into-1 dongles for the 4090, it is clear that all the PCIe supply cables are bunching together into a single connector with a much smaller total area than 4x 8-pin PCIe would have had... so clearly less area for heat dissipation while transferring more energy. Then there is this whole thing about "do not bend" within 3 cm to 3.5 cm of the connector end. I understand why, but for a consumer product that goes into people's homes, it is worth questioning these release decisions...

Hopefully, most PSU manufacturers will make the required new cables (without the dongle) available, even for some older models.


----------



## long2905

J7SC said:


> Hopefully, most PSU manufacturers will make the required new cables w/o dongle available, even for some older models.


I have an Antec HCP1300, which works fine, but the chance they would create a cable for it is slim.


----------



## th3illusiveman

jim2point0 said:


> Replying to myself because I actually just managed to pick one up following the exact same method, lol.
> 
> I joined this discord: Join the Fixitfixitfixit Drops & Tech Discord Server!
> 
> You can get roles for USA drops, which I think is amazon, newegg, bestbuy, etc. And also bestbuy local roles.
> 
> View attachment 2578040
> 
> 
> I signed up for bestbuy notifications for my state, which ping you for when there's a canceled order sitting on the shelves at customer service. Not a few hours after joining, I got a notification for my local best buy. So I drove down and talked to the guy standing by the door as soon as you walk in. He looked it up on their system and said that they had 1 GPU. He went over and grabbed it and I paid for it. Also chatted up another guy who was picking up his online order from the stock drop last Wednesday. I told him about the discord since his friends are also trying to find them.
> 
> Anyways. Woo.
> 
> View attachment 2578043


Ok, can someone guide me through how to set that up? I have literally never used discord but have an old account. I clicked the link, verified my account but i dont see any of those roles/ role management? i know i sound like a grandpa, but idk lol.


----------



## jim2point0

So... even with only 3 of the power connectors hooked up (I don't feel like digging out a 4th cable right now), I still get double the performance in one of the most demanding games I've played.

Here's a scene where I only got 48 FPS in Dying Light 2 with my 3080TI.









And now 93 fps with the 4090. 93% more performance, lol. 









That's impressive. Not gonna try benchmarking until I do a 4th connector. It's just... my PCIE cables are all custom and I only have 3 of them. I want to get a 4th cable to match first.


----------



## Zogge

alasdairvfr said:


> So not me you were asking but I have the 5950x on PBO, here is my score:
> View attachment 2578049
> 
> 
> EDIT: just did another run changing pretty much nothing and better results
> 
> View attachment 2578053
> 
> 
> 
> Not super impressive compared to some though. Any suggestions? This looks like I'm CPU constrained?












This would take me to place 22 in top 100. 
TUF BIOS, max volt, max power, +265 / +1100. PBO all-core curve -25 on the 5950X with maxed TDP/EDP/PPT, 3600MHz CL14 memory.
I am still running it in the normal Windows 11 environment with lots of background applications, so I am sure it could be further optimized if I were pushing for a top-10 position. I could also place it closer to the window, as it is 11°C outside in Sweden today... but this was just as it is mounted in my already cramped case.


----------



## th3illusiveman

th3illusiveman said:


> Ok, can someone guide me through how to set that up? I have literally never used discord but have an old account. I clicked the link, verified my account but i dont see any of those roles/ role management? i know i sound like a grandpa, but idk lol.


Ok, i think i figured it out... but as usual, Canada gets the short end of the stick - the roles they have are pretty much only useful for USA. the Canada ones are for last gen and the best buy tracker is only US states.... 

Any other discords w/ Canada bestbuy stock? only interested in FE.


----------



## Nizzen

GRABibus said:


> Link ?


I hid it, but I took a screenshot on my phone just now


----------



## RaMsiTo

Miguelios said:


> Not sure how 3dmark will address this (hopefully a one time purging of the leaderboard), but there are a TON of scores on the HOF that are not valid.. but the only way you can see they are not valid, is by adding them to "Compare".
> 
> Check out the screenshot.. I kept their names out, but it's easy to see who's doing fishy stuff/glitching high scores:
> View attachment 2578015


I agree. That 29509 is mine; this link is from another run, which appears as valid:









I scored 28 884 in Port Royal

Intel Core i9-9900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11

www.3dmark.com


----------



## LunaP

Fingers crossed, there's 1 showing available at a nearby bestbuy ( not online but at the store supposedly ) gonna drive over in the morning and pass the SKU. Gigabyte OC 4090 hopefully blocks exist.


----------



## Arizor

Hey folks, got a response from ASUS regarding GPU hot spots:


----------



## diogogmaio

Quick question guys.
The FE with 600w is still the best 4090 to buy? Or any reason to go to the other brands?
Any advantage?
Thanks
Just want to avoid MSI since they are garbage (especially with vBIOS compatibility on the Suprim X 3080 Ti)


----------



## Antsu

Finally had time to install my TUF. Seems to be average, maybe even a little bit better than average but my room was cold when I was testing it out. 

I forgot to give you an update, but Aquacomputer did reply to me and said they are already working on a TUF / STRIX block, though they can't give any ETA at the moment.


----------



## Nico67

Arizor said:


> Just to add a contrast and update, here's my temps with my .95 undervolt after a few hours of GPU load. "Max" column arrowed.
> 
> View attachment 2578067


Yeah it looks much better at that, which is likely what Nvidia intended in order to keep it all under control 



J7SC said:


> I am not too thrilled about this new power connector, either. I suppose that if they would have gone for a traditional 4x 8 pin PCIe, the pcbs would have been wider by at least an inch - besides, it is clear that they wanted to move the power delivery as close to the actual die as possible.
> 
> Still, I noticed before that on my 3090 with 3x 8 pin PCIe and 520 W vbios, each of the PCIe cables (stock ones that come with Seasonic Prime Platinum PX 1300) would get warm - not hot, but warm during heavy GPU loads. At the same time, with a full waterblock, thermal putty and all that, the 3090's VRM didn't exceeded mid 30s C (HWInfo has sensors for that for the 3090 Strix).
> 
> With these 4 into 1 dongles for the 4090, it is clear that all the PCIe supply cables are bunching together to join into a single connector with a much smaller total area than 4x 8 pin PCIE would have had...so clearly less area for heat dissipation while transferring more energy. Then there is this whole thing about 'do not bend' for 3 cm to 3.5 cm cable length at the joint end. I understand why, but for a consumer product that goes into people's homes, it is worth questioning these release decisions...
> 
> Hopefully, most PSU manufacturers will make the required new cables w/o dongle available, even for some older models.


I think the 16-pin should really be rated at a max of 300W, since it really only has about the same number of pins (450W at the very most), and they should just put two of them on a card.

I had two 8-pins on my 3090, running capped at 500W for two years with no worries, but I did get two shorter aftermarket cables in 16AWG, which never even got warm. That's pretty much what a 16-pin is, just more compact with smaller pins.


----------



## 8472

CableMod is going to do a 180 degree adapter as well as the 90 degree one. This will eliminate the need to bend the cable at all. 


__
https://www.reddit.com/r/nvidia/comments/y6bb6f/_/itp1s3u


----------



## J7SC

Nizzen said:


> I hided it, but took a screen on phone now
> 
> View attachment 2578139


...misery loves company ? Almost same clocks, valid (other than my air-cooler wanted another bin for breakfast)


----------



## GRABibus

GRABibus said:


> ASUS Strix OC ordered today


I'm curious to see how much I will be bottlenecked by my 5950X 😂


----------



## J7SC

Arizor said:


> Hey folks, got a response from ASUS regarding GPU hot spots:
> 
> View attachment 2578143


Nice & useful info ! While I have a few higher Superposition 4K at full blast and 18 C ambient, this one is as far as I am willing to go at 24 C ambient and air cooling. Note 1.05v max instead of 1.1v, 115 % PL instead of 133%, and NV CP panel set to 'normal' power instead of 'prefer max power', so also some downclocking. Once watercooled, I'll push the full boundaries. FYI, I also have a 120mm Arctic P12 blowing on the back of the card and the power connector...seems to work well.


----------



## GRABibus

J7SC said:


> ...misery loves company ? Almost same clocks, valid (other than my air-cooler wanted another bin for breakfast)
> View attachment 2578167


5950X bottleneck ?


----------



## GRABibus

Nizzen said:


> I hided it, but took a screen on phone now
> 
> View attachment 2578139


What’s your CPU there ?


----------



## dboom

Easy.
24 ambient.


----------



## alasdairvfr

sebastiankhoa said:


> Here's a 4K in-game benchmark comparison going from an i9-10850K to a 5800X3D, and the performance gain is substantial. Game: *Horizon Zero Dawn, max settings, 4K.* [...] Almost a 30 FPS gain. [...] A 12 FPS gain, and you can see the i9 is GPU-bound only 46% of the time versus 99% on the AMD.


That's quite the uplift! What were your settings for DLSS/RT for these? This really shows something interesting here, that while the i9 has a better CPU score in Timespy - the 5800X3D's single core performance pulls the graphics score ahead substantially.





Zogge said:


> View attachment 2578135
> 
> 
> This would take me to place 22 in top 100.
> TUF BIOS, max volt, max power, +265 / +1100. PBO all core curve -25 on 5950x with maxed TDP/EDP/PPT, 3600mhz cl14 memory.
> I am still running it in the normal Windows 11 environment etc with lots of background applications etc so I am sure it can be further optimized if I was pushing for a top 10 position. Also I could place it closer to the window as it is 11 deg c outside in Sweden today.. but this was just as it is mounted in my already cramped case.


Oh so my 34,061 score isn't bad then? I also have a bunch of stuff running in the background most of the time


----------



## Nizzen



GRABibus said:


> What’s your CPU there ?


13900kf p5800/e4700/ring4900


----------



## Laithan

Arizor said:


> Hey folks, got a response from ASUS regarding GPU hot spots:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> View attachment 2578143


OMG you found Gavin! lol

(rdr2 reference)


----------



## MrTOOSHORT

Giga OC 4090:


----------



## stahlhart

Okay, I'll fall on the grenade -- why don't they come up with something like this?

(I didn't read all the way through the thread -- if it has already been addressed, ignore me)


----------



## KedarWolf

Arizor said:


> Hey folks, got a response from ASUS regarding GPU hot spots:
> 
> View attachment 2578143


My ASUS Strix OC, when running Cyberpunk with everything completely maxed out and HDR on, gets 64C core, 64C hotspot. I forget the memory temp, but I'll check later when I get home from work.


----------



## Baasha

Is anyone near the Chicago MC by chance? Please let me know ASAP.


----------



## Glottis

schoolofmonkey said:


> If anyone is interested in the Thermaltake Toughpower GF3 1200w 12VHPWR cable looks like:
> View attachment 2578095
> 
> View attachment 2578096


This potentially can be very problematic to cable manage. That thick shrink wrap is too close to connector and is a nightmare when you need to bend the cable. My Corsair PSU modular cables look similar and they are just terrible. At least Corsair's 12VHPWR cable comes without awful shrink wrap.


----------



## Panchovix

Does anyone have the PNY XLR8 Verto 4090? Wondering how it is on the PCB/VRM side, since it comes with a 4x8-pin cable adapter, but the max via the vBIOS is 450W if I'm not wrong.

Asking since I can buy an MSI 4090 Gaming Trio (but it comes with just a 3x8-pin adapter, so no 600W even with a vBIOS flash) or the PNY one (which I haven't heard of like, ever, so wondering if the PCB/VRMs suck lol)


----------



## KedarWolf

Glottis said:


> This potentially can be very problematic to cable manage. That thick shrink wrap is too close to connector and is a nightmare when you need to bend the cable. My Corsair PSU modular cables look similar and they are just terrible. At least Corsair's 12VHPWR cable comes without awful shrink wrap.


I'm thinking about getting the 90-degree 12VHPWR cable when it's available.






CableMod 12VHPWR Angled Adapter – CableMod Global Store







store.cablemod.com





Edit: Oh, I see it's just an adapter, you plug your existing 12VHPWR cable into it.


----------



## warrior-kid

Panchovix said:


> Does someone have the PNY XLR8 Verto 4090? Wondering how it is on PCB/VRM, since it comes with a 4x8 cable adapter, but the max by the VBIOS is 450W if I'm not wrong.
> 
> Asking since I can buy a MSI 4090 Gaming Trio (but comes with just an 3x8 pin adapter, so no 600W even with VBIOS flash) or the PNY one (which I haven't heard like ever, so wondering if the PCBs/VRms suck lol)


I've sent mine back unopened and replaced with Suprim X. I did investigate it, however, and am pretty sure it has a full x4 cable and very standard regular PCB and clocks.


----------



## Panchovix

warrior-kid said:


> I've sent mine back unopened and replaced with Suprim X. I did investigate it, however, and am pretty sure it has a full x4 cable and very standard regular PCB and clocks.


Thanks man! Yeah, I suspected it would be as barebones as it could.

Welp, gonna wait to another card, thanks!


----------



## scanty85usmc

parky.fp said:


> Anyone else with an MSI X570-A pro experiencing an issue getting a display signal? VGA light is red with an ASUS TUF 4090 - system boots fine with a 3090.
> 
> Bios updated to latest,1000w PSU


I have a Gigabyte X570 board, and on cold boots I get no signal; once I hard-cycle the power, it works the second time. But every time I boot my PC for the first time, it doesn't want to detect my DisplayPort cable. Very frustrating.


----------



## AdamK47

In light of the increasing news of 12V connectors burning up, I think I'll stick with the three 8-pin connectors and a 520W BIOS on my Gaming Trio. You guys have fun with your 600W.


----------



## pajonk

AdamK47 said:


> In light of the increasing news of 12V connectors burning up, I think I'll stick with the three 8-pin connectors and a 520W BIOS on my Gaming Trio. You guys have fun with your 600W.


3 Pins are just 450W!


----------



## legomax

Carillo said:


> To me that look’s like a plug not properly inserted to the connector .. that doesn’t just happen with a proper connection.


What mostly happens is that if you bend it too much, it will slightly pull certain pins out, and the connector will heat up too much because of it.


----------



## Panchovix

pajonk said:


> 3 Pins are just 450W!


In theory you should be able to use 525W, I think (the adapter is 3x8-pin, so 3x150W = 450W, plus 75W from the PCIe slot).
There's no way to use 600W, though (maybe it works if you flash a vBIOS and get a 4x8-pin adapter; not sure anyone has tested that, or whether the VRMs could hold it).
(Or shunt modding, but that doesn't seem worth it at all on the 4090.)
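For what it's worth, that budget math is easy to sanity-check. A throwaway sketch, using the 150W-per-8-pin and 75W-slot figures from the PCIe spec as quoted above; the 600W cap modeling the 12VHPWR sense-pin limit is an assumption based on what's been discussed in this thread:

```python
# PCIe spec limits used in the post above: 150W per 8-pin, 75W from the slot.
PIN8_W = 150
SLOT_W = 75
# Assumption: the 12VHPWR sideband caps connector power at 600W with all
# four sense pins set, as discussed earlier in the thread.
SENSE_LIMIT_W = 600

def adapter_budget_watts(num_8pin: int) -> int:
    """Theoretical board power with an n x 8-pin to 12VHPWR adapter."""
    return min(num_8pin * PIN8_W, SENSE_LIMIT_W) + SLOT_W

print(adapter_budget_watts(3))  # 525 (3x8-pin Gaming Trio adapter)
print(adapter_budget_watts(4))  # 675 (4x8-pin: 600W connector cap + 75W slot)
```

Which lines up with the 525W figure for a 3x8-pin adapter.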


----------



## 8472

__ https://twitter.com/i/web/status/1584950589393293312


----------



## mirkendargen

Panchovix said:


> In theory you should be able to use 525W I think? (The adapter is 3x8pin, so 3x150W = 450W + 75W from the PCI-E Slot)
> Though, there's no way to use 600W (maybe it does works if you flash a VBIOS and get a 4x8 adapter; not sure if someone have tested that though, or if the VRMs will be able to hold that)
> (Or well, shunt modding, but doesn't seem worth at all on the 4090)


HWinfo seems to always say the slot power is only 10w no matter what. It may or may not be reading correctly. And furmark with 600w BIOS can definitely exceed 600w by a small amount.


----------



## AdamK47

pajonk said:


> 3 Pins are just 450W!


3 pins would be more like 30W.

Three 8-pin connectors go over 500W. I should know, it's sitting in front of me right now. It's like you guys haven't heard of the 3090 Ti. They all came with a three 8-pin to 12V connector adapter.


----------



## legomax

Panchovix said:


> In theory you should be able to use 525W I think? (The adapter is 3x8pin, so 3x150W = 450W + 75W from the PCI-E Slot)
> Though, there's no way to use 600W (maybe it does works if you flash a VBIOS and get a 4x8 adapter; not sure if someone have tested that though, or if the VRMs will be able to hold that)
> (Or well, shunt modding, but doesn't seem worth at all on the 4090)


Yeah, no, the FE is only 450W plus 75W of PCIe slot power. The partner cards can do 600W plus 75W though.


----------



## mirkendargen

AdamK47 said:


> 3 pins would be more like 30W.
> 
> Three 8 Pin connectors goes over 500W. I should know. It's sitting in front of me right now. It's like you guys haven't heard of the 3090 Ti. All came with three 8 Pin connectors to 12V connector.


And there wasn't ever any drama about cooked 3090ti connectors (or 3090fe). I think it really is all coming down to the height of the cards ramming the connector against case windows.


----------



## AvengedRobix

Annoying, all these people going: "the new connector catches fire..."


----------



## 8472

The gigabyte oc appears to be in stock at their US store. 









GIGABYTE GeForce RTX 4090 GAMING OC 24G Graphics Card (GV-N4090GAMING OC-24GD)







store.gigabyte.us


----------



## pajonk

I made a video about Undervolting vs Overclocking vs Powerlimit
May it help you...


----------



## motivman

Flashed the 600W Suprim X BIOS to my dud Trio, and max power draw is 577W. My card is such garbage... it maxes out at +1100 memory and will not run anything over 3000MHz core. I need a good-clocking 4090, SMH


----------



## motivman

8472 said:


> The gigabyte oc appears to be in stock at their US store.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GIGABYTE GeForce RTX 4090 GAMING OC 24G Graphics Card (GV-N4090GAMING OC-24GD)
> 
> 
> 
> 
> 
> 
> 
> store.gigabyte.us


And..... SOLD OUT!!!! anything to replace my garbage trio.. SMH


----------



## motivman

pajonk said:


> 3 Pins are just 450W!


Nope, with 600W suprim X bios, I can pull as much as 577W on my Gaming trio.


----------



## AdamK47

motivman said:


> flashed 600W suprim X bios to my dud trio, and max power draw 577W. My card is so garbage.. maxes out at +1100 memory and will not run anything over 3000 MHz for core... I need me a good clocking 4090, SMH
> View attachment 2578312


I know what you mean. How can you ever continue living knowing there are other 4090s capable of delivering 2 FPS more than what you're currently getting?


----------



## motivman

AdamK47 said:


> I know what you mean. How can you ever continue living knowing there are other 4090s capable of delivering 2 FPS more than what you're currently getting?


I want to be the top 100 port royal leaderboards man...  first world problems huh? lol


----------



## Laithan

As many have seen, getting the GPU to draw its _full_ power can be tricky. At this stage, until we know more about the 4090 (time tells all), we probably don't want to even try to draw full power. I mean, who feels confident pushing things to their limits right now? There seems no better time than now to just "wait and see".

_<rant>_ IMO, even if you ignore the fire and safety risk, creating a brand-new (questionable) connector that is almost instantly running at its maximum capacity seems like an inadequate design choice. Given the added fire and safety risks, plus the inability to "massage" the position of the cable as needed, I cannot see this being anything other than DOA, and perhaps it will make history as a notable design failure.

...but as we know, there are always those who ride the fine line and heed no warnings lol. Definitely don't do this, but if you really want to try to max out power draw, you can use the FurMark stress test. Not regular FurMark; it must be the stress test option. It will take a moment, but if you let it run it will start to draw more as it goes. You may reach temp limits before power limits, but if you can keep it cool enough it should generate max power. I have not tried this with my 4090 (not gonna), but this is me drawing 531W with my 3090 Ti (which only had 3x 8-pins). The XOC BIOS was capped at 516W (including PCIe power) and, as you can see, I was able to slightly exceed even that with the stress test.
https://www.overclock.net/threads/e...tra-hybrid-hydro-copper.1798041/post-28981806
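If you do want to watch power draw while a stress run ramps up, polling `nvidia-smi` is a simple option. A minimal sketch (the `--query-gpu=power.draw` query is standard, but verify the field name against your driver version); the parser is demonstrated on a canned sample line so it runs even without a GPU present:

```python
import subprocess

def read_power_draw_watts(csv_line: str) -> float:
    # nvidia-smi --query-gpu=power.draw --format=csv,noheader
    # emits lines like "531.20 W"; strip the unit and parse the number.
    return float(csv_line.strip().split()[0])

def poll_gpu_power() -> list[float]:
    # Assumes nvidia-smi is on PATH; returns one reading per GPU.
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [read_power_draw_watts(line) for line in out.splitlines() if line.strip()]

if __name__ == "__main__":
    # Parsing demo with a canned sample, in case no GPU is present:
    print(read_power_draw_watts("531.20 W"))  # 531.2
```

Run `poll_gpu_power()` in a loop while the stress test is going to catch the peak.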


----------



## LunaP

I'm official now! \o/

Got up early and went to BB since tracker mentioned a SKU, tons of people there, some old guy blocking the door wanting in first so they opened the other door, lucked out they had 3 cards in stock!

Now to find a damn water block to go w/ this + which type of screws too (gonna have to order pads and stuff + screwdriver kit )


----------



## Xavier233

2 Gigabyte 4090 OC, both have coil whine. Am starting to give up


----------



## AdamK47

motivman said:


> I want to be the top 100 port royal leaderboards man...  first world problems huh? lol


I used to be on the leaderboards back when I was doing 3-way and 4-way SLI. Now I value stability and playing actual games on the hardware. There will be cards coming out in the future (however long that may be) that will trounce any 4090. The sliver of performance you're chasing will be even more meaningless then.


----------



## Nizzen

motivman said:


> I want to be the top 100 port royal leaderboards man...  first world problems huh? lol


I like Time Spy / Time Spy Extreme more, because of the CPU performance  (Nzz)


----------



## AdamK47

LunaP said:


> I'm official now! \o/
> 
> Got up early and went to BB since tracker mentioned a SKU, tons of people there, some old guy blocking the door wanting in first so they opened the other door, lucked out they had 3 cards in stock!
> 
> Now to find a damn water block to go w/ this + which type of screws too (gonna have to order pads and stuff + screwdriver kit )


Nice to see the cards are trickling in.

The old guy was blocking one door, so a BB employee opened the door he wasn't standing in front of? Hilarious if true.


----------



## LunaP

AdamK47 said:


> Nice to see the cards are trickling in.
> 
> The old guy was blocking one door, so a BB employee opened the door he wasn't standing in front of? Hilarious if true.


Yeah, I was waiting near the door, and more and more people started showing up. Then this guy just walks in front of us and leans against it, so the BB employee walked over and unlocked the other door, and we trickled in through there. I literally bolted.


----------



## biigshow666

Got my Gigabyte Gaming OC yesterday. What a great card. So far I can only get +250 / +1800 to run.


----------



## LunaP

Anyone recommend any good DP splitters? Gonna try to just split the ports for 2 additional monitors instead of using an additional GPU. 1x 1440p and 1x 1080p.

Hopefully blocks are up for the Gaming OC soon.


----------



## alasdairvfr

Did some more 3dmark

While my scores aren't truly amazing or anything, they do put me in the top 100 for my CPU/GPU combo, which is all I need lol.

NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)
NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)
NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,Gigabyte Technology Co., Ltd. X570 AORUS MASTER (3dmark.com)
NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)

Also, after some system tweaks I bumped my Superposition score by about 500 pts.

Most of these were on +255 core and +1400-1600 mem clock; this card seems pretty happy in the 3060-3090 MHz core range in most games and tests I have tried. 3105 it can handle in a few, but it gets dicey.

Power consumption doesn't typically exceed 565W @ 1.1V, 133% power. Need more voltage to go any higher or drop temps further. Not planning to put the card on water, but where I live temps go to -20C, so I might be able to squeeze a bit more out of this card yet.


----------



## N19htmare666

J7SC said:


> Nice & useful info ! While I have a few higher Superposition 4K at full blast and 18 C ambient, this one is as far as I am willing to go at 24 C ambient and air cooling. Note 1.05v max instead of 1.1v, 115 % PL instead of 133%, and NV CP panel set to 'normal' power instead of 'prefer max power', so also some downclocking. Once watercooled, I'll push the full boundaries. FYI, I also have a 120mm Arctic P12 blowing on the back of the card and the power connector...seems to work well.
> 
> View attachment 2578177


This makes me think I should keep my 5950x and not bother with a 5800x3d...



scanty85usmc said:


> I have a Gigabyte X570 board and on cold boots I get no signal; once I hard-cycle the power it works the second time. But every time I boot my PC up for the first time it doesn't want to detect my DisplayPort cable. Very frustrating.


Try setting PCIe to Gen 4 in the BIOS rather than Auto.


----------



## AvengedRobix

Nice run =) Zotac amp extreme with Colorful 550W


----------



## Hanks552

LunaP said:


> I'm official now! \o/
> 
> Got up early and went to BB since tracker mentioned a SKU, tons of people there, some old guy blocking the door wanting in first so they opened the other door, lucked out they had 3 cards in stock!
> 
> Now to find a damn water block to go w/ this + which type of screws too (gonna have to order pads and stuff + screwdriver kit )


What tracker? Using the in-stock option in the app?


----------



## motivman

alasdairvfr said:


> Did some more 3dmark
> 
> While my scores aren't like truly amazing or anything they do put me in the top 100 for my cpu/gpu combo which is all I need lol.
> 
> NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)
> View attachment 2578336
> 
> 
> 
> 
> NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)
> View attachment 2578337
> 
> 
> 
> NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,Gigabyte Technology Co., Ltd. X570 AORUS MASTER (3dmark.com)
> View attachment 2578338
> 
> 
> 
> NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASUSTeK COMPUTER INC. CROSSHAIR VI HERO (3dmark.com)
> View attachment 2578339
> 
> 
> Also after some system tweaks I bumped my Superposition score like 500 pts
> View attachment 2578340
> 
> 
> Most of these were on a +255 core and +1400-1600 mclk, this card seems pretty happy in the 3060-3090 mhz core range in most games and tests I have tried. 3105 it can handle in a few but gets dicey.
> 
> Power consumption doesn't typically exceed 565w @ 1.1v. 133% power. Need more voltage to go any higher or drop temps further. Not planning to put the card on water but where I live temps go to -20C so I might be able to squeeze a bit more out of this card yet.



what card do you have?... seeing these results makes me want to throw my card in the trash... I really LOST the lottery on my 4090, SMH


----------



## motivman

AdamK47 said:


> I use to be on the leaderboards back when I was doing 3-Way and 4-Way SLI. Now I value stability and playing actual games with the hardware. There will be cards coming out in the future (however long that may be) that will trounce any 4090. The sliver of performance you're wanting will be even more meaningless.


yeah, but for me, I hardly play games, overclocking is the fun part for me honestly


----------



## alasdairvfr

motivman said:


> what card do you have?... seeing these results makes me want to throw my card in the trash... I really LOST the lottery on my 4090, SMH


Gigabyte Gaming OC. Is it too late to return your card? Seems like luck of the draw; the GB OC as well as the TUF/TUF OC people have had some spectacular results. It took some work to get my scores where they are, and compared with people on 13th-gen Intel or 7000-series Ryzen I can't really compete, since even non-CPU-bound benches will benefit by more than a margin of error from better IPC and faster single-core speed. I ended up upgrading my BIOS since I was on a beta, and redid my PBO and memory OC, which is probably a couple % gain as well.


----------



## LunaP

Hanks552 said:


> What tracker? Using the app in stock option?


The Falcodrin community discord one that was linked here.


----------



## Xavier233

alasdairvfr said:


> Gigabyte Gaming OC. Is it too late to return your card? Seems luck of the draw, the GB OC as well as Tuf/TufOC ppl have has some spectacular results. I had some work to get my scores where they are and for sure comparing ppl with 13th gen intel or 7000 ryzen I can't really compete since even non-cpu bound benches will benefit more than a margin of error due to better IPC and faster single core. I ended up upgrading my BIOS since i was on a beta and redid my PBO, memory oc which is probably a couple % gain as well.


Do you have coil whine on your GPU? Both Gigabyte OC cards I got have it.


----------



## alasdairvfr

Xavier233 said:


> Do you have coil whine on your GPU? 2xGigabyte OC I got have it


I can't hear anything above my AIO fans/case fans. I tried running some tests at 300-500 fps but there's nothing I can make out. Seems to be luck of the draw with bins and coil whine this time around.


----------



## Xavier233

alasdairvfr said:


> I cant hear anything above my aio fans/case fans. I tried running some tests at 3-500 fps but nothing I can make out. Seems to be luck of the draw with bins and coil whine this time around


Sent you a DM


----------



## Hanks552

LunaP said:


> The Falcodrin community discord one that was linked here.


Can you send the link?


----------



## Muut

AdamK47 said:


> 3 pins would be more like 30W.
> 
> Three 8 Pin connectors goes over 500W. I should know. It's sitting in front of me right now. It's like you guys haven't heard of the 3090 Ti. All came with three 8 Pin connectors to 12V connector.


I'm running my Suprim X at full load on a Corsair TX750M with just two pigtail cables (each a single 8-pin on the PSU side splitting into 8-pin + 8-pin on the GPU side) and it could hold the full 530W. Now that I have flashed the Strix OC BIOS on it, it even pulls 560W just fine.

Mind you, I really don't recommend doing this. That PSU only powers the GPU, and I have a Seasonic Prime Ultra 850 which powers my motherboard and CPU.

This was not meant to last; it was just a temporary solution until I found a decent PSU. And I did find something. My AX1600i will be arriving tomorrow!
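To see why the setup above is so far outside spec (and why it still works), here is a rough back-of-envelope sketch, not a measurement. The 150 W figure is the PCI-SIG rating for an 8-pin PCIe connector, and the even split across leads is an assumption; the Mini-Fit pins and wire gauge can physically carry well beyond the rating, which is why such pigtail setups often survive anyway.

```python
# Rough per-lead load when a GPU's full draw goes through pigtailed
# 8-pin cables. Assumes the draw splits evenly across the leads.

PCIE_8PIN_RATING_W = 150.0  # PCI-SIG rating per 8-pin connector

def watts_per_connector(total_watts: float, n_connectors: int) -> float:
    """Evenly split the GPU's draw across the 8-pin leads feeding it."""
    return total_watts / n_connectors

# 560 W through 2 pigtailed leads -> 280 W per lead,
# almost double the 150 W rating (hence "don't do this").
load = watts_per_connector(560, 2)
print(f"{load:.0f} W per lead, {load / PCIE_8PIN_RATING_W:.2f}x the rating")
```

The margin here comes entirely from the connector hardware being over-built relative to the rating, not from the setup being within spec.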


----------



## legomax

mirkendargen said:


> And there wasn't ever any drama about cooked 3090ti connectors (or 3090fe). I think it really is all coming down to the height of the cards ramming the connector against case windows.


I think that's because those connectors were slightly slanted by design, following the X style in the middle.


----------



## Rei86

AdamK47 said:


> Nice to see the cards are trickling in.
> 
> The old guy was blocking one door, so a BB employee opened the door he wasn't standing in front of? Hilarious if true.


Cards aren't going to trickle in if the rumors are true (and the people passing off these rumors have been pretty right about the 4090, aside from the power usage, though I feel like that was the Ada Titan that got canceled). No, we aren't gonna get more 4090s, as Nvidia is diverting more people and supply toward their data center line.

So this could all be FUD, but if you want a 4090, get one now I guess...


----------



## J7SC

N19htmare666 said:


> This makes me think I should keep my 5950x and not bother with a 5800x3d...


When clock speed in benchies trumps other factors, the single-core/thread speed limit of the 5800X3D will hold you back a bit. Ditto for those benchies where maximum core count helps. In other apps, the extra cache of the 5800X3D helps.

On the slippery slope of the upgrade path, once a major new piece of hardware (such as a GPU) is installed, the temptation grows exponentially. Since I did not get into LGA1700 before and it is soon to be replaced by LGA1800 (?), time to sit back a bit. Meteor Lake (which apparently will be something to behold, same for new AMD models, including xxx3D V-Cache) is on the horizon and into the ES labs, so patience. The new AM5 socket is alleged to have a bit more longevity, but no rush to replace my 5950X. The way the PC market is slowing down overall, you might see additional 'aggressive' releases by both AMD and Intel on CPUs (never mind Nvidia & AMD on the GPU side of things) over the next few months, many with IPC improvements, some with lower power envelopes, to grab as much of the whole market as they can.

---

@Arizor - ...per terracotta army post > here, I went a bit higher with my Gigabyte Gaming OC ...

VERY important to note that this was with a light 3D load with the core at below 40 C max and fans on 100%. Water-cooling and careful attention to die pasting / mounting will definitely help extract some more performance at full load (w/ 'real world clocks', at least). Per your post a few pages back quoting the Asus rep, I don't run full power on the air-cooled 4090, ie. for Superposition, unless it can keep hotspot below 90 C. In any case, I can't complain about this Gigabyte Gaming OC re. its speed potential - and no coil whine to boot.


----------



## Arizor

Looks like a golden sample @J7SC ! I’m in the same frame of mind re: upgrading. Gonna wait for them Ti, and then AM5 platform to mature. Maybe 2023 Christmas is a good time to upgrade.


----------



## Roacoe717

System is 7950x
Strix 670e
Gigabyte OC 4090.
I am getting very slow bandwidth; does anyone know what's wrong??? Sorry for the crappy pictures, I forgot my login details for this site.


----------



## LunaP

Hanks552 said:


> Can you send the link?








Discord - A New Way to Chat with Friends & Communities (discord.gg)


Anyone know if the 4090 uses 1.5 mm pads or just 0.5 and 1.0 mm? Putting Fuji pads in my cart + Kryonaut, unless there are better recommended ones now.


----------



## Muut

J7SC said:


> When clock speed in benchies trumps other factors, the speed limit (single core / thread) of the 5800x3d will hold you back a bit. Ditto for those benchies where max core number helps. In other apps, the extra cache of the 5800X3d helps.
> 
> On the slippery slope of upgrade path, once a major new piece of hardware (such as GPU) is installed, the temptation grows exponentially. Since I did not get into LG1700 before and it is soon to be replaced by LG1800 (?), time to sit back a bit. Meteor Lake (which apparently will be s.th. to behold, same for new AMD models, including xxx3d Vcache) is on the horizon and into the ES labs, so patience. The new AM5 socket is alleged to have a bit more longevity, but no rush to replace my 5950x. The way the PC market is slowing down overall, you might see additional 'aggressive' releases by both AMD and Intel on CPUs (never mind NVidia & AMD on the GPU side of things) over the next few months, many with IPC improvements, some with lower power envelopes, to grab as much of the whole market as they can.
> 
> ---
> 
> @Arizor - ...per terracotta army post > here, I went a bit higher with my Gigabyte Gaming OC ...
> View attachment 2578364
> 
> 
> VERY important to note that this was with a light 3D load with the core at below 40 C max and fans on 100%. Water-cooling and careful attention to die pasting / mounting will definitely help extract some more performance at full load (w/ 'real world clocks', at least). Per your post a few pages back quoting the Asus rep, I don't run full power on the air-cooled 4090, ie. for Superposition, unless it can keep hotspot below 90 C. In any case, I can't complain about this Gigabyte Gaming OC re. its speed potential - and no coil whine to boot.


My suprim X is a strong contender aswell


----------



## mirkendargen

Roacoe717 said:


> System is 7950x
> Strix 670e
> Gigabyte OC 4090.
> I am getting very slow bandwidth, does anyone know what's wrong??? Sorry for the crappy pictures, I forgot my login in details for this site.
> View attachment 2578367
> View attachment 2578368


Every x670e mobo has a BIOS update that says something like "fixed 4090 PCIE speed" in the notes, did you install that?


----------



## Roacoe717

mirkendargen said:


> Every x670e mobo has a BIOS update that says something like "fixed 4090 PCIE speed" in the notes, did you install that?


Literally just did that, booting up and testing again here in a few.


----------



## LuckyImperial

Laithan said:


> As many have seen, getting the GPU to draw its _full_ power can be tricky. At this stage, until we know more about the 4090 (time tells all) we probably don't want to even try and draw full power.. I mean, who even feels confident to push things to limits right now as it seems no better time than now to just "wait and see".
> 
> _<rant>_ IMO even if you ignore fire and safety risk, creating a brand new (questionable) connector that is almost instantly running at its maximum capacity seems like an inadequate design choice. Given the added fire and safety risks plus the inability to "massage" the position of the cable as needed, I cannot see this being anything other than DOA, and perhaps it will make history as a notable design failure.
> 
> ...but as we know there are always those who ride the fine line and heed no warnings lol. Definitely don't do this, but if you really want to try and max out power draw you can use the Furmark stress test. Not regular Furmark; it must be the stress test option. It is going to take a moment, but if you let it run it will start to sip more as it goes. You may reach temp limits before power limits, but if you can keep it cool enough it should generate max power. I have not tried this with my 4090 (not gonna), but this is me drawing 531W with my 3090 Ti (which only had 3x 8-pins). The XOC BIOS was defined at 516W (including PCIe power) and as you can see I was able to slightly exceed even that with the stress test.
> https://www.overclock.net/threads/e...tra-hybrid-hydro-copper.1798041/post-28981806


I'm right here with you man. The number of people whose singular immediate goal is getting their card to pull 600W is surprising to me. I know we're on an OC forum, but 600W is a LOT of power to be messing around with. It seems like these new GPU coolers are keeping people from pausing to think about whether 600W through their system should be concerning. IMO, 600W into any PC component is something to watch with concern.

The old liquid cooling community members know that once you beat die cooling, you're not done. I've watched VRMs melt off mobos.

Should the stock cable support the maximum power limit shipped with the board? Yes. Does that mean you can now ignore the fact that you're juicing 600W through a 12-pin cable? No. Not in my opinion.


----------



## LunaP

Kinda odd that some companies are already selling blocks while others aren't releasing until end of Nov/early Dec. Waiting to hear back from Aqua on whether the Gaming OC is on their list; otherwise I might have to swap the card out for an FE/TUF, since nothing is available for at least a month+.


----------



## Rei86

LunaP said:


> Kinda odd that some companies are already selling blocks while others aren't releasing till end of nov/early dec, Waiting back on Aqua to see if the gaming OC is on their list, else might have to swap card out for an FE/TUF since nothing available for at least a month +


eh...?
We used to get AIB cards like a month after reference, and the highest-end models (Lightning Z, Classified, Matrix, etc.) another six or so months out.
Then water blocks usually followed a week to a month after.

So, just chill.

And as far as Aqua Computer goes, LOL, if I remember right they (or maybe Heatkiller) took the entire generation to put anything out...


----------



## Antsu

Rei86 said:


> eh...?
> We used to get AIB cards like a month after reference and another six or so months out for the highest end models (Lightning Z, Classified, Matrix etc).
> Than WB usually followed a week to month after.
> 
> So, just chill.
> 
> And as far as Aqua Computer goes LOL they took if I remember right (if not heatkiller) the entire generation to put anything out...


Yeah, people need to chill out a bit. It takes time to design and produce the blocks. It would be nice if the AIBs could all send day-1 samples to block manufacturers, but that isn't realistic. Last time around the Aquacomputer 3090 blocks were ready and shipped before the end of the year; I'd expect a similar ETA for the 4000 series.


----------



## MrTOOSHORT

These coolers for the 4090s are so good I see no rush to get a waterblock, if at all. Still might get one though; waiting patiently is not a problem this time around.
Just waiting for the CableMod power cable to get here so I can get rid of this hideous 4x 8-pin to 16-pin Nvidia adapter.


----------



## WaXmAn

Anyone else having issues with the 4090 FE not showing the BIOS screen on boot? Told EVGA it's an issue with the Z690 DARK. Wondering if anyone else is seeing this as well. I have tried 3 EVGA BIOSes with no resolution. Resetting CMOS doesn't fix it either.


----------



## LunaP

Rei86 said:


> So, just chill.


Whoa, calm down there, no need to get angry at a post; we're all 13 and above here. What's with people going from 0-100 over misunderstood comments instead of validating them? Sheesh.

I'm aware that SOME companies were like that back then. Prior to EVGA leaving the scene, I always went with a specific one that usually had blocks out within the first 2 weeks of the cards going up. I wasn't pitching a fit, just commenting on it. Take a break if you need it.


----------



## N19htmare666

LuckyImperial said:


> I'm right here with you man. The amount of people with the singular immediate goal of getting their card to pull 600W is surprising to me. I know were on an OC forum, but 600W is a LOT of power to be messing around with. It seems like these new GPU coolers are keeping people from pausing to think if 600W through their system should be concerning. IMO, 600W into any PC component should be something to watch with concern.
> 
> The old liquid cooling community members know that once you beat die cooling, you're not done. I've watched VRM's melt off mobo's.
> 
> Should the stock cable support the maximum power limit shipped with the board? Yes. Does that mean you can now ignore the fact that you're juicing 600W through a 12pin cable? No. Not in my opinion.


People are forgetting how much 3090s drew after BIOS updates. Not a huge change with the 4090. What's changed is the cable... and people's trust changed with the unknown.


----------



## J7SC

MrTOOSHORT said:


> These coolers for the 4090s are so good I see no rush to get a waterblock, even at all. Still might get one though, waiting patiently is not a problem this time around.
> Just like the cablemod power cable to get here to get rid of this hideous 4 x 8pin to 16pin nvidia adapter.


At 'stock' which is 1.05v / 450W on my card (instead of the max 1.1v / 600W at full speed ahead) the factory air-cooler can handle pretty much everything I play though the hotspot temp - while below 90 C then - still catches my eye. I'll still get a water-block for it as the loop is already fully built anyway, but it is also more about being able to fit the card into a unique case setup, and the sound of silence w/ water-cooling. As to the hideous 4-into-1 dongle, there are enough warnings out there now to be very cautious re. twisting and bending. The Seasonic PX 1300 PSU I use for this setup is less than a year old, and Seasonic is offering a new cable for it...


----------



## mirkendargen

People are worried about 10A going through a pin on the connector....but don't think about 30A+ going through the tiny solder point on each VRM lol.
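To put that "10A through a pin" figure in context, here is a quick sketch of the per-pin current on the 12VHPWR connector's six 12V supply pins. Even current sharing across pins is an assumption; a poorly seated connector can concentrate far more current on fewer pins, which is exactly the failure mode people worry about.

```python
# Back-of-envelope per-pin current for the 12VHPWR connector,
# which has six 12 V supply pins (plus six grounds and sense pins).

def amps_per_pin(watts: float, volts: float = 12.0, supply_pins: int = 6) -> float:
    """Current through each 12 V supply pin, assuming an even split."""
    return watts / volts / supply_pins

print(f"{amps_per_pin(600):.2f} A per pin at 600 W")  # ~8.33 A
print(f"{amps_per_pin(450):.2f} A per pin at 450 W")  # ~6.25 A
```

At the full 600W limit each pin sits near its rated current, so there is little headroom left for uneven contact resistance.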


----------



## Rei86

LunaP said:


> Whoah calm down there, no need to get angry at a post, we're all 13 and above here. Whats with people going from 0-100 over misunderstanding comments vs validating? Sheesh.
> 
> I'm aware that SOME companies were like that back then, prior to Evga leaving the scene, I always went with a specific that was usually out within the first 2 weeks of the cards going up, wasn't pitching a fit was just commenting on it. Take a break if you need it.


Nope, not popping off.
Just stating plainly that this time around things have moved up in the timeline. So we'll just get everything in the same fashion.
It is odd, though, that EK doesn't have a block ready to go for the FE model, but that's probably because EVGA is no longer making GPUs.


----------



## LunaP

Rei86 said:


> Nope not popping off.
> Just stating plainly this time around things have moved up in time line. So we'll just get everything in the same fashion.
> It is odd however tho EK doesn't have a block ready to go for a FE model, but that's probably because EVGA is no longer making GPUs.


My concern is just that my card only has EK as a supported block, which I'm trying to avoid for personal reasons as well as the many, many issues I've seen here (not meant to derail things; I know everyone has their good/bad experiences). But if no other blocks are announced, I'll want to return the card and swap for a TUF/Strix/FE, since those already have good block lineups, so I'm not stuck with only one option. I feel that's fair, as I have a custom loop and don't want to tear part of it down and rebuild it to swap back to air if forced. Sucks that XSPC is gone; been using them since the Titan days.

I see Bykski and Alphacool already have theirs ready; I sent them an email to ask about the Gigabyte, as there have been mentions but nothing concrete. Not rushing things, but I definitely don't want to get stuck waiting either.


----------



## bmagnien

@Antsu @MrTOOSHORT let the games begin 

EDIT: Holy moly. 3180MHz core clock, 12165MHz mem.


----------



## Antsu

bmagnien said:


> @Antsu @MrTOOSHORT let the games begin
> View attachment 2578409
> 
> 
> EDIT: Holy moly. 3180mhz core clock, 12165 mhz mem.
> View attachment 2578410


Very nice! I did get a bit higher score after that run but I forgot to save it, damn it. 

Going to be busy with my platform swap to 13th gen, but I just might have to give Superposition a quick spin after work to try and beat that score again.


----------



## bmagnien

Antsu said:


> Very nice! I did get a bit higher score after that run but I forgot to save it, damn it.
> 
> Going to be busy with my platform swap to 13th gen, but I just might have to give Superposition a quick spin after work to try and beat that score again.


You’re gonna crush me with the 13th gen, I’m having to massively up the core and mem to compete with my last gen am4 cpu


----------



## Antsu

bmagnien said:


> You’re gonna crush me with the 13th gen, I’m having to massively up the core and mem to compete with my last gen am4 cpu


It might help a little bit, but it seems like it's pretty much 100% GPU bound already. Yeah, that is actually really nice for a 5800X3D, you must've been pushing it hard.


----------



## Sheyster

MrTOOSHORT said:


> Just like the cablemod power cable to get here to get rid of this hideous 4 x 8pin to 16pin nvidia adapter.


Looking forward to your feedback about this cable once you have it. I've read mixed reviews about Cablemod here on OCN.


----------



## heptilion

There is a bios update for the Strix. Says improve compatibility. Changed a few default settings as well.


----------



## originxt

Dreaded Best Buy delay. Hopefully tomorrow. Fun fact: when cards or specialty items like this are shipped to the store, it is not by truck or the regular supply line. Mine is being shipped via UPS from Nvidia to the store. Unsure if it's like that for every order, but it was for mine, so I figured I'd post it.


----------



## alasdairvfr

Antsu said:


> It might help a little bit, but it seems like it's pretty much 100% GPU bound already. Yeah, that is actually really nice for a 5800X3D, you must've been pushing it hard.


For me on the 5950X the 8K run is pretty much entirely GPU bound, but I'm CPU bound on the 4K run:
34,504 4K
14,530 8K

GPU utilization ~90% a good amount of the time.


----------



## KedarWolf

On my 5950X I got Rank 8 in 8K and Rank 8 in 4K in Superposition.

14,987 (8K)
36,463 (4K)

I'll likely do better once I get my 7950x up and running on the weekend, but I won't really know until I get the PBO and memory dialled in right.


----------



## Nico67

Xavier233 said:


> 2 Gigabyte 4090 OC, both have coil whine. Am starting to give up


Just a thought, but have you tried a different power supply, or possibly a UPS? Could be more to do with how clean the power is than with the cards.


----------



## LuckyImperial

I've sent moddiy a support email, but does anyone know if an ATX 3.0 PSU is required to use their new D8P-12VHPWR cable? 

Does the ATX 3.0 PSU send some special signal to the sense pins or something?









ATX 3.0 PCIe 5.0 600W Dual 8 Pin to 12VHPWR 16 Pin Power Cable (www.moddiy.com)


----------



## mirkendargen

LuckyImperial said:


> I've sent moddiy a support email, but does anyone know if an ATX 3.0 PSU is required to use their new D8P-12VHPWR cable?
> 
> Does the ATX 3.0 PSU send some special signal to the sense pins or something?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ATX 3.0 PCIe 5.0 600W Dual 8 Pin to 12VHPWR 16 Pin Power Cable
> 
> 
> Buy ATX 3.0 PCIe 5.0 600W Dual 8 Pin to 12VHPWR 16 Pin Power Cable for $24.99 with Free Shipping Worldwide (In Stock)
> 
> 
> 
> 
> www.moddiy.com


There's no magic with the sense pins, but your PSU does need to actually use 8pin connectors like that and be able to supply enough power from them. I'm using a moddiy cable on a Corsair AX1600i. I have the Corsair cable too, but I like having something longer and more flexible after trying the Corsair one (I have a monster case).


----------



## GQNerd

Muut said:


> My suprim X is a strong contender aswell
> 
> View attachment 2578373


Nice! But check your "Effective Clock" via HWiNFO to see your actual clocks.


----------



## LuckyImperial

mirkendargen said:


> There's no magic with the sense pins, but your PSU does need to actually use 8pin connectors like that and be able to supply enough power from them. I'm using a moddiy cable on a Corsair AX1600i. I have the Corsair cable too, but I like having something longer and more flexible after trying the Corsair one (I have a monster case).


Thanks for the quick reply. 

I'm hoping to use this cable with an EVGA SuperNOVA 850 GM, which is an SFX power supply with two 8-pin VGA connectors. I checked the manual and it doesn't say how many watts each individual connector can supply.



https://www.evga.com/products/product.aspx?pn=123-GM-0850-X1



Moddiy asks for the PSU you're using, so I have to imagine they're going to have some level of compatibility check.


----------



## AKBrian

scanty85usmc said:


> I have a Gigabyte X570 board and on cold boots i get no signal, once I hard cycle the power it works the second. But its every time I boot my PC up for the first time it doesn't want to detect my display port cable. Very frustrating.


This may have been addressed further down the thread (I'm catching up on a few days worth), but you'll likely need a BIOS update.

I ran into a similar issue on my X570 Taichi, where it would not display any POST screen, but _would_ boot into Windows and then display properly if I just waited long enough. Once in Windows, I could use the "Boot to UEFI" to get back into the BIOS itself, which then displayed fine. Just not during initial bootup.
A few days later, I checked for a board BIOS, and a new one (in my case, P5.00) had been posted with the description "_1. Improve GPU compatibility for GeForce RTX 40 series_."

For anyone else on an ASRock AM4 board, specifically, check the JZ Electronic site to grab an updated BIOS for your board:



https://shop.jzelectronic.de/news.php?id=1666435667&sw=





> 21.10.22
> X470 Taichi Ultimate - BIOS P5.10
> X470 Taichi - BIOS P5.10
> X470 Gaming K4 - BIOS P4.80
> X470 Master SLI - BIOS P4.80
> B450 Steel Legend - BIOS P4.60
> B450 Gaming K4 - BIOS P5.60
> B450 Pro4 - BIOS P5.60
> B450 Gaming-ITX/ac - BIOS P5.10
> B450M Steel Legend - BIOS P4.60
> B450M Pro4 - BIOS P4.70
> B450M-HDV - BIOS P4.50
> 20.10.22
> X570 AQUA - BIOS P3.90
> X570 Creator - BIOS P3.90
> X570 Phantom Gaming X - BIOS P5.00
> X570 Taichi Razer - BIOS P1.90
> X570 Taichi - BIOS P5.00
> X570S PG Riptide - BIOS P2.20
> X570 Phantom Gaming 4 - BIOS P4.50
> X570 PG Velocita - BIOS P2.20
> X570 Extreme4 - BIOS P4.10
> X570 Steel Legend - BIOS P4.10
> X570 Pro4 - BIOS P4.50
> X570M Pro4 - BIOS P3.90
> X570 Phantom Gaming-ITX/TB3 - BIOS P3.70
> B550 Taichi Razer - BIOS P2.30
> B550 Taichi - BIOS P2.30
> B550 PG Velocita - BIOS P2.40
> B550 Extreme4 - BIOS P2.40
> B550 Steel Legend - BIOS P2.40
> B550 PG Riptide - BIOS P1.60
> B550 Phantom Gaming 4 - BIOS P2.40
> B550 Phantom Gaming 4/ac - BIOS P2.40
> B550 Pro4 - BIOS P2.40
> B550M Steel Legend - BIOS P2.50
> B550M Pro4 - BIOS P2.50
> B550M Phantom Gaming 4 - BIOS P2.10
> B550M-HDV - BIOS P2.30
> B550 Phantom Gaming-ITX/ax - BIOS P2.40
> B550M-ITX/ac - BIOS P2.10
> *1. Improve GPU compatibility for GeForce RTX 40 series*


For your Gigabyte board, check the BIOS page for a new beta BIOS, hopefully one will go up soon. I looked at the X570 Master and Xtreme and didn't see anything recent, but fingers crossed.


----------



## Antsu

PSA: "The VRAM bug" definitely is a thing in Superposition also, so be careful when comparing results. I am quite confident that this is how the #1 got over 41k with similar clocks.


----------



## mirkendargen

LuckyImperial said:


> Thanks for the quick reply.
> 
> I'm hoping to use this cable with an EVGA SuperNOVA 850 GM, which is a SFX power supply with two 8pin VGA connectors. I checked the manual and it doesn't say how many watts each individual connector can supply.
> 
> 
> 
> https://www.evga.com/products/product.aspx?pn=123-GM-0850-X1
> 
> 
> 
> Moddiy asks for the PSU you're using, so I have to imagine they're going to have some level of compatibility check.


Yeah they'll make the cable with the correct pinout for your PSU, but as to whether it will work out ok...

It *probably* will, but if I were doing that on that PSU I'd personally keep it at 450W. It is at least a single fat 12V rail, which is good.


----------



## Mad Pistol

4090 is well over 2x faster than a 3080 when utilizing Ray Tracing.






























There are many others, but RT scaling on 4090 is mind-boggling.


----------



## KedarWolf

Muut said:


> My suprim X is a strong contender aswell
> 
> View attachment 2578373


Here is my best run.


----------



## Hulk1988

bmagnien said:


> @Antsu @MrTOOSHORT let the games begin
> View attachment 2578409
> 
> 
> EDIT: Holy moly. 3180mhz core clock, 12165 mhz mem.
> View attachment 2578410


Try to catch me  I am second. No idea how someone can get 41K points here on air. If I put my PC outside when it's cold, maybe 40K is possible.


----------



## J7SC

...finally time for just 'fun' on stock settings with the 4090 instead of establishing oc limits for later, when the water block is put on. This Giga-G-OC really keeps the VRAM cool even when benching, but all temps were perfect in gaming...

...running the motor bike off the bridge at high speed in Cyberpunk 2077 and ending up in the water (on purpose !)










...a quick visit to NY in FS2020 (which btw is supposed to get DLSS3, among other things, w/ SU 11 patch general release on 11.11.2022)










...over to Cologne (I used to live near there and climbed the Cathedral tower's stairs...)









...and finally on to Rio









The 4090 on a big OLED is something to behold


----------



## ALSTER868

LuckyImperial said:


> I've sent moddiy a support email, but does anyone know if an ATX 3.0 PSU is required to use their new D8P-12VHPWR cable?
> 
> Does the ATX 3.0 PSU send some special signal to the sense pins or something?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ATX 3.0 PCIe 5.0 600W Dual 8 Pin to 12VHPWR 16 Pin Power Cable
> 
> 
> Buy ATX 3.0 PCIe 5.0 600W Dual 8 Pin to 12VHPWR 16 Pin Power Cable for $24.99 with Free Shipping Worldwide (In Stock)
> 
> 
> 
> 
> www.moddiy.com


Hey guys, do you think this moddiy cable is good enough to be considered? Has anyone tried it yet? I heard their cables are thinner than cablemod's, so I'm a bit concerned about whether I should buy from them.


----------



## rahkmae

In conclusion, you could say that all the cards clock about the same this time; it doesn't matter if you have a cheap or expensive card, and the 600W basically doesn't change anything. So what's the limit this time, voltage? If so, is the cheaper card the better buy?


----------



## RaMsiTo

bmagnien said:


> @Antsu @MrTOOSHORT let the games begin
> View attachment 2578409
> 
> 
> EDIT: Holy moly. 3180mhz core clock, 12165 mhz mem.
> View attachment 2578410







----------



## LuckyImperial

mirkendargen said:


> Yeah they'll make the cable with the correct pinout for your PSU, but as to whether it will work out ok...
> 
> It *probably* will, but If I was doing that on that PSU personally I'd keep it at 450w. It is at least a single fat 12v rail which is good.


Yeah I'm keeping this thing at a 70% power slider for thermal reasons. The SFF build will benefit greatly from the reduced size of the cable. I'm grateful it'll work at all.

Also, moddiy support got back to me quickly and confirmed the cable will work.


----------



## schoolofmonkey

There is one upside to the 12VHPWR PSUs: there's no difference in pinouts like we currently have. It's one standard layout, so cables are interchangeable.


----------



## 8472

I'm just realizing how CPU intensive the terrain level of detail is in MSFS. My card is at ~50% utilization while flying over Tokyo and Paris at 4k with all settings maxed including terrain lod at 400.

Two threads on my 12700k are basically maxed while the others have low loads on them.

I'm curious about how much of an uplift a 13700k/13900k would give me.
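For anyone wondering why a hard CPU bottleneck can show up as such low overall CPU usage, here's the thread-count math as a rough sketch (my own illustration, not anything MSFS-specific):

```python
# Rough illustration: why "overall CPU %" hides a single-thread bottleneck.
# If the render main thread pegs one or two logical processors and the rest
# mostly idle, the averaged figure looks low even though the CPU is the limiter.

def overall_cpu_percent(per_thread_loads):
    """Average per-logical-processor utilization, the way Task Manager reports it."""
    return sum(per_thread_loads) / len(per_thread_loads)

# A 12700K exposes 20 logical processors (8 P-cores with HT + 4 E-cores).
# Say two threads are pegged and the rest tick along at ~10%:
loads = [100, 100] + [10] * 18
print(f"Overall: {overall_cpu_percent(loads):.0f}%")  # reads ~19%, yet bottlenecked
```

So "50% utilization" on the GPU with a nearly idle-looking CPU graph is exactly what a main-thread limit looks like.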


----------



## bmagnien

@RaMsiTo @Hulk1988 @Antsu great scores - I’ll have some work when I get home tonight.

About the artifact bug: I think it's 100% connected to the fact that NV allowed ECC to be turned off. The 3090 was launched as a Titan-class card, and I think NV underestimated the demand from gamers, so they tuned the card for production and computation, hence the need for ECC. With the 4090 they gave the option to disable ECC, but the benchmarks apparently weren't ready for it. You'd think 3DMark or Unigine would be able to inject some calculation authentication into their benchmarks that detects errors at the computational level and fails the run if it doesn't pass 100%
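If you want to check which ECC state your card is actually in before a run, one way is to parse the ECC section of `nvidia-smi -q` output. This is only a sketch: the sample text below is illustrative, and the exact field casing can vary by driver version, so treat the regex as an assumption.

```python
import re

def ecc_mode(smi_output: str):
    """Return the 'Current' ECC mode from `nvidia-smi -q` style text, if present."""
    m = re.search(r"ECC Mode\s*\n\s*Current\s*:\s*(\w+)", smi_output, re.IGNORECASE)
    return m.group(1) if m else None

# Illustrative sample; real text would come from something like
# subprocess.check_output(["nvidia-smi", "-q", "-d", "ECC"], text=True)
sample = """\
    Ecc Mode
        Current                           : Disabled
        Pending                           : Disabled
"""
print(ecc_mode(sample))  # -> Disabled
```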


----------



## warrior-kid

8472 said:


> I'm just realizing how CPU intensive the terrain level of detail is in MSFS. My card is at ~50% utilization while flying over Tokyo and Paris at 4k with all settings maxed including terrain lod at 400.
> 
> Two threads on my 12700k are basically maxed while the others have low loads on them.
> 
> I'm curious about how much of an uplift a 13700k/13900k would give me.


I am interested in that too. My first attempts at 8K in FS2020 have been very successful and I'm keen for the CPU to be used as well as possible too. I have a 3970X Threadripper; if only FS2020 used 32 cores, that would be awesome. I'm still not sure about multi-thread use across the broad games spectrum; it feels like single-thread processing still rules.


----------



## Hulk1988

bmagnien said:


> @RaMsiTo @Hulk1988 @Antsu great scores - I’ll have some work when I get home tonight.
> 
> about the artifact bug, I think it’s 100% connected to the fact that NV allowed ECC to be turned off. The 3090 was launched as a titan class card, and I think NV underestimated the demand by gamers, and so they tuned the card for production and computation, hence the need for ECC. With the 4090, they gave the option to disable ECC, but the benchmarks weren’t ready apparently. You’d think 3DMark or unigine would be able to inject some calculation authentication into their benchmarks that detects errors at the computational level and fails the run if it doesn’t pass 100%


Thanks for the explanation. How is that artifact bug possible? Can everyone do that? And is it still possible? @RaMsiTo Can you make a video with your phone from Windows starting, through changing the settings and running the benchmark, to seeing the score at the end? Or is it just cheating that no one wants to show?


----------



## Benni231990

Here, guys, for all of you that won't use the Nvidia adapter: a 3x8-pin to 16-pin extension cable rated for 600W









ATX 3.0 PCIe 5.0 600W Triple 8 Pin to 12VHPWR 16 Pin Adapter Cable


Buy ATX 3.0 PCIe 5.0 600W Triple 8 Pin to 12VHPWR 16 Pin Adapter Cable for $24.99 with Free Shipping Worldwide (In Stock)




www.moddiy.com


----------



## GRABibus

Has someone paired their 4090 with a 5800X3D?

Are there any bottlenecks in 4K, as with the 5900X and 5950X?


----------



## RaMsiTo

Hulk1988 said:


> Thanks for the explanation. How is that artifact bug possible? Can everyone do that? And is that still possible? @RaMsiTo Can you create a video with the mobile phone from Windows starting to change settings, running the Benchmark and seeing the score at the end? Or is it just cheating and no one want to show it?


In that Superposition pass I have no artifact errors; in 3DMark Port Royal and Speedway yes, and I have already removed those scores.


----------



## pajonk

GRABibus said:


> Has someone paired his 4090 with 5800X3D ?
> 
> Is there any bottlenecks in 4K as with 5900X and 5950X ?


In Spider-Man there is the same bottleneck. And the 5900X is not always slower than a 5800X3D.
Look at YouTube; the 5800X3D is only faster in a few games. Far Cry 6, for example.


----------



## alasdairvfr

warrior-kid said:


> I am interested in that too. My first attempts at 8K in FS2020 have been very successful and I'm keen for the CPU to be used as well as possible too. I have a 3970X Threadripper--if only FS2020 used 32 cores, would be awesome. I'm still not sure about multi-thread use across the broad games spectrum, it feels single thread processing still rules.


I have not spent much time in FS2020, but I find its single-core bottleneck annoying. The issue is the engine's "main thread", which for whatever reason cannot be multithreaded, and very little CPU processing appears to be done outside of it. I know some people adamantly claim (somehow) that the sim is well optimized (or at least get heated when people say it's badly optimized), but from 2020 onwards single-core shouldn't be such a bottleneck. Lots of games will have a CPU bottleneck without maxing out all cores, but I'm getting barely 20-30% utilization on both CPU and GPU and am somehow still bottlenecked. Usually at a CPU bottleneck the game can slurp up closer to 50% utilization overall. I've dialed in some settings but it didn't help much. 40-50 fps at 4K.




Hulk1988 said:


> Thanks for the explanation. How is that artifact bug possible? Can everyone do that? And is that still possible? @RaMsiTo Can you create a video with the mobile phone from Windows starting to change settings, running the Benchmark and seeing the score at the end? Or is it just cheating and no one want to show it?


I'm finding the artifact bug in Unigine isn't doing much for me, though. I managed to squeeze out a couple hundred points when it's artifacting, but if I push it any harder I get a full system crash. Maybe I need to dial down core and push memory higher? My memory was at like 24200+ and core was in the 3100 range. With or without a glitched score, that's crazy


----------



## yzonker

bmagnien said:


> @RaMsiTo @Hulk1988 @Antsu great scores - I’ll have some work when I get home tonight.
> 
> about the artifact bug, I think it’s 100% connected to the fact that NV allowed ECC to be turned off. The 3090 was launched as a titan class card, and I think NV underestimated the demand by gamers, and so they tuned the card for production and computation, hence the need for ECC. With the 4090, they gave the option to disable ECC, but the benchmarks weren’t ready apparently. You’d think 3DMark or unigine would be able to inject some calculation authentication into their benchmarks that detects errors at the computational level and fails the run if it doesn’t pass 100%


I've actually seen that same effect on my 3090, but I don't think the score was significantly higher like we're seeing with the 4090. The 3090 KP 1kW BIOS does not have the same error correction as the standard factory BIOS, so there is no performance regression with the KP BIOS: performance increases until instability is reached, and you can cause the same type of artifacting. But like I said, I never saw a big increase in benchmark scores.
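To picture how error replay could cause that regression, here's a toy model (my own simplification, not NVIDIA's actual mechanism): past the stable memory clock, corrupted transfers get re-sent, so the raw clock keeps rising while effective bandwidth falls.

```python
# Toy model of memory error-replay overhead. retry_cost and the linear
# penalty are made-up illustration values, not measured behavior.

def effective_bandwidth(clock_mhz, stable_limit_mhz, retry_cost=0.001):
    """Raw clock scaled down by a replay-overhead fraction beyond the stable limit."""
    overhead = max(0, clock_mhz - stable_limit_mhz) * retry_cost
    return clock_mhz * max(0.0, 1.0 - overhead)

# With replay (stock-BIOS-like): throughput peaks near the stable limit, then drops.
for clk in (11000, 11500, 11800, 12000, 12500):
    print(clk, round(effective_bandwidth(clk, stable_limit_mhz=11800)))
```

Without the replay path (the KP-BIOS behavior described above), there's no such rolloff: the clock keeps scaling until you hit artifacts or a crash instead.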


----------



## yzonker

Hulk1988 said:


> Thanks for the explanation. How is that artifact bug possible? Can everyone do that? And is that still possible? @RaMsiTo Can you create a video with the mobile phone from Windows starting to change settings, running the Benchmark and seeing the score at the end? Or is it just cheating and no one want to show it?


You can see it at the end of the GN live stream when they were LN2 benching the 4090. The entire stream is on their channel.


----------



## bmagnien

yzonker said:


> I've actually seen that same effect on my 3090, but I don't think the score was significantly higher like we're seeing with the 4090. The 3090 KP 1kw bios does not have the same error correction as the standard factory bios. There is no performance regression with the KP bios. Performance increases until instability is reached. Thus you can cause the same type of artifacting. But like I said, I never saw a big increase in benchmark scores though.


I think ECC control is driver level, not bios. I could be wrong though.


----------



## yzonker

bmagnien said:


> I think ECC control is driver level, not bios. I could be wrong though.


All I know for certain is I never saw any regression using that bios, or the Galax 1kw bios in my 3080ti. But I definitely did on the stock bios of both cards. And I did A LOT of benchmarking on those cards.


----------



## Hulk1988

Thanks for the answers. I still do not know how these big score differences are possible with almost the same GPU and memory clocks and temps 

Example:









UNIGINE Benchmarks


Performance benchmarks by Unigine




benchmark.unigine.com





I do not see how 8% differences are possible. 
Or 4% here Result

These cards are not under water, either. 

Does someone know how big the difference is between the 520W BIOS on my Suprim X and a 600W BIOS? Is it possible to flash a Strix BIOS on the Suprim card?


----------



## Muut

Miguelios said:


> Nice! But check your "Effective Clock" via Hwinfo to see your actual clocks..


Here


----------



## Muut

Hulk1988 said:


> Thanks for the answers. I still do not know how these big score differences are possible with almost same GPU and MEM and Temps
> 
> Example:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> UNIGINE Benchmarks
> 
> 
> Performance benchmarks by Unigine
> 
> 
> 
> 
> benchmark.unigine.com
> 
> 
> 
> 
> 
> I do not see how 8% differences are possible.
> Or 4% here Result
> 
> These cards are not under Water, too.
> 
> Does someone know how big the differences are between a 520W Bios on my Suprim X and a 600W Bios? Is it possible to flash a Strix BIOS on the Suprim Card?


This is what I did. I flashed the Strix OC BIOS on my Suprim X. It works wonders in those moments where I hit the power limit (although 90% of the time it's voltage that's limiting). 

I have a friend who flashed a 600W Suprim X BIOS on his Suprim X (530W), and he got a lot of weird behavior. Whereas the Strix OC BIOS from TechPowerUp could be used as a daily BIOS if 530W weren't enough (which is plenty lol). 

For bench purposes, it definitely helps


----------



## Azazil1190

GRABibus said:


> Has someone paired his 4090 with 5800X3D ?
> 
> Is there any bottlenecks in 4K as with 5900X and 5950X ?


Much better with the 5800X3D; someone here has this combo. In TS and PR you're gonna be 500-600 points higher.


----------



## Muut

Antsu said:


> PSA: "The VRAM bug" definitely is a thing in Superposition also, so be careful when comparing results. I am quite confident that this is how the #1 got over 41k with similar clocks.
> View attachment 2578456


It is very likely, especially when we know he got caught doing it in Port Royal and waited for UL to remove his score from the HOF before finally saying his run "might have been bugged". 

But there is also a lot of tweaking behind it, on LOD and OS optimization. 

This memory artifact thing isn't something I'm willing to risk just for the sake of some virtual points. If it degrades the memory of my €2k GPU over time, it's not a smart gamble IMO.


----------



## GRABibus

pajonk said:


> In Spiderman there is the same bottleneck. And the 5900x is not alway slower than a 5800X3d.
> Look at youtube.. The 5800X3d is just in a few games faster. Farcry6 for exampel.


I am talking here about GPU usage, which can be low in 4K even with a 5950X


Azazil1190 said:


> Much better with 5800x3d. someone here have this combo.At ts and pr you gonna be 500-600points higher.


Thank you.

What would also be interesting to know is GPU usage in most games in 4K with a 5800X3D + 4090.


----------



## alasdairvfr

GRABibus said:


> I am talking here about GPU usage, which can be low in 4K even with a 5950X
> 
> ...
> 
> What would also be interesting to know is GPU usage in most games in 4K with a 5800X3D + 4090.


I'm seeing the 5950X bottlenecking at 4K with the 4090. This is a new phenomenon to me, since my 3080 Ti used to be the bottleneck. It's gotta be one or the other, right? Unlikely to be a perfect pairing.

The 5800X3D does typically perform higher in games, so I suppose it's not surprising it does better with the bigger GPU. One thing I haven't looked at is ReBAR. Has anyone done a comparison with ReBAR on/off for 'CPU-bottlenecked' games on the 4090? I've had ReBAR on at the BIOS level for a while now


----------



## O-VIZ

Hulk1988 said:


> Thanks for the answers. I still do not know how these big score differences are possible with almost same GPU and MEM and Temps
> 
> Example:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> UNIGINE Benchmarks
> 
> 
> Performance benchmarks by Unigine
> 
> 
> 
> 
> benchmark.unigine.com
> 
> 
> 
> 
> 
> I do not see how 8% differences are possible.
> Or 4% here Result
> 
> These cards are not under Water, too.
> 
> Does someone know how big the differences are between a 520W Bios on my Suprim X and a 600W Bios? Is it possible to flash a Strix BIOS on the Suprim Card?












A "well-chosen" LOD value has a great influence on the result...


----------



## MNExSilencer

Got mine a week ago; it runs 2940MHz all the time like it's locked, memory pushed to 1200. Max power I have seen is 560W, and temps are beautiful. I took the Gigabyte Gaming OC since I don't see the point of the +500€ premium price of the Strix. I'm happy where it is; I don't need any BIOS or mod. Looks like OCing for performance is dead; it's only for synthetic numbers. 

It's a great overpriced piece of PC hardware, and I'm enjoying it.


----------



## Orcworm

WaXmAn said:


> Anyone else having issues with 4090 FE not showing BIOS screen on boot? Told EVGA with the z690 DARK it's an issue. Wondering if anyone else is seeing this as well. I have tried 3 EVGA BIOS's with no resolve. Reseting CMOS doesn't fix it either.


I had the same issue with a Gigabyte z390 board and an FE, but returned it for coil whine so not sure on resolving it I'm afraid - I now have an Asus TUF which seems to show the BIOS fine though.


----------



## Muut

WaXmAn said:


> Anyone else having issues with 4090 FE not showing BIOS screen on boot? Told EVGA with the z690 DARK it's an issue. Wondering if anyone else is seeing this as well. I have tried 3 EVGA BIOS's with no resolve. Reseting CMOS doesn't fix it either.


There are several other reports on the EVGA forum regarding that specific issue; it always involves a 4090 FE. I don't have the issue with my MSI Suprim X, but I'm still on BIOS 1.15.


----------



## GRABibus

alasdairvfr said:


> I'm seeing 5950x bottlenecking at 4k with 4090. This is a new phenomenon to me since my 3080ti used to be the bottleneck. It's gotta be one or the other, right? Unlikely to be a perfect pairing.
> 
> 5800X3D does typically perform higher in games so I suppose its not surprising to perform better with the bigger GPU. One thing I haven't looked at is REBAR. Has anyone done a comparison REBAR on-off for 'cpu-bottlenecked' games on 4090? I've had REBAR on at the BIOS level for a while now


As I own a 5950X, I will then not get the RTX 4090 I planned to order.

Either I keep my 3090 + 5950X combo, or I go for a complete new build based on the 13900K or the next Ryzen 7000 X3D.


----------



## pajonk

GRABibus said:


> As I own a 5950X, I will then not get the RTX 4090 I planned to order.
> 
> Either I keep my combo 3090 + 5950X, either I go for a complete new build based on 13900K or next Ryzen 7000 x3D.


Lol. That makes no sense. I've got a 5900X, and all games now have 50-100% more FPS. So when a game like Spider-Man is bottlenecked by my CPU at 80 FPS, that's still better than 50 FPS with a bottlenecked 3090 
There is always one bottleneck in a PC; otherwise you'd never need to upgrade.


----------



## Hulk1988

O-VIZ said:


> View attachment 2578541
> 
> 
> A "well-chosen" LOD value has a great influence on the result...


Where I can change that? Everything is greyed out.


----------



## GRABibus

pajonk said:


> Lol. That makes no sense. Got an 5900x. And alle games now has 50-100% more FPS. So when a game like spiderman is bottlenecking my CPU at 80FPS it's better than at 50FPS with a bottlenecked 3090
> There is alway 1 Bottleneck in a PC: Otherwise you'll never need to upgrade.


Of course there is always a bottleneck
But getting 70% GPU usage in my games at 4K…that makes no sense to me.


----------



## 8472

Pricey. The article says that these are available now but I don't see them on Phanteks' store.


__ https://twitter.com/i/web/status/1585285216209453056


----------



## pajonk

GRABibus said:


> Of course there is always a bottleneck
> But Getting 70% GPU usage in my games at 4K…that makes no sense for me.


It's a badly programmed game CPU-wise. When you use your 3090 it is a GPU-bottlenecked game, but with fewer frames. So what's better? For me, more FPS is better. And there are a lot of games now running 4K maxed out at 120Hz with no CPU bottleneck.


----------



## AdamK47

RaMsiTo said:


> In that pass superposition I have no artifact errors, in 3dmark portroyal, speedway yes and I have already removed those scores.
> 
> View attachment 2578505


You guys and your cheating. As if benchmarking with known unstable clocks wasn't enough.


----------



## yzonker

Hulk1988 said:


> Where I can change that? Everything is greyed out.
> 
> View attachment 2578545


Use +3









Negative Lod Bias


Have you ever wanted sharper textures over all in your games? Well with Lod bias that is possible. What I’m going to focus on is texture Lod bias from the Nvidia viewpoint, this can be done o…




melantechgaming.wordpress.com


----------



## KedarWolf

AdamK47 said:


> You guys and your cheating. As if benchmarking with known unstable clocks wasn't enough.


My benches are on my stable 24/7 settings that can run Cyberpunk, EZBench, Port Royal and Time Spy with no artifacts or crashes ever.

3135 core, 12107 memory at 1.1v, 600W Asus BIOS.

Edit: The Strix cooler is quite good, fans at default settings never get over 64C core and 64C hotspot.


----------



## GRABibus

pajonk said:


> It's a bad programmed game for CPU. When you use your 3090 it is a gpu bottlenecked game. But with less frames. So what's better? For me more FPS is better. And there are a lot of games now running in 4k maxed out 120hz with no cpu bottleneck.


ok


----------



## AdamK47

KedarWolf said:


> My benches are on my stable 24/7 settings that can run Cyberpunk, EZBench, Port Royal and Time Spy with no artifacts or crashes ever.
> 
> 3135 core, 12107 memory at 1.1v, 600W Asus BIOS.
> 
> Edit: The Strix cooler is quite good, fans at default settings never get over 64C core and 64C hotspot.


That's what I do too. Only start benchmarking after I have established stable clocks.


----------



## ArcticZero

8472 said:


> Pricey. The article says that these are available now but I don't see them on Phanteks' store.
> 
> 
> __ https://twitter.com/i/web/status/1585285216209453056


Wow these look quite good. If performance is as good as the block looks (and it probably will be) I might get these over Optimus, whichever releases first.


----------



## 8472

They say that they don't feel comfortable selling systems in a Lian Li case with the 4090 mounted horizontally. Those CableMod 90 and 180 degree adapters are probably going to sell out fast. 


__ https://twitter.com/i/web/status/1585299830594641920


----------



## Azazil1190

The best I can get on the 5950X.
I don't have a top sample, but it's OK I think.
















I've seen some folks here with the same clocks on a 5800X3D/12900K/13900K getting better GPU results


----------



## alasdairvfr

Azazil1190 said:


> The best i can get on 5950x
> I dont have a top sample but its ok i think.
> View attachment 2578564
> 
> View attachment 2578563
> 
> I show some folk's here with same clocks and a 5800x3d /12900k/13900k has better gpu results


You are very slightly ahead of me 

Do you have Speedway?


----------



## bmagnien

Phantoms Asus blocks available for purchase:








Phanteks Glacier G40 ASUS GPU Block for ASUS ROG STRIX / TUF gaming RTX 4090


The Phanteks’ Glacier G40 ASUS GPU Block provides a high-performance water-cooling solution custom-designed for the latest Asus ROG Strix / TUF gaming RTX 4090/4080 cards. The Glacier G40 ASUS GPU Block brings the ultimate cooling performance with some unique water block features to build...



www.phanteks.store


----------



## Azazil1190

alasdairvfr said:


> You are very slightly ahead of me
> 
> Do you have Speedway?


Nope, unfortunately!
At Superposition 4K I'm at 36200, something like that. If I push harder, maybe I can just barely reach 37K. The 5950X can't push the 4090 right


----------



## pajonk

Azazil1190 said:


> Nop unfortunately!
> At superposition 4k im at 36200 something like this.If I push harder maybe i can reach difficult the 37k .The 5950 cant push the 4090 right


Then try this setting. This will make the gpu and not the cpu bottleneck.


----------



## Azazil1190

pajonk said:


> Then try this setting. This will make the gpu and not the cpu bottleneck.
> View attachment 2578567


Thanks i will try!


----------



## pajonk

Azazil1190 said:


> Thanks i will try!


So here is my result:
UV 0.965V, 2775MHz, 320W


----------



## alasdairvfr

Azazil1190 said:


> Nop unfortunately!
> At superposition 4k im at 36200 something like this.If I push harder maybe i can reach difficult the 37k .The 5950 cant push the 4090 right


I can't get above 34,7xx unfortunately; I'm revisiting my PBO/CPU config in the BIOS


----------



## J7SC

O-VIZ said:


> View attachment 2578541
> 
> 
> A "well-chosen" LOD value has a great influence on the result...


Kudos for underscoring this. Unigine doesn't do as many control checks on results as 3DM does (never mind the issues with artifacting) and the LOD /TESS issue has been known for years. Last time it came up here was when 2080 Ti released, then again w/ 3090. That is why I don't bother to post results at the Superposition site, but I do enjoy running Superposition and comparing it to my own results from this and my other GPUs.


----------



## bmagnien

rock-solid daily stable


----------



## KedarWolf

J7SC said:


> Kudos for underscoring this. Unigine doesn't do as many control checks on results as 3DM does (never mind the issues with artifacting) and the LOD /TESS issue has been known for years. Last time it came up here was when 2080 Ti released, then again w/ 3090. That is why I don't bother to post results at the Superposition site, but I do enjoy running Superposition and comparing it to my own results from this and my other GPUs.


I've known about the LOD BIOS tweak for years, but I wouldn't even submit a benchmark while using it.

Sucks that peeps do though, it's really a cheat and not a valid benchmark.


----------



## Mad Pistol

GRABibus said:


> Has someone paired his 4090 with 5800X3D ?
> 
> Is there any bottlenecks in 4K as with 5900X and 5950X ?


It depends. If you're strictly comparing benchmarks like 3DMark which have a multi-threaded CPU component, CPUs like the 5950x will be better due to additional cores/threads.

However, in real-world gaming, I have yet to run into a time where the RTX 4090 is being bottlenecked by a 5800x3D at 4K. In fact, in very specific circumstances, the 5800x3D is the fastest gaming CPU on the market. Shadow of the Tomb Raider appears to be one of them. I have yet to find a CPU that can produce a higher framerate at 720p lowest settings.


----------



## yzonker

KedarWolf said:


> I've known about the LOD BIOS tweak for years, but I wouldn't even submit a benchmark while using it.
> 
> Sucks that peeps do though, it's really a cheat and not a valid benchmark.


And now you can add artifacting to that too.  That has pretty much killed my interest in benchmarking at this point, at least in regards to competing in any way with it.


----------



## bmagnien

yzonker said:


> And now you can add artifacting to that too.  That has pretty much killed my interest in benchmarking at this point, at least in regards to competing in any way with it.


Everyone’s still on the same playing field. Everyone has access to the same drivers, the same OS, the same hardware, and the same benchmarks. The hardest part would be access to the same cooling solutions but even that is not unattainable. I don’t really see any issue. It’s all synthetic dick measuring contests at the end of the day, so just have fun with it. Or don’t and just enjoy this beast of a card at stable settings in your favorite games ☺


----------



## yzonker

Mad Pistol said:


> It depends. If you're strictly comparing benchmarks like 3DMark which have a multi-threaded CPU component, CPUs like the 5950x will be better due to additional cores/threads.
> 
> However, in real-world gaming, I have yet to run into a time where the RTX 4090 is being bottlenecked by a 5800x3D at 4K. In fact, in very specific circumstances, the 5800x3D is the fastest gaming CPU on the market. Shadow of the Tomb Raider appears to be one of them. I have yet to find a CPU that can produce a higher framerate at 720p lowest settings.
> 
> View attachment 2578586


Don't forget FS2020. It KILLS in that.


----------



## yzonker

bmagnien said:


> Everyone’s still on the same playing field. Everyone has access to the same drivers, the same OS, the same hardware, and the same benchmarks. The hardest part would be access to the same cooling solutions but even that is not unattainable. I don’t really see any issue. It’s all synthetic dick measuring contests at the end of the day, so just have fun with it. Or don’t and just enjoy this beast of a card at stable settings in your favorite games ☺


Well, it made something that was already time-consuming and tedious a lot worse. Now you need what is almost a lucky run to get a good score, rather than just the skill and knowledge to maximize performance. I got about 500 points in PR from it, but others have gotten 1000. So it's not even as simple as getting an artifacted run to complete. It's just going past what I'm willing to do.


----------



## bmagnien

yzonker said:


> Well it made something that was already time consuming and tedious a lot worse. Now you need what is almost a lucky run to get a good score,rather than just the skills and knowledge of how to maximize performance. I got about 500pts in PR from it, but others have gotten 1000pts. So it's not even as simple as getting an artifacted run to complete. Just going past what I'm willing to do.


Totally, if it’s not enjoyable anymore no sense toiling away at it. I’ve had fun the last couple nights going back and forth with some of the other guys on this thread, so at least I’ve got some enjoyment out of it.


----------



## Azazil1190

alasdairvfr said:


> I can't get above 34,7xx unfortunately, I'm revisiting PBO/CPU config in BIOS


Do me a favour and try something.
If you have Resizable BAR enabled via BIOS, turn it off and make a run of Superposition 4K.
This way I get better scores (only in Superposition)


----------



## Xavier233

For those who are having coil whine on your GPU: try another PSU. Yes, try another PSU, because while it is your GPU that is making noise, it very well could be your PSU that is causing it. Why? I don't know, but I know it makes a difference. Try it before you go out and swap 2, 4 or more GPUs


----------



## bmagnien

EK block for MSI Trio and Suprim available for preorder:








EK-Quantum Vector² Trio RTX 4090 D-RGB - Nickel + Plexi


EK-Quantum Vector² water block for the MSI Trio and Suprim RTX 4090




www.ekwb.com


----------



## biigshow666

There's a new Gigabyte BIOS up on their website. Has anyone made the upgrade? The notes are quite vague...


----------



## KedarWolf

yzonker said:


> And now you can add artifacting to that too.  That has pretty much killed my interest in benchmarking at this point, at least in regards to competing in any way with it.


3DMark detects if you use LOD BIAS cheats though, but unstable overclocks, I don't think so.


----------



## Luggage

Mad Pistol said:


> It depends. If you're strictly comparing benchmarks like 3DMark which have a multi-threaded CPU component, CPUs like the 5950x will be better due to additional cores/threads.
> 
> However, in real-world gaming, I have yet to run into a time where the RTX 4090 is being bottlenecked by a 5800x3D at 4K. In fact, in very specific circumstances, the 5800x3D is the fastest gaming CPU on the market. Shadow of the Tomb Raider appears to be one of them. I have yet to find a CPU that can produce a higher framerate at 720p lowest settings.
> 
> View attachment 2578586


Check sottr thread in benchmark section


----------



## GAN77

Mad Pistol said:


> However, in real-world gaming, I have yet to run into a time where the RTX 4090 is being bottlenecked by a 5800x3D at 4K


You need windows 10 for 5800x3d.


----------



## yzonker

Look what just showed up.


----------



## mirkendargen

yzonker said:


> Look what just showed up.
> 
> View attachment 2578631


Jelly....and the ports are hilariously tall.


----------



## GAN77

mirkendargen said:


> Jelly....and the ports are hilariously tall.


Phanteks Glacier G40 ASUS GPU Block for ASUS ROG STRIX / TUF 4090/4080


----------



## mirkendargen

GAN77 said:


> Phanteks Glacier G40 ASUS GPU Block for ASUS ROG STRIX / TUF 4090/4080


Yeah I know it has them on the side. I have a Gigabyte card anyway so my choices so far are EK in a month or so and ???. I'm hoping Bykski drops something before then. If I had an Asus card I'd have ordered the Bykski one immediately


----------



## Sheyster

biigshow666 said:


> There's a new gigabyte bios up on their website. Has anyone made the upgrade. The notes are quite vague...


Version F2 - "Increase compatibility."

Sounds similar to the recent ASUS 4090 BIOS update. If you're not having issues, there's probably no reason to upgrade.


----------



## bmagnien

yzonker said:


> Look what just showed up.
> 
> View attachment 2578631


Nice! Better get some ‘before’ baselines to compare back to because you know we’re all going to be annoying the hell out of you asking for your deltas at various power limits, clock speeds, and loads. Fire up excel and start cranking!


----------



## J7SC

yzonker said:


> Don't forget FS2020. It KILLS in that.
> 
> View attachment 2578589


...Yeah - FS2020 is super-important to me. While I play at 4K, not 1080p, FS2020 is so horribly 'not' optimized that it may affect 4K as well. I am going to wait for the FS2020 SU11 general release (scheduled for November 11), as Nvidia suggests there will be huge gains from off-loading work from the CPU. If that doesn't turn out to be correct, I'll keep the 5950X but take the 3950X out of the neighboring board in that build and plug in a 5800X3D...


----------



## arvinz

(6) TUF 4090's currently available on Best Buy Canada site. Just ordered one. Strix is coming tomorrow but might just keep the TUF instead. I'm not seeing a major difference performance wise between the two. Thoughts?


----------



## AvengedRobix

yzonker said:


> Look what just showed up.
> 
> View attachment 2578631


For what card?


----------



## Sheyster

arvinz said:


> (6) TUF 4090's currently available on Best Buy Canada site. Just ordered one. Strix is coming tomorrow but might just keep the TUF instead. I'm not seeing a major difference performance wise between the two. Thoughts?


It depends on the card you get (lottery) and if you're planning to bench or not. If this Gigabyte OC I just got is a dud, I'll probably get a Strix next when I can score one.


----------



## Arizor

arvinz said:


> (6) TUF 4090's currently available on Best Buy Canada site. Just ordered one. Strix is coming tomorrow but might just keep the TUF instead. I'm not seeing a major difference performance wise between the two. Thoughts?


If the money matters to you, the TUF is an excellent choice; there's barely any difference between it and the Strix, mostly down to the silicon lottery. The Strix has better cooling, but again, the TUF's cooling is absolutely fine.


----------



## arvinz

Sheyster said:


> It depends on the card you get (lottery) and if you're planning to bench or not. If this Gigabyte OC I just got is a dud, I'll probably get a Strix next when I can score one.


My Strix is arriving tomorrow... if you want it, let me know. I'd rather sell to folks on this forum than return it. You can have it for whatever I paid.


----------



## yzonker

AvengedRobix said:


> For what card?


TUF


----------



## Mad Pistol

CableMod to the rescue... why are they having to fix Nvidia's massive engineering snafu?






CableMod 12VHPWR Angled Adapter – CableMod Global Store







store.cablemod.com


----------



## 8472

arvinz said:


> (6) TUF 4090's currently available on Best Buy Canada site. Just ordered one. Strix is coming tomorrow but might just keep the TUF instead. I'm not seeing a major difference performance wise between the two. Thoughts?


FWIW the strix is about 11mm narrower than the TUF so it should be easier to get the side panel closed. 

RGB does matter to some folks. The Strix has more than the TUF. Plus it has fan headers built in if you'd find that useful. 

The red and blue on the Strix might not match the color scheme of your case if you mount it vertically. The TUF is neutral. 

Outside of those things, it's probably just up to the silicon lottery.


----------



## arvinz

8472 said:


> FWIW the strix is about 11mm narrower than the TUF so it should be easier to get the side panel closed.
> 
> RGB does matter to some folks. The Strix has more than the TUF. Plus it has fan headers built in if you'd find that useful.
> 
> The red and blue on the Strix might not match the color scheme of your case if you mount it vertically. The TUF is neutral.
> 
> Outside of those things, it's probably just up to the silicon lottery.


I will be putting it under water with the Strix/TUF Optimus block once that's released. So I suppose in that case there's even less difference between the two for me. I can't imagine them performing too different once it's under water.


----------



## mirkendargen

Mad Pistol said:


> CableMod to the rescue... why are they having to fix Nvidia's massive engineering snafu?
> 
> 
> 
> 
> 
> 
> CableMod 12VHPWR Angled Adapter – CableMod Global Store
> 
> 
> 
> 
> 
> 
> 
> store.cablemod.com


According to a friend of mine that looked into the initial release of them....because they want to charge you $50 for a right angle adapter, with $30 "rush processing" and $20 shipping for a total of $100 for a $1 (if that) part. Yeah, not kidding, they're gouging that hard.


----------



## LuckyImperial

I only have a little 12600K pushing my 4090, and Superposition 4K Optimized gives me just about 31,500.

There's definitely some weird CPU utilization issue going on with Superposition though. When it gets to the scene where everything starts flying around, my GPU utilization drops to 80% and the CPU load piles up on C0 T0, but C0 T0 never boosts beyond 3.6GHz. It's like a prioritization/scheduling issue within Windows. At some point I want to try changing Superposition's priority within Task Manager.

But... tbh, CP2077 with a mix of High/Ultra (game presets for this hardware) runs incredibly smooth. Certainly far beyond 60fps. The DLSS preset was Auto, and I might try Quality... but like... I think I'm done benching haha. The gaming performance is incredible.


----------



## J7SC

LuckyImperial said:


> I only have a little 12600K pushing my 4090, and Superposition 4K Optimized gives me just about 31,500.
> 
> There's definitely some weird CPU utilization issue going on with Superposition though. When it gets to the scene where everything starts flying around, my GPU utilization drops to 80% and the CPU load piles up on C0 T0, but C0 T0 never boosts beyond 3.6GHz. It's like a prioritization/scheduling issue within Windows. At some point I want to try changing Superposition's priority within Task Manager.
> 
> But... tbh, CP2077 with a mix of High/Ultra (game presets for this hardware) runs incredibly smooth. Certainly far beyond 60fps. The DLSS preset was Auto, and I might try Quality... but like... I think I'm done benching haha. The gaming performance is incredible.


In general, the Unigine engine reacts to both CPU IPC and system RAM. Beyond that, from what I can tell at the time-line-FPS graphs they display on the Superposition leaderboard, scene 10 is a common low point for many, while 13/14 is the low point for other system-GPU combos. I haven't checked lately though, ie. re 4090s...


----------



## cheddardonkey

A 600w bios for the Aorus Extreme Waterforce 4090 please! This card runs waaaay too cool! Need more powah!


----------



## biigshow666

Sheyster said:


> Version F2 - "Increase compatibility."
> 
> Sounds similar to the recent ASUS 4090 BIOS update. If you've not having issues probably no reason to upgrade.


Not really any issues so far, but I tried updating anyway. It gave an error saying the BIOS mode does not match... It's on the OC BIOS, and this is supposed to be the OC update.


----------



## morph.

new vbios for Asus cards avail looks like all the changelog says is "improved compatibility".


----------



## Panchovix

Does the power stage count directly affect the max core clock and such? Looking at the power stages of some GPUs on the first page (like the PNY or the Gigabyte Windforce), damn, they really have very few power stages (14x50A, 700A), way fewer than the FE (20x70A, 1400A); basically making paying for the AIB instead of NVIDIA pointless in those cases.

Would that also mean that an MSI Suprim or an ASUS Strix (1820A/1680A) overclocks better by default? Or does it all just depend on the core itself?
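For a rough sense of why stage count matters, here's a back-of-envelope sketch. The assumptions are mine, not from any datasheet: ~1.05 V core voltage and ~450 W of a 600 W board limit going through the core rail; real splits vary per card.

```python
# Rough VRM headroom estimate. Assumptions (mine, not from any spec):
# core voltage ~1.05 V, ~450 W of a 600 W board limit on the core rail.
def vrm_utilization(stages, amps_per_stage, core_watts=450.0, vcore=1.05):
    """Return (core amps drawn, fraction of the VRM's rated current used)."""
    rated = stages * amps_per_stage   # total rated current, e.g. 14 x 50 A
    core_amps = core_watts / vcore    # I = P / V
    return core_amps, core_amps / rated

for name, stages, amps in [("14x50A card", 14, 50),
                           ("FE (20x70A)", 20, 70),
                           ("Strix (24x70A)", 24, 70)]:
    core_amps, used = vrm_utilization(stages, amps)
    print(f"{name}: {core_amps:.0f} A drawn, {used:.0%} of rated capacity")
```

So even a 700 A design has headroom on paper; the bigger VRMs mostly mean each stage runs cooler with less ripple, while the max clock itself is still largely silicon lottery.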


----------



## Azazil1190

After 3 hours of CP2077 at 4K maxed out, DLSS Quality and RT on, the card is super cool.
Daily stable OC, 100% PL and max voltage. These GPUs run cooler than my previous 3090 Ti TUF.
Great job, Nvidia, on this...


----------



## GAN77

biigshow666 said:


> Not really any issues so far but I tried updating anyways. It gave an error saying bios mode does not match.. Its on the OC bios and this is supposed to be the oc update.


They seem to have confused the names for the OC and Silent BIOS in the readme.
If you flash the Silent BIOS file with OC mode set on the video card, everything works fine.


----------



## KedarWolf

morph. said:


> new vbios for Asus cards avail looks like all the changelog says is "improved compatibility".


I flashed it, but nvidia-smi now shows this.

WARNING: infoROM is corrupted at gpu 0000:0A:00.0


----------



## morph.

Have you tried reflashing it again? Else try to revert back to the previous vbios Asus RTX 4090 VBIOS


----------



## biigshow666

GAN77 said:


> They seem to have confused the names for OC and Silent bios in the readme.
> if you try to flash the silent BIOS with the OC mode set on the video card, then everything is fine


Thanks mate, that worked and am benching now.


----------



## Laithan

Jay is trying to make it fail..


----------



## KedarWolf

morph. said:


> Have you tried reflashing it again? Else try to revert back to the previous vbios Asus RTX 4090 VBIOS


It only shows that on the OC BIOS. Other BIOS I flashed also is fine.


----------



## mirkendargen

Laithan said:


> Jay is trying to make it fail..


And...he couldn't... Seems like it might be less about crazy bends and more about manufacturing defects in the adapters.


----------



## Rei86

mirkendargen said:


> And...he couldn't... Seems like it might be less about crazy bends and more about manufacturing defects in the adapters.


Or hours of gaming in an enclosed case with the cable smashed against the side panel?
Too many unasked and unanswered questions about these melted connectors; someone should start up a database to figure it out, and someone brave will have to be willing to sacrifice their connector.
However, I do agree that this connector was half-assed. I think PCI-SIG itself said to limit it to 400W (it was in one of the videos covering the emails between them and Nvidia about concerns over melting connectors).


----------



## Mad Pistol

Laithan said:


> Jay is trying to make it fail..


The fact that Jay can't make it fail is a good sign. It means that so far, it's isolated incidents more than likely due to manufacturing defects with the cable and/or card. I have a hard time believing that Nvidia would let a defective design into the wild. We will continue watching.


----------



## J7SC

Mad Pistol said:


> The fact that Jay can't make it fail is a good sign. It means that so far, it's isolated incidents more than likely due to manufacturing defects with the cable and/or card. I have a hard time believing that Nvidia would let a defective design into the wild. We will continue watching.


I wonder also if a 'tired' / cheaper PSU and the cables leading from the PSU to the dongle could be an issue, not just the (agreeably hideous) 4-into-1 dongle and/or the connector on the card. FYI, my 4090 is vertical, which helps with both bend clearance and expelling heat.


----------



## slayer6288

motivman said:


> flashed 600W suprim X bios to my dud trio, and max power draw 577W. My card is so garbage.. maxes out at +1100 memory and will not run anything over 3000 MHz for core... I need me a good clocking 4090, SMH


How does one obtain a 600W MSI Suprim X BIOS? Can you link it for me? Also, to pull over 520W on your Trio, are you using the triple adapter or something with 4 PCIe 8-pins?


----------



## LuckyImperial

J7SC said:


> In general, the Unigine engine reacts to both CPU IPC and system RAM. Beyond that, from what I can tell at the time-line-FPS graphs they display on the Superposition leaderboard, scene 10 is a common low point for many, while 13/14 is the low point for other system-GPU combos. I haven't checked lately though, ie. re 4090s...


I guess that's what I think is odd. My 12600K's IPC shouldn't be that far off from a 12900K, and I have DDR5 @ 5600MHz, but only in Gear 2. Maybe the 2800MHz memory controller is my problem. I don't know if I care enough to go find stable Gear 1 timings.


----------



## heptilion

KedarWolf said:


> It only shows that on the OC BIOS. Other BIOS I flashed also is fine.


When I updated mine from the Asus website, it updated both BIOSes in one go.


----------



## Hanks552

Has anyone done a 4090 MSI Trio BIOS flash? Wondering if we could just flash the Suprim BIOS and use a 4-cable adapter to get close to 600W.


----------



## N19htmare666

cheddardonkey said:


> A 600w bios for the Aorus Extreme Waterforce 4090 please! This card runs waaaay too cool! Need more powah!


Here is the 600w BIOS from the MSI Liquid 4090. Would like to see before and after scores of superposition on the waterforce.



sweepersc said:


> I checked the Suprim Liquid X vbioses uploaded on TPU (95.02.18.40.4C) and it does say 530W. However, that is not the vbios my card came with, mine is 95.02.18.00.CD. Here's the Nvidia SMI screenshots:
> Stock
> View attachment 2577279
> 
> 
> Max PL
> View attachment 2577280
> 
> I also tried uploading in TPU but it said that registration from certain countries are no longer accepted. I put it in GDrive, here is the link . Can someone upload this on TPU please?
> 
> Regards,
> Mark





derthballs said:


> Download nvflash - backup your original bios first for safety, then nvflash64 -6 nameofbios.rom


If that doesn't work, you might need to run nvflash64.exe --protectoff first.

nvflash is linked in the first post of this forum.
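For anyone scripting this, the backup-then-flash sequence above could be sketched like so. A sketch only: it assumes nvflash64 is on your PATH, and the --save, --protectoff and -6 flags are as quoted in this thread; double-check against your nvflash version's help output before running anything.

```python
import subprocess

def flash_commands(new_bios, backup="original_backup.rom", protect_off=False):
    """Build the nvflash command lines; nothing runs until run_flash()."""
    cmds = [["nvflash64", "--save", backup]]        # 1. back up current vBIOS
    if protect_off:
        cmds.append(["nvflash64", "--protectoff"])  # 2. only if the flash is blocked
    cmds.append(["nvflash64", "-6", new_bios])      # 3. flash the new vBIOS
    return cmds

def run_flash(new_bios, **kwargs):
    # Run from an elevated prompt; nvflash still asks for confirmation itself.
    for cmd in flash_commands(new_bios, **kwargs):
        subprocess.run(cmd, check=True)
```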


----------



## KedarWolf

morph. said:


> Have you tried reflashing it again? Else try to revert back to the previous vbios Asus RTX 4090 VBIOS


After I rebooted, the error went away.

But I have a different problem.

On my 5120x1440 screen, at 240Hz or even 120Hz, G-Sync is not working in either Fullscreen or Windowed Fullscreen mode. It's enabled on my monitor and in the Nvidia settings.

Tried Diablo 3 and Cyberpunk.


----------



## heptilion

N19htmare666 said:


> Here is the 600w BIOS from the MSI Liquid 4090. Would like to see before and after scores of superposition.


How do I run this nvidia-smi? I don't have a folder by that name.


----------



## KedarWolf

heptilion said:


> How to run this NVSMI? i dont have a folder by that name


Just open an admin command prompt in the System32 folder. Run nvidia-smi for general info.
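If you only want the power numbers, nvidia-smi can also query them directly. A small sketch: the query field names come from `nvidia-smi --help-query-gpu`; the sample line format is an assumption based on typical output, so yours may differ.

```python
import subprocess

def parse_power_limits(csv_line):
    """Parse one 'NNN.NN W, NNN.NN W' line into (current, max) watts."""
    current, maximum = (field.strip().rstrip(" W")
                        for field in csv_line.split(","))
    return float(current), float(maximum)

def query_power_limits():
    # e.g. prints "450.00 W, 600.00 W" for a card with a 600 W max limit
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=power.limit,power.max_limit",
         "--format=csv,noheader"], text=True)
    return parse_power_limits(out.splitlines()[0])
```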


----------



## heptilion

KedarWolf said:


> Just open an admin command prompt in the System32 folder. Run nvidia-smi for general info.


Thanks!

This is what i have for my strix 4090 with updated bios


----------



## motivman

slayer6288 said:


> How does one obtain a 600 watt MSI suprim x bios? Can you link it to me? Also to pull over 520 watt on ur tri are u using the triple adapter or something with 4 pcie 8 pins?


Yes, I am using the triple adapter and pulling over 570W. The 600W bios was posted a few days ago in this forum.


----------



## KedarWolf

heptilion said:


> Thanks!
> 
> This is what i have for my strix 4090 with updated bios
> 
> View attachment 2578710


You need to lower and apply, then raise the voltage and power limit sliders in Afterburner back to maximum and apply again, and then it'll go up to 600W; it took me a bit to figure that out.


----------



## heptilion

KedarWolf said:


> You need to lower and apply, then raise the voltage and power limit sliders in Afterburner back to maximum and apply again, and then it'll go up to 600W; it took me a bit to figure that out.


These are the default stock settings. I'm not planning to do any overclocking.


----------



## slayer6288

motivman said:


> Nope, with 600W suprim X bios, I can pull as much as 577W


Can you link it? I can only find the Asus and Gigabyte 600W BIOSes.


----------



## slayer6288

N19htmare666 said:


> Here is the 600w BIOS from the MSI Liquid 4090. Would like to see before and after scores of superposition on the waterforce.
> 
> 
> 
> 
> 
> 
> If that doesnt work you might need to run nvflash64.exe --protectoff first.
> 
> nvflash is linked in the first post of this forum.


Do the triple-fan air-cooled cards work okay with this BIOS, since it's intended for an AIO setup?


----------



## morph.

heptilion said:


> When I updated mine from Asus website, it updated both bios in one go.


How did it go, any differences in performance? Any event log errors from nvlddmkm? And how's G-Sync going, as per @KedarWolf's comment?


----------



## morph.

KedarWolf said:


> After I rebooted, the error went away.
> 
> But I have a different problem.
> 
> On my 5120x1440 screen at 240Hz or even 120Hz, G-Sync not working, in Fullscreen or Windowed Fullscreen modes. It's enabled on my monitor and Nvidia settings.
> 
> Tried Diablo 3 and Cyberpunk.


That's a freaking dealbreaker if so... hoping it's a quirk...


----------



## Mad Pistol

J7SC said:


> I wonder also if a 'tired' / cheaper PSU and the cables leading from the PSU to the dongle could be an issue, not just the (agreeably hideous) 4-into-1 dongle and/or the connector on the card. FYI, my 4090 is vertical, which helps with both bend clearance and expelling heat.


This might very well be the case. I had an old EVGA 750-watt G2 that I was going to use, but I decided against it and ordered a new case and 1300-watt EVGA PSU. There have been zero issues thus far; the PSU is brand new. I've also put my finger up to the connector under load, and it seems to be a bit warm, but not hot by any means.

Perhaps it's just people pushing their old hardware a bit too hard. We haven't had a halo-class GPU come out that uses this sort of power in a long time.


----------



## KedarWolf

https://www.reddit.com/r/buildapc/comments/y4vafs

G-Sync issues with 4090s.


----------



## morph.

The latest GPU Tweak III also messes with the monitor refresh rate when applying an OC; be careful if you're using it.


----------



## J7SC

Mad Pistol said:


> This might very well be the case. I had an old EVGA 750-watt G2 that I was going to use, but I decided against it and ordered a new case and 1300-watt EVGA PSU. There have been zero issues thus far; the PSU is brand new. I've also put my finger up to the connector under load, and it seems to be a bit warm, but not hot by any means.
> 
> Perhaps it's just people pushing their old hardware a bit too hard. We haven't had a halo-class GPU come out that uses this sort of power in a long time.


...I ended up mounting a 120mm Arctic P12 blowing against the 4090's backplate (I do that w/ most GPUs)...pleasant side-effect: Connector and cable is not even warm during load...


----------



## KedarWolf

morph. said:


> that's a freaking dealbreaker if so.. hoping its a quirk....


Here's what i had to do on my 4090 to get G-Sync working.

Make sure your monitor has G-Sync, FreeSync or Adaptive Sync enabled.

Enable G-Sync in the Nvidia Control Panel for Fullscreen and Windowed Fullscreen.

Select "Enable settings for the selected display model".

Turn on V-Sync in the Nvidia Control Panel. Turn OFF any frame limiters in the Nvidia Control Panel, including the Background Frame Rate one, and any third-party ones.

Turn OFF V-Sync in your games though.

Reboot your PC. The G-Sync indicator enabled in Display in the Nvidia control panel should be showing now.

I can use the Background Frame Limiter built-in to Diablo 3 though.


----------



## morph.

KedarWolf said:


> I fixed it. Enabled V-Sync in Nvidia Control Panel, disabled V-Sync in the games. Rebooted, G-Sync indicator shows now.
> 
> I also turned off my frame limiters in Nvidia Control Panel, don't need them with V-Sync on, but brb, gonna try the background frame limiter on, I need that.
> 
> Can't use any frame limiters on in the Nvidia Control Panel to have G-Sync working but I can use the frame limiters in Diablo 3 just fine.


hows it benching any errors in event log with nvlddmkm ?


----------



## KedarWolf

morph. said:


> hows it benching any errors in event log with nvlddmkm ?


No errors in the event log, other than some service-disabled errors I get from the bunch of services I've disabled.

See my edits to my G-Sync post, doing all this fixed G-Sync for me.


----------



## mattskiiau

Anyone have an opinion of Cablemods 12VHPWR vs Corsairs 12VHPWR cable?
It's easier to get the Corsair version for my AX1200 over the cablemod one in my country.


----------



## mirkendargen

J7SC said:


> ...I ended up mounting a 120mm Arctic P12 blowing against the 4090's backplate (I do that w/ most GPUs)...pleasant side-effect: Connector and cable is not even warm during load...


I also set a 120mm Arctic P12 running at 700RPM on the "blow through" part of the backplate, and now my GPU fans don't spin up for loads under 200w. It's nice for older/less demanding games to not have the fans cycling periodically.


----------



## morph.

mattskiiau said:


> Anyone have an opinion of Cablemods 12VHPWR vs Corsairs 12VHPWR cable?
> It's easier to get the Corsair version for my AX1200 over the cablemod one in my country.


I'll leave this here; I'm in Australia too. I ordered it via the CableMod international web store; it's made to order for my AX1000. In the next couple of days they'll be taking orders for their 90-degree adapter too, so you may as well order them together.


----------



## mirkendargen

mattskiiau said:


> Anyone have an opinion of Cablemods 12VHPWR vs Corsairs 12VHPWR cable?
> It's easier to get the Corsair version for my AX1200 over the cablemod one in my country.


I personally have the Corsair cable and a Moddiy cable (can't speak to Cablemod specifically). I prefer the Moddiy cable because I can have fun UV reactive mesh on it, and more importantly my case is gigantic and while I can make the 70cm Corsair cable reach, it's a less than ideal route and the custom 90cm Moddiy cable is cleaner. The Corsair cable is also stiff AF, it's solid strand wire I think. This could be a positive because you can basically mold the cable to be the way you want it, and have the side of your case forcing the connectors together, rather than pushing up/down on it.


----------



## AvengedRobix

KedarWolf said:


> __
> https://www.reddit.com/r/buildapc/comments/y4vafs
> 
> G-Sync issues with 4090s.


Known problem... we have to wait for a new driver or new LG firmware. If you look, your TV goes into VRR mode and not G-Sync.


----------



## Hanks552

motivman said:


> Yes, I am using the triple adapter and pulling over 570W. The 600W bios was posted a few days ago in this forum.


Would that work for msi trio?


----------



## mattskiiau

morph. said:


> Ill leave this here, I'm in Australia too. I ordered it via the cable mod international web store, it is made to order for my ax1000, in the next couple of days they will be taking orders for their 90deg adapter too you may as well do it together.
> View attachment 2578723


How long was shipping to AU?
I have a HX1200 type 4 so I can purchase the premade cable instead of the custom.
Appreciate your response


----------



## morph.

mattskiiau said:


> How long was shipping to AU?
> I have a HX1200 type 4 so I can purchase the premade cable instead of the custom.
> Appreciate your response


They use FedEx; the shipping part is usually under 1 week. I made a custom cable (because why not) for length & cable management. Since it was made to order, the wait for me was approx. 2-3 weeks.


----------



## J7SC

...we were watching the movie '2010' the other day (good movie, btw), and that gave me an idea for a desktop background, all tongue-in-cheek of course


----------



## mirkendargen

The horror has a face - NVIDIA’s hot 12VHPWR adapter for the GeForce RTX 4090 with a built-in breaking point | igor'sLAB


Those who are now beating up on the new 12VHPWR (although I don't really like the part either) may generate nice traffic with it, but they simply haven't recognized the actual problem with the…




www.igorslab.de





The adapter is totally the problem, not the connector itself. Bridging all the connections like that is completely stupid; neither my Corsair nor my Moddiy cable does that, they both run 1:1 pins from the PSU to the GPU with no bridging. The adapter design has absolutely nothing stopping it from pulling 600W through a single PCIe 8-pin into a single 14AWG wire with a potentially shoddy solder job, if the other solder points fail. This stems from Nvidia trying to be cute with the "smart" 3-plug/4-plug 450W/600W functionality. If they hadn't done that, they could have just mapped pins 1:1 in the adapter and not bridged anything, just like the other cables.

A 90deg adapter won't fix that if the solder is already weakened/straight bad from the factory.
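The arithmetic behind that worry is simple enough to sketch (assuming a 12 V rail and the connector's six current-carrying 12 V wires):

```python
# Current per wire in the 12VHPWR adapter, assuming a 12 V rail and
# six 12 V conductors sharing the load.
def amps_per_path(watts, volts=12.0, paths=6):
    return watts / volts / paths

shared = amps_per_path(600)                # ~8.3 A per wire with even sharing
worst_case = amps_per_path(600, paths=1)   # 50 A through one wire if the other
                                           # bridged solder joints let go
print(f"{shared:.1f} A per wire shared, {worst_case:.0f} A worst case")
```

~8 A per wire is fine; 50 A through one 14AWG strand is far beyond what such a wire is normally rated for, which is exactly the single-point-of-failure concern with the bridged solder bar.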


----------



## sweepersc

heptilion said:


> Thanks!
> 
> This is what i have for my strix 4090 with updated bios
> 
> View attachment 2578710


Can you also show an nvidia-smi screenshot with the power slider set to max please?


----------



## Azazil1190

mirkendargen said:


> The horror has a face - NVIDIA’s hot 12VHPWR adapter for the GeForce RTX 4090 with a built-in breaking point | igor'sLAB
> 
> 
> Those who are now beating up on the new 12VHPWR (although I don't really like the part either) may generate nice traffic with it, but they simply haven't recognized the actual problem with the…
> 
> 
> 
> 
> www.igorslab.de
> 
> 
> 
> 
> 
> Adapter is totally the problem, not the connector itself. Bridging all the connections like that is completely stupid, neither my Corsair or Moddiy cable do that, they both do 1:1 pins from the PSU to the GPU with no bridging. The adapter design has absolutely nothing stopping it from pulling 600w through a single PCIE 8pin->single 14AWG wire with a potentially shoddy solder job if the other solder points fail. This stems from Nvidia trying to be cute with the "smart" 3plug/4plug 450w/600w functionality. If they didn't do that, they could have just mapped pins 1:1 in the adapter and not bridged anything just like the other cables.
> 
> A 90deg adapter won't fix that if the solder is already weakened/straight bad from the factory.


So we need a proper cable, like the CableMod one, running directly from the PSU?


----------



## heptilion

sweepersc said:


> Can you also show an nvidia-smi screenshot with the power slider set to max please?


----------



## Benni231990

Guys, don't worry: don't use the Nvidia adapter, buy this extension cable and you won't have any problems.

I bought it yesterday, and their support tells me that with this cable you won't have any problems; they tested the cable in-house and you can bend it as tight as you want.


----------



## Nizzen

KedarWolf said:


> I flashed it, but nvidia-smi now shows this.
> 
> WARNING: infoROM is corrupted at gpu 0000:0A:00.0


Nvflash --protectoff ?


----------



## pat182

morph. said:


> new vbios for Asus cards avail looks like all the changelog says is "improved compatibility".


I did the BIOS update. I got a Strix last week, and the GPU is losing signal randomly at idle or while browsing the web; I need to keep Wallpaper Engine running to put a light load on it so it doesn't glitch. Idk if the BIOS update makes it more stable or if it's a driver issue, but loading videos, GIFs or photos on the forum seems to trigger it too. Anyone else with this problem? Setting the Nvidia panel to maximum performance seems to fix it, but it's kinda meh to make it run at 2700MHz idle.

No black screen when gaming.

I should add that I have a PG32UQX and the glitch might be related to HDR. There are so many variables to test that I'm not able to pinpoint the cause.


----------



## speeduae

I'm trying to flash my 4090 Strix OC to the new vBIOS version, but I'm getting the following error: "Get current version is null". Does anyone know how to solve this?


----------



## AvengedRobix

New Nvidia driver out


----------



## Nizzen

GeForce Game Ready Driver | 526.47 | Windows 10 64-bit, Windows 11 | NVIDIA


Download the English (US) GeForce Game Ready Driver for Windows 10 64-bit, Windows 11 systems. Released 2022.10.27



www.nvidia.com


----------



## Nizzen

Error 404


----------



## schoolofmonkey

Umm, just dropped a 13900K in (upgraded from a 10900K) and found this Time Spy Extreme result interesting.
Same overclock, same Windows 11 install.


----------



## pat182

Nizzen said:


> GeForce Game Ready Driver | 526.47 | Windows 10 64-bit, Windows 11 | NVIDIA
> 
> 
> Download the English (US) GeForce Game Ready Driver for Windows 10 64-bit, Windows 11 systems. Released 2022.10.27
> 
> 
> 
> www.nvidia.com


hope its gonna fix my black screen issue


----------



## morph.

Updated the vBIOS and didn't get any errors; things seem to be running fine.

GPU-Z seems to think I'm running the old BIOS though; or maybe the build date just wasn't updated?


----------



## speeduae

morph. said:


> Updated vbios, didn't get any errors things to seem to be running fine.
> 
> Tech GPUz seems to think I'm running the of bios though maybe or the build date wasn't updated?
> View attachment 2578778


It could be related to Ryzen 7000 platforms; that may be what triggers the error I get when trying to update the BIOS.


----------



## DokoBG

Benni231990 said:


> Guys dont worry dont use the Nvidia Adapter buy this Extensioncable and you dont have any problem
> 
> ill buyed it yesterday and the Support tells ne With this cable you dont have any Problems they testet the cable in house and you can band as tight as you want and have no Problems


Thank you, just ordered one of these. I'm using an EVGA SuperNOVA 1200 P2 for now. Hopefully it fits fine.


----------



## Xavier233

pat182 said:


> hope its gonna fix my black screen issue


I am having also black screen issue. What are your symptoms? Is it related to this thread here: MASSIVE RTX 4090 Problems. driver or hardware?


----------



## Benni231990

DokoBG said:


> Thank you, just ordered one of these. I use EVGA supernova 1200 P2 for now. Hopefully fits fine


I also ordered mine yesterday because I don't want to use the Nvidia adapter.


----------



## pat182

Xavier233 said:


> I am having also black screen issue. What are your symptoms? Is it related to this thread here: MASSIVE RTX 4090 Problems. driver or hardware?


Black screen when the PC is idle or running light 2D applications. Either it's an HDR bug, or the card throttles itself so low it loses signal. Running Wallpaper Engine or setting the NV panel to performance mode fixes it, so I think it's a low-power/eco-mode problem. Do you have a Strix too?


----------



## Xavier233

pat182 said:


> Black screen when the PC is idle or running light 2D applications. Either it's an HDR bug, or the card throttles itself so low it loses signal. Running Wallpaper Engine or setting the NV panel to performance mode fixes it, so I think it's a low-power/eco-mode problem. Do you have a Strix too?


Gigabyte OC here. It happened when playing a YouTube video, so I'm not so sure about idle. I left it idling for over 3 hours and the black screen (driver disconnect) issue did not happen, but it happened 3 times while just playing random YouTube vids. No issues in-game.


----------



## Spiriva

speeduae said:


> am trying to flash my 4090 strix oc to the new vbios version - but am getting the following error "Get current version is null" - anyone knows how to solve this issue?
> 
> View attachment 2578765


I managed to flash mine; make sure you pick the right version (OC / non-OC), and run it as admin.


----------



## J7SC

On the black-screen issue, I have had it happen twice when the card was connected to an older 27 inch Samsung during initial testing. Since connected to the LG C1 OLED (via the same HDMI 2.1), it has never happened again. So it could be some sort of monitor compatibility issue ?


----------



## Azazil1190

Guys, how can I save my bios before flashing mine (Asus TUF OC)? I tried via GPU-Z but, you know... I can't


----------



## alasdairvfr

Azazil1190 said:


> Do me a favour and try something.
> If you have enable the resized bar via bios ,turn it off and make a run of superposition 4k.
> With this way i have better scores (only to superposition)


I didn't see any meaningful change; it brought me up like 80 pts


----------



## CZonin

Are there any newer BIOS for the Suprim X (non-liquid) since this one? MSI RTX 4090 VBIOS


----------



## speeduae

Spiriva said:


> I managed to flash mine; make sure you pick the right version (OC / non-OC), and run it as admin.


Yes, I chose the correct file and compared SHA-256 hashes of what I have against what's reported by Asus; same issue. It will always prompt for admin rights if you click on it, but I tried run-as-admin as well; always the same error.


----------



## KedarWolf

Azazil1190 said:


> Guys, how can I save my bios before flashing mine (Asus TUF OC)? I tried via GPU-Z but, you know... I can't


nvflash64.exe -b backup.rom









NVIDIA NVFlash (5.792.0) Download

NVIDIA NVFlash is used to flash the graphics card BIOS on Ampere, Turing, Pascal and all older NVIDIA cards.

www.techpowerup.com
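For anyone scripting this, a minimal sketch of the usual backup-then-flash order. Assumptions: "nvflash64" is on your PATH, "-6" (override the PCI subsystem ID mismatch) is only needed for cross-vendor flashes, and "--protectoff" (mentioned later in the thread) may not apply to every card. This just builds the commands rather than running anything:

```python
# Sketch of the nvflash backup-then-flash sequence (assumed usage,
# based on the commands quoted in this thread; run each step manually
# and verify it succeeded before moving on).
def build_flash_steps(new_rom: str, backup: str = "backup.rom"):
    """Return the nvflash invocations in order."""
    return [
        ["nvflash64", "-b", backup],      # 1. back up the current vbios
        ["nvflash64", "--protectoff"],    # 2. disable EEPROM write protection
        ["nvflash64", "-6", new_rom],     # 3. flash the new rom
    ]

for cmd in build_flash_steps("SuprimX4090G.rom"):
    print(" ".join(cmd))
```

Flashing back works the same way, with your backup.rom as the target in step 3.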


----------



## yzonker

J7SC said:


> On the black-screen issue, I have had it happen twice when the card was connected to an older 27 inch Samsung during initial testing. Since connected to the LG C1 OLED (via the same HDMI 2.1), it has never happened again. So it could be some sort of monitor compatibility issue ?


Could be the cable. My 2nd monitor is an older Samsung 4k TV. The HDMI ports on the TUF would not recognize it at all. Switching to a different cable fixed it. 

But the original cable worked fine with my 3080ti and 3090. And the original cable is identical to the one I'm still using on my C1 which works fine on the 4090.


----------



## KedarWolf

KedarWolf said:


> nvflash64.exe -b backup.rom
> 
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA NVFlash (5.792.0) Download
> 
> 
> NVIDIA NVFlash is used to flash the graphics card BIOS on Ampere, Turing, Pascal and all older NVIDIA cards. NVFlash supports BIOS flashing on NVID
> 
> 
> 
> 
> www.techpowerup.com


On this NVFlash download the files are named wrong: the one labeled 32-bit is actually the 64-bit build, and the one labeled 64-bit is the 32-bit build.

I had to use the one labeled 32-bit on my 64-bit Windows.


----------



## J7SC

On the 12VH


yzonker said:


> Could be the cable. My 2nd monitor is an older Samsung 4k TV. The HDMI ports on the TUF would not recognize it at all. Switching to a different cable fixed it.
> 
> But the original cable worked fine with my 3080ti and 3090. And the original cable is identical to the one I'm still using on my C1 which works fine on the 4090.


Could be, though these are the best HDMI 2.1 cables I could find, metal-weave shield and all. Anyway, since I connected the 4090 and said cable to the C1, I never had a black-screen again

---

VERY interesting dissection of the 4-into-1 dongle by Igor's Lab, as posted on another forum. Just finished watching it. While it's in German, a picture is worth a thousand words (and/or use translate)... in short, it is not the 12VHPWR connector on the 4090 itself, nor the cables in the dongle, but the flimsy soldering in the connector at the end of the 4-into-1 dongle ...
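For context on why a single bad joint matters so much, a rough back-of-envelope on the current the adapter carries at full board power. The six-supply-pin layout and the ~9.5 A per-pin rating are assumptions here (actual ratings vary by terminal vendor):

```python
# Rough numbers for a 12VHPWR connector at 600 W board power.
# Assumed: 6 x 12V supply pins, ~9.5 A per-pin rating (vendor-dependent).
BOARD_POWER_W = 600
RAIL_V = 12.0
SUPPLY_PINS = 6
PIN_RATING_A = 9.5

total_a = BOARD_POWER_W / RAIL_V  # 50 A total on the 12V rail
for good_pins in range(SUPPLY_PINS, 3, -1):
    per_pin_a = total_a / good_pins
    status = "within rating" if per_pin_a <= PIN_RATING_A else "OVER rating"
    print(f"{good_pins} pins making contact: {per_pin_a:.2f} A/pin ({status})")
```

With all six pins seated you sit around 8.3 A/pin; lose just one contact and the rest are already over spec, which is why contact quality (solder joints, split pins) dominates these failure stories.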


----------



## yzonker

J7SC said:


> On the 12VH
> 
> 
> Could be, though these are the best HDMI 2.1 cables I could find, metal-weave shield and all. Anyway, since I connected the 4090 and said cable to the C1, I never had a black-screen again
> 
> ---
> 
> VERY interesting dissection of the 4-into-1 dongle by Igor's Lab, as posted on another forum. Just finished watching it. While it's in German, a picture is worth a thousand words (and/or use translate)... in short, it is not the 12VHPWR connector on the 4090 itself, nor the cables in the dongle, but the flimsy soldering in the connector at the end of the 4-into-1 dongle ...


Well, I was speaking in general to everyone that is having the black screen issue. For reference, this is the cable that didn't work,

Amazon.com


----------



## AdamK47

J7SC said:


> On the 12VH
> 
> 
> Could be, though these are the best HDMI 2.1 cables I could find, metal-weave shield and all. Anyway, since I connected the 4090 and said cable to the C1, I never had a black-screen again
> 
> ---
> 
> VERY interesting dissection of the 4-into-1 dongle by Igor's Lab, as posted on another forum. Just finished watching it. While it's in German, a picture is worth a thousand words (and/or use translate)... in short, it is not the 12VHPWR connector on the 4090 itself, nor the cables in the dongle, but the flimsy soldering in the connector at the end of the 4-into-1 dongle ...


I timestamped it at the most important part. It doesn't matter what language he's using. We all know what he's talking about with those hand gestures.


----------



## Xavier233

J7SC said:


> On the black-screen issue, I have had it happen twice when the card was connected to an older 27 inch Samsung during initial testing. Since connected to the LG C1 OLED (via the same HDMI 2.1), it has never happened again. So it could be some sort of monitor compatibility issue ?


Am not sure. I might try a DisplayPort cable for my second monitor, maybe it is related to HDMI


----------



## bmagnien

Just spoke to a SilverStone rep for anyone with their PSUs. They're getting a shipment of native 16-pin 12VHPWR cables next week. He said he'll ping me when they're available, so can at least bypass the NV adapter.


----------



## alasdairvfr

When people say they have a black screen issue, is this blanking for a second on one screen, or losing the entire driver across multiple screens? I've had blanking before and it's a PITA to troubleshoot.


Also, did anyone try the new drivers 526.47 yet? I'll probably install it in a bit and test out.


----------



## N19htmare666

alasdairvfr said:


> When people say they have a black screen issue, is this blanking for a second on the one screen or is this losing the entire driver, multiple screens? I've had blanking before and its a PITA to troubleshoot.
> 
> 
> Also, did anyone try the new drivers 526.47 yet? I'll probably install it in a bit and test out.


Would like to know if the performance is better or worse also. Often new drivers are slower in my experience, but the ones with the 4090 launch were good


----------



## Xavier233

alasdairvfr said:


> When people say they have a black screen issue, is this blanking for a second on the one screen or is this losing the entire driver, multiple screens? I've had blanking before and its a PITA to troubleshoot.
> 
> 
> Also, did anyone try the new drivers 526.47 yet? I'll probably install it in a bit and test out.


Yes, blanking: one or both of your screens go black, and both come back after a few seconds. If you go to Event Viewer (under Windows Tools), then Windows Logs, System, you should see some errors about the driver failing. Can you paste them here?


----------



## alasdairvfr

N19htmare666 said:


> Would like to know if the performance is better or worse also. Often new drivers are slower in my experience, but the ones with the 4090 launch were good


Ill try to have a go when I have some time after work



Xavier233 said:


> Yes, blanking: one or both of your screens go black, and both come back after a few seconds. If you go to Event Viewer (under Windows Tools), then Windows Logs, System, you should see some errors about the driver failing. Can you paste them here?


I don't have this issue now, but when I bought my MSI Optix + 3080 TUF-OC at the beginning of the year I'd have it in some games, and it drove me nuts. I bought so many god damned cables. I knew it wasn't to do with the card per se, and there were 100+ posts online about it in general (I looked at the specific monitor as well as in general).

What ended up solving it for me was to go into NVCP and set power management mode to max performance; then it never happened again. It used to coincide with wallpaper slideshow and a few other things that I could track, but it would still occasionally happen at random. GSync off I think fixed it too, but I had just paid a king's ransom for 4K-144Hz-GSync....

Not sure if that will help anyone here but I tend to leave that setting on with my subsequent cards


----------



## Xavier233

Performance mode will run your GPU at a high (highest?) frequency all the time, even when idling. That will not only generate continuous heat but also consume a lot more power.


----------



## LunaP

DP splitter is arriving today, so going to temporarily tear down my loop to add the card in till blocks show up. Anyone here with an x299 board (Asus preferred) that has their card working? Seeing a lot of reports of people needing a bios update, but not seeing anything on Asus' side or 3rd party sites for it.


----------



## 8472

OC3D takes a look at the cablemod 12vhpwr cable.


----------



## cheddardonkey




----------



## Roacoe717

Hi, I have the Gigabyte OC 4090. HWiNFO64 only reports 489 watts; is this normal or should it be more?


----------



## alasdairvfr

Xavier233 said:


> Performance mode will run your GPU at high (highest?) frequency all the time, even when idling. This will not only generate continuous heat, but consume a lot more power.


It idles for me, though YMMV i suppose


----------



## Tideman

So I got my Gigabyte Gaming OC and it kills my TUF in every way!

Firstly... Coil whine is much less severe than the TUF. It's completely tolerable for me.

And here's my max (stress test) core OC. Stays locked at 3090Mhz (+300 offset). My TUF could only do 3015.










Max temp 66C and 77 hotspot.


----------



## cheddardonkey

N19htmare666 said:


> Here is the 600w BIOS from the MSI Liquid 4090. Would like to see before and after scores of superposition on the waterforce.
> 
> 
> Is this compatible with the Aorus 4090 Waterforce?
> 
> 
> 
> 
> 
> If that doesnt work you might need to run nvflash64.exe --protectoff first.
> 
> nvflash is linked in the first post of this forum.


----------



## motivman

Hanks552 said:


> Would that work for msi trio?


my card is a trio


----------



## Madness11

Guys, anyone with melted pins here?? Because I got my Liquid X Saturday, and I'm worried about this situation


----------



## Nizzen

Tideman said:


> So I got my Gigabyte Gaming OC and it kills my TUF in every way!
> 
> Firstly... Coil whine is much less severe than the TUF. It's completely tolerable for me.
> 
> And here's my max (stress test) core OC. Stays locked at 3090Mhz (+300 offset). My TUF could only do 3015.
> View attachment 2578868
> 
> 
> 
> Max temp 66C and 77 hotspot.


What gpuscore in port royal and timespy extreme?


----------



## Nizzen

Madness11 said:


> Guys, anyone with melted pins here?? Because I got my Liquid X Saturday, and I'm worried about this situation


I'm running no power limit on the Asus 4090 Strix, and it ain't melted yet


----------



## 8472

This is going to get interesting. 


__ https://twitter.com/i/web/status/1585728173895200768

__ https://twitter.com/i/web/status/1585728177053356038


----------



## cheddardonkey

N19htmare666 said:


> We're you able to get a custom fan curve to fix it?
> Have you thought about the 600w msi liquid BIOS, or the 630w Neptune (also a 360 aio) BIOS?
> 
> Wondering if there is still hope to get past the 500w waterforce limit...


In this boat myself. I haven't tried to flash a 600W bios yet but I'm on the hunt.. are you saying it's not worth it?


----------



## biigshow666

Roacoe717 said:


> Hi, I have the Gigabyte OC 4090. HWiNFO64 only reports 489 watts; is this normal or should it be more?
> 
> View attachment 2578867


Try +100 on your voltage to get 1.1V.

Even then mine only got to 557.8W.


----------



## mirkendargen

Confirmation Alphacool is working on Gigabyte blocks. No ETA though.


----------



## 8472

8472 said:


> This is going to get interesting.
> 
> 
> __ https://twitter.com/i/web/status/1585728173895200768
> 
> __ https://twitter.com/i/web/status/1585728177053356038


I wonder if this is the reason that there have not been any significant restocks from Newegg or Bestbuy in the past week. 


__ https://twitter.com/i/web/status/1585732381453221888


----------



## Rei86

8472 said:


> This is going to get interesting.
> 
> 
> Spoiler
> 
> 
> 
> 
> 
> __ https://twitter.com/i/web/status/1585728173895200768
> 
> __ https://twitter.com/i/web/status/1585728177053356038


That sucks; what an oversight in quality control and testing for such an awesome GPU.


----------



## N19htmare666

cheddardonkey said:


> In this boat myself, I havent tried to flash a 600w bios yet but I'm on the hunt.. are you saying its not worth it?


Worth it at the temperatures you're getting. Just remember to do a backup first in case you want to switch back


----------



## AvengedRobix

Just a test in F1 22 with the GeForce Experience setting enabled (in the menu you can't turn on frame generation), and WOW


----------



## cheddardonkey

N19htmare666 said:


> Worth it at the temperatures you're getting. Just remember to do a backup first incase you want to switch back


I dont see the MSI 600w bios link anywhere, only the 530 is listed. can you share it please? I'll do the superposition benchmark comparisons. Thank you!


----------



## 8472

If they are going to investigate the connector issue they might as well try to address the inordinate occurrences of coil whine too.


----------



## zekkragnos

heptilion said:


> There is a bios update for the Strix. Says improve compatibility. Changed a few default settings as well.
> View attachment 2578426
> View attachment 2578428


How do I update the bios? I downloaded the file from Asus support, but every time I launch it, it gives me a "Get current version is null" error


----------



## N19htmare666

cheddardonkey said:


> I dont see the MSI 600w bios link anywhere, only the 530 is listed. can you share it please? I'll do the superposition benchmark comparisons. Thank you!


Sweepersc uploaded here





SuprimX4090G.rom







drive.google.com


----------



## 8472

TLDW: Nvidia's adapter is the problem, if you have another cable use that instead.


----------



## Sheyster

Well, I went ahead and ordered the Cablemod 16-pin to 4 x 8-pin cable. These new adapters look like crap and I'm not taking any chances with issues like this.


----------



## MrTOOSHORT

Cablemod cable ordered last Friday. Not shipped yet. Processing still.

I checked on the cablemods website, order number and email tracking, eta for shipping is Nov 12th. 😂 

Why doesn’t Evga make one for the highend T2s?


----------



## yt93900

Such a shame Cablemod doesn't just make the cable to fit a specific PSU, instead of an extension.


----------



## bmagnien

yt93900 said:


> Such a shame Cablemod doesn't just make the cable to fit a specific PSU, instead of an extension.


They do…use their configurator


----------



## yt93900

You're right. It's very expensive in the EU, though: standard length (600mm) with 4 alu combs is €53,50, and you can't even order it; their EU store has a minimum order value of €69,90. Guess I'll have to wait for Corsair...
Shame you can't just drive up to their office. Corsair is like 3km away from here  last time I had to order the LGA1700 standoffs and they took 3 days for a 3km trip to arrive.


----------



## marc0053

Let's hope my cable mod works.
Follow me for more tricks....hahaha


----------



## dboom

Wrong post. delete pls


----------



## slayer6288

Is this an air cooled suprim or the AIO water cooler one?


----------



## dboom

Tideman said:


> So I got my Gigabyte Gaming OC and it kills my TUF in every way!
> 
> Firstly... Coil whine is much less severe than the TUF. It's completely tolerable for me.
> 
> And here's my max (stress test) core OC. Stays locked at 3090Mhz (+300 offset). My TUF could only do 3015.
> View attachment 2578868
> 
> 
> 
> Max temp 66C and 77 hotspot.











I scored 1 in Time Spy Extreme Stress Test


AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com




Mine is doing well.


----------



## J7SC

Roacoe717 said:


> Hi, I have the Gigabyte OC 4090. HWiNFO64 only reports 489 watts; is this normal or should it be more?
> 
> View attachment 2578867


...might depend on the actual app as well as settings, but I've seen over 560 W on my Giga-G-OC in HWI; hotspot temps are the issue though before really pushing it (once it's on water...)


----------



## slayer6288

motivman said:


> my card is a trio


The 600 watt bios you have is that for the water suprim or the air one?


----------



## slayer6288

N19htmare666 said:


> Sweepersc uploaded here
> 
> 
> 
> 
> 
> SuprimX4090G.rom
> 
> 
> 
> 
> 
> 
> 
> drive.google.com


Is that for the air suprim or the water aio one?


----------



## doom3crazy

slayer6288 said:


> Is that for the air suprim or the water aio one?


I want to know that as well


----------



## Isaias Angelis

J7SC said:


> ...might depend on the actual app as well as settings, but I've seen over 560 W on my Giga-G-OC in HWI; hotspot temps are the issue though before really pushing it (once it's on water...)
> View attachment 2578917


I have a GB 4090 Gaming OC too, and with the PL slider at 133% and voltage at +100, GPU-Z reads only 112% max power usage. Why is that happening?
Tested on both Time Spy Extreme and Superposition 4K Optimized.


----------



## mirkendargen

8472 said:


> TLDW: Nvidia's adapter is the problem, if you have another cable use that instead.


After multiple trash videos by Jay on this topic, A++ this one is actually great! Bonus points for addressing the 3090 FE/3090TI plug and confirming it's 1:1 pins on the supplied adapter rather than bridging. All this to make the adapter "smart" with 3 or 4 connectors....


----------



## alasdairvfr

Well, I have concluded that in some specific use cases, SVM (virtualization enabled) can tank performance by a fair margin!

The only difference in the screenshot comparisons below is that one has SVM disabled, while the other has it enabled (though I killed Docker and any sign of a VM before running the benchmark). ~6% difference. Not enough to make me reboot into the bios every time I wanna play something, but more than nothing!


Spoiler


----------



## J7SC

Isaias Angelis said:


> I have a GB 4090 Gaming OC too, and with the PL slider at 133% and voltage at +100, GPU-Z reads only 112% max power usage. Why is that happening?
> Tested on both Time Spy Extreme and Superposition 4K Optimized.


Are you on the quiet or OC vbios (shouldn't be much of a difference, though)? Also, is the voltage slider on full (1.1V instead of 1.05V)?


----------



## Isaias Angelis

J7SC said:


> Are you on the quiet or OC vbios (shouldn't be much of a difference, though)? Also, is the voltage slider on full (1.1V instead of 1.05V)?


Yes!
OC bios and voltage slider at +100%, 1.1V.
And 133% PL.
Max I see is 112% power usage and 515 watts...
Why is this happening?
GPU is powered from a 1200W HP server PSU with four 8-pin cables, and the PC itself from a Corsair RM750i.


----------



## bmagnien

Isaias Angelis said:


> Yes!
> OC bios and voltage slider at +100%, 1.1V.
> And 133% PL.
> Max I see is 112% power usage and 515 watts...
> Why is this happening?
> GPU is powered from a 1200W HP server PSU with four 8-pin cables, and the PC itself from a Corsair RM750i.


Different loads use different amounts of power; no need to freak out. Run Furmark and check your rail power, it'll be over 600W
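For what it's worth, GPU-Z's power percentage appears to be relative to the card's default board power, not the slider cap, so the readings above roughly line up. A quick sketch (the 450 W default for the Gaming OC is an assumption):

```python
# Why a 133% slider can still show only ~112% board power under load.
# Assumption: GPU-Z reports percent of the default board power (450 W here).
DEFAULT_BOARD_POWER_W = 450
SLIDER_PCT = 133
OBSERVED_PCT = 112

cap_w = DEFAULT_BOARD_POWER_W * SLIDER_PCT / 100         # ceiling the slider allows
observed_w = DEFAULT_BOARD_POWER_W * OBSERVED_PCT / 100  # what the card actually drew
print(f"cap {cap_w:.0f} W, observed {observed_w:.0f} W")
```

That puts the observed draw around 504 W, the same ballpark as the 515 W reading: the card simply isn't asking for more in that workload.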


----------



## cheddardonkey

I have loaded the *600w* MSI Suprim Liquid X bios on my *Gigabyte Aorus Waterforce 4090*, which was previously set at 500W. The bios loaded fine, 600W is now available, and the power slider now allows increased power. The fans on the AIO don't turn on now until the GPU reaches 55 degrees, or they can run constantly by choice. At stock they run constantly with no way to turn them off, so I like this change; the curve seems OK so far. I am also seeing some notable improvements across the board, at the cost of very slightly higher temps.. I'll post more comparison results later!


----------



## yt93900

That's why I can't use another VBIOS on my WF 4090...hate the fan stop mode. I was so happy to see the WF 4090 did not have it.


----------



## N19htmare666

slayer6288 said:


> Is that for the air suprim or the water aio one?


It's for aio


----------



## N19htmare666

cheddardonkey said:


> I have loaded the *600w* MSI Suprim Liquid X Bios on my *Gigabyte Aorus Waterforce 4090* which was previously set at 500w. Bios loaded fine, 600w is now available and the power slider now allows increased power. The fans on the AIO don't turn on now until GPU reaches 55 degrees. or to run constantly by choice. At stock they run constantly with no way to turn them off and I like this change, curve seems ok so far. I am also seeing some notable improvements across the board at the cost of very slightly higher temps.. I'll post more comparison results later!


Great to hear. If you could also post the temps or link a video of benchmarks, that would be amazing! 
About the fans, what's the option to make them run constantly?


----------



## KedarWolf

New ASUS BIOS, my 24/7 stable settings.


















































Forgot to screenshot this one.









I scored 28 958 in Port Royal


AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com


----------



## Xavier233

alasdairvfr said:


> It idles for me, though YMMV i suppose
> 
> View attachment 2578869


----------



## schoolofmonkey

Vertical mount with the stock Thermaltake GF3 12VHPWR cable.


----------



## Jacinto1023

It says there is an update for my 4090 bios in EVGA Precision X1, but I can't find it. On the Gigabyte website that bios wasn't compatible.

I have the Gigabyte Gaming OC 4090


----------



## KedarWolf

Okay, I flashed TechPowerUp's old Silent and Performance BIOSes.

Then after rebooting both times, I flashed the BIOS update from the ASUS support website.

The Silent BIOS has a 133% Power Limit and 599W.

The Performance BIOS has a 120% Power Limit and 600W.

So now I think the Silent BIOS is actually better due to the higher power limit.


----------



## LunaP

Afraid to close the case door since it'll bend a bit. Running Time Spy now; gonna wait for the 90-degree cable.















Is this normal?

vs my old 2080ti SLI


----------



## Jacinto1023

So you can flash other manufacturers' bios on different cards? What's the best bios out now?


----------



## Isaias Angelis

bmagnien said:


> Different loads use different amounts of power. No need to freak out. Run furmark and check your rail power it’ll be over 600w


Furmark 4K with 8x anti-aliasing doesn't go above 95% power usage...


----------



## LunaP

My power slider goes to 133%, so assuming that's max (600W, that is), what are good settings to start with? Power max, voltage max? Gigabyte OC Gaming for ref.


----------



## N3M3SYS

Hi guys, Gigabyte OC gaming here, I just saw that there is a new BIOS released (F2), for this card, on 2022/10/26 which increases compatibility, in description. Has anyone tried it yet?


----------



## MrTOOSHORT

LunaP said:


> Afraid to close the case door since itll bend a bit, running time spy now, gonna wait for thr 90 degree cable.
> View attachment 2578947
> View attachment 2578948
> 
> Is this normal?
> 
> vs my old 2080ti SLI
> 
> View attachment 2578956


The support bracket is pretty cool for that card, stealth like. I use mine.


----------



## heptilion

KedarWolf said:


> Okay, I flashed the TechPowerUp old Silent and Performance BIOS's.
> 
> Then after rebooting both times, I flashed the BIOS update from ASUS support website.
> 
> The Silent BIOS has a 133% Power Limit and 599W.
> 
> The Performance BIOS has a 120% Power Limit and 600W.
> 
> So now I think the Silent BIOS is actually better due to the higher power limit.


Did you switch the bios and then flash each time? For me I just flashed it while in P mode and it worked for both. 
Also, I think Silent is on a 450W default; that's why it's showing 133%.
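That reading is consistent if you back-calculate each BIOS's default board power from the cap nvidia-smi reports. A sketch (defaults inferred from the numbers in this thread, not from ASUS documentation):

```python
# Infer default board power from: cap_watts = default_watts * max_pct / 100.
# Figures from the posts above; the defaults are inferred, not official.
bioses = {
    "Silent":      {"max_pct": 133, "cap_w": 599},
    "Performance": {"max_pct": 120, "cap_w": 600},
}
for name, b in bioses.items():
    default_w = b["cap_w"] * 100 / b["max_pct"]
    print(f"{name}: ~{default_w:.0f} W default, {b['cap_w']} W at {b['max_pct']}%")
```

So both BIOSes top out around 600 W; the percentage differs only because the Silent BIOS starts from a ~450 W default while Performance starts from ~500 W.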


----------



## Deve5tat0R

LunaP said:


> DP spiltter is arriving today so going to temp tear down my loop to add the card in till blocks show up, anyone here w/ an x299 board (asus pref ) that has their card working? Seeing a lot of reports of people needing a bios update but not showing anything on asus' side or 3rd party sites for it.


I have a 10980XE on a Rampage VI Extreme Encore working fine with a Strix 4090 with no BIOS update done


----------



## Jacinto1023

N3M3SYS said:


> Hi guys, Gigabyte OC gaming here, I just saw that there is a new BIOS released (F2), for this card, on 2022/10/26 which increases compatibility, in description. Has anyone tried it yet?


I tried to update it but it didn't work; it said it wasn't compatible


----------



## LunaP

Deve5tat0R said:


> I have a 10980XE on a Rampage VI Extreme Encore working fine with a Strix 4090 with no BIOS update done


Yeah, was googling and found nothing, so took the plunge. Mine's actually the Encore, not the Omega, so same boat. 


Went through the thread and tried others settings max performance etc

+150 on CPU
Max Power
+1300 on Mem
+10 on Voltage % for 1050mv

Is it my CPU or am I doing something wrong? Feels very low.


----------



## Deve5tat0R

LunaP said:


> Yeah was googling and found nothing so took the plunge, Mines actually the encore not the omega, so same boat.
> 
> 
> Went through the thread and tried others settings max performance etc
> 
> +150 on CPU
> Max Power
> +1300 on Mem
> +10 on Voltage % for 1050mv
> 
> Is it my CPU or am I doing something wrong? Feels very low.
> 
> View attachment 2578960


GPU core clocks in the ballpark of 3GHz+ are needed for good scores in benchmarks. You need to slide the voltage all the way to max at 1.1V to reach those clocks


----------



## bmagnien

KedarWolf said:


> New ASUS BIOS, my 24/7 stable settings.
> 
> View attachment 2578938
> 
> View attachment 2578936
> 
> View attachment 2578935
> 
> View attachment 2578934
> 
> View attachment 2578933
> 
> View attachment 2578937
> 
> 
> Forgot to screenshot this one.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 958 in Port Royal
> 
> 
> AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10}
> 
> 
> 
> 
> www.3dmark.com


Great scores man, congrats. Any meaningful difference between different bios in actual performance? Are you running strix on strix or on TUF?


----------



## jcde7ago

For my other Suprim Liquid X owners - we have our first reported case of a Liquid X with a seemingly melted (or was in the process of melting) pin:


__
https://www.reddit.com/r/nvidia/comments/yf9rn8


















I can't be bothered to deal with this **** from Nvidia, so I've removed my 4090 for the time being. Even though I'm undervolted... it ain't worth the risk, and I'll go pick up an ATX 3.0 PSU this weekend or try out some different third-party cables instead.


----------



## N19htmare666

jcde7ago said:


> For my other Suprim Liquid X owners - we have our first reported case of a Liquid X with a seemingly melted (or was in the process of melting) pin:
> 
> 
> __
> https://www.reddit.com/r/nvidia/comments/yf9rn8
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can't be bothered to deal with this **** from Nvidia, so I've removed my 4090 for the time being. Even though I'm undervolted... it ain't worth the risk, and I'll go pick up an ATX 3.0 PSU this weekend or try out some different third-party cables instead.


Is it possible to see where the heat came from: the cable side of the connector or the GPU side?


----------



## biigshow666

Jacinto1023 said:


> I tried to update it but it didn't work. said it wasn't compatible





N3M3SYS said:


> Hi guys, Gigabyte OC gaming here, I just saw that there is a new BIOS released (F2), for this card, on 2022/10/26 which increases compatibility, in description. Has anyone tried it yet?


Yeah, I updated yesterday. No difference that I have noticed. They did mix the naming up though, so flash the silent bios update to the OC toggle position on the board.


----------



## Antsu

Nizzen said:


> I'm running no power limit on the Asus 4090 Strix, and it ain't melted yet


Already a special BIOS out or did you shunt it?


----------



## Jacinto1023

biigshow666 said:


> Yeah I updated yesterday. No difference that I have noticed. They did mix the naming up so flash the silent bios update to the oc toggle on the board.


Ah wth lol


----------



## bhav

Now this is how you make a 4090!










Needs a custom loop though.


----------



## Jacinto1023

damn thats small


----------



## mirkendargen

bhav said:


> Now this is how you make a 4090!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Needs a custom loop though.


It's a reference PCB lol. All the Inno3d cards are apparently.


----------



## Hanks552

motivman said:


> my card is a trio


Did you bios flash it to a suprim bios? To enable 600w


----------



## KedarWolf

heptilion said:


> Did you switch the bios and then flashed each time? For me I just flashed it while in P mode and it worked for both.
> Also I think Silent is on 450w default that's why it's showing 133% I think.


nvidia-smi shows the Silent BIOS as 599W with the power limit maxed out.


----------



## KedarWolf

I scored 18 810 in Time Spy Extreme


AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com













I scored 11 174 in Speed Way


AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com













I scored 28 968 in Port Royal


AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com


----------



## KedarWolf

bmagnien said:


> Great scores man, congrats. Any meaningful difference between different bios in actual performance? Are you running strix on strix or on TUF?


Strix on Strix OC card.


----------



## coelacanth

Buildzoid weighing in on the power connector.


----------



## Nizzen

KedarWolf said:


> I scored 18 810 in Time Spy Extreme
> 
> 
> AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 174 in Speed Way
> 
> 
> AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 968 in Port Royal
> 
> 
> AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
> 
> 
> 
> 
> www.3dmark.com











Result







www.3dmark.com




Looks like the AMD CPU is good on the last test


----------



## KedarWolf

Nizzen said:


> Result
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> Looks like Amd cpu is good on the last test


I DO have a 7950X, an X670E motherboard, and A-die PMIC-unlocked DDR5, but I need to lap my CPU block before building my new setup, so I'm waiting until Saturday when I have three days off work.


----------



## mirkendargen

biigshow666 said:


> Yeah, I updated yesterday. No difference that I have noticed. They did mix the naming up, so flash the Silent BIOS update to the OC toggle position on the board.


I updated today and it ****ed up my secondary monitor. It would just flash on and off and never show an image after booting, worked fine in BIOS. I flashed the OG BIOS back (I backed it up ahead of time) and it's fine again.


----------



## Hanks552

N19htmare666 said:


> Sweepersc uploaded here
> 
> 
> 
> 
> 
> SuprimX4090G.rom
> 
> 
> 
> 
> 
> 
> 
> drive.google.com


Are you using it on a trio?


----------



## BluePaint

coelacanth said:


> Buildzoid weighing in on the power connector.


Yeah, funnily, he doesn't really agree with Igor. For him, the double slit in the metal pins is the main criticism, because it lowers contact reliability (especially if cables are bent horizontally). He showed that all non-NVIDIA connectors have only a single slit. He bases his argument on the pictures posted online, which essentially show only the plastic around the insertion point of the plug having melted, not where the cables are soldered onto the plug (Igor primarily criticised the thin solder area where the cables are connected to the plug).


----------



## mirkendargen

Bykski N-GV4090AORUS-X GPU BLOCK AORUS 4090 MASTER 24G


Bykski 分体式水冷官方网站




www.bykski.com





Meat's back on the menu boys!


----------



## LunaP

mirkendargen said:


> Bykski N-GV4090AORUS-X GPU BLOCK AORUS 4090 MASTER 24G
> 
> 
> Bykski 分体式水冷官方网站
> 
> 
> 
> 
> www.bykski.com
> 
> 
> 
> 
> 
> Meat's back on the menu boys!


Nice, buying this as soon as it's available.


----------



## mirkendargen

LunaP said:


> Nice, buying this soon as its available.


It's on Aliexpress, I ordered one.



mirkendargen said:


> I updated today and it ****ed up my secondary monitor. It would just flash on and off and never show an image after booting, worked fine in BIOS. I flashed the OG BIOS back (I backed it up ahead of time) and it's fine again.


Turns out it's actually the new driver that's messing up my secondary output, not the BIOS...


----------



## Benni231990

Guys, I have great news from Alphacool:

they are done with my Suprim X card and sent it back to me, so production of the waterblock has started; maybe in 2-3 weeks we can buy it


----------



## LunaP

mirkendargen said:


> It's on Aliexpress, I ordered one.



Is that how they normally release them? lol
Checking now, appreciate it. Adding a 2nd GPU as well, since VR has trouble running if there are more than 3 monitors.


----------



## Tideman

Nizzen said:


> What gpuscore in port royal and timespy extreme?


Honestly, I'm not comfortable running anything else at max voltage/power limit until I get a safer cable. I was pulling over 540W through that adapter... I just wanted an idea of how good my card is. Will be interesting to see how far I can push its memory next.

Anyone know how fast CableMod shipping is in Europe? Probably going with a cable from them, seeing as Corsair has no stock.


----------



## cheddardonkey

N19htmare666 said:


> Great to hear. It's you could also post the temps or link a video of benchmarks that would be amazing!
> About the fans, what's the option to make them run constantly?


cheddardonkey said:
I have loaded the *600w* MSI Suprim Liquid X BIOS on my *Gigabyte Aorus Waterforce 4090*, which was previously set at 500W. The BIOS loaded fine, 600W is now available, and the power slider now allows increased power. The fans on the AIO now don't turn on until the GPU reaches 55 degrees, or they can be set to run constantly by choice. At stock they run constantly with no way to turn them off, and I like this change; the curve seems OK so far. I am also seeing some notable improvements across the board at the cost of very slightly higher temps.

Sadly, I can't get the clock past 3045; it doesn't get hot, and wattage during Superposition is a steady 510-530W. Furmark will pull 585W, but I'm not seeing much clock improvement. It's definitely holding higher FPS counts; that extra 20-30W seems to be making a difference. More results coming after more tests, but this looks to be the limit without crashing.

What benchmark settings would you want to see for this?


----------



## ShadowYuna

Tideman said:


> Honestly not comfortable running anything else at max voltage/power limit, until I get a safer cable. I was pulling over 540w through that adapter.. I just wanted an idea of how good my card is. Will be interesting to see how far I can push its memory next.
> 
> Anyone know how fast cablemod shipping in Europe is? Prob going with a cable from them seeing as Corsair has no stock.


Yes, CableMod ships globally if you purchase from their global site. I am in Australia and purchased this Tuesday, but it seems they make the cable when the customer orders. My estimated shipping date is 14th Nov 2022.


----------



## ShadowYuna

mirkendargen said:


> Bykski N-GV4090AORUS-X GPU BLOCK AORUS 4090 MASTER 24G
> 
> 
> Bykski 分体式水冷官方网站
> 
> 
> 
> 
> www.bykski.com
> 
> 
> 
> 
> 
> Meat's back on the menu boys!


Thank you so much. I put a preorder on EK, but their ETA is 30th Nov.
Just ordered the Bykski block from FormulaMod as they can ship by DHL.
Finally the setup can be finished by next weekend.


----------



## long2905

cheddardonkey said:


> Sadly, I cant get the clock up past 3045, doesnt get hot,


peak 1st world problem? just an observation, means no offense


----------



## Tideman

ShadowYuna said:


> Yes cablemod ship to globally if you purchase from their global site. I am in Australia and purchased this Tuesday but it seems they are making the cable when customer order the item. My estimate shipping date is 14th Nov 2022.


Did you customize or just get the standard cable (for your PSU brand)?

Wondering if it's a safer bet to go with a customized cable for my specific AX1500i model or if the standard "CableMod C-Series Pro ModMesh Sleeved 12VHPWR PCI-e Cable for Corsair" would be fine.

EDIT: Seems AXi is supported. It's listed there.


----------



## LunaP

ShadowYuna said:


> Thank you so much. I put preoder on EK but their ETA is 30th Nov.
> Just ordered the Bykski block from Formula Mod as they can ship by DHL.
> Finally set up can be finish by next weekend.


How were you able to order? It says there are no shipping options when I check; clicking just returns "No List!"

N/m, went back and manually entered my address again and it showed up. Went with DHL as well, thanks!


----------



## Spiriva

Tideman said:


> Did you customize or just get the standard cable (for your PSU brand)?
> 
> Wondering if it's a safer bet to go with a customized cable for my specific AX1500i model or if the standard ''CableMod C-Series Pro ModMesh Sleeved 12VHPWR PCI-e Cable for Corsair'' would be fine.
> 
> EDIT: Seems AXi is supported. It's listed there.


I ordered this one: CableMod C-Series Pro ModFlex Sleeved 12VHPWR PCI-e Cable for Corsair – CableMod EU Store

I got the AX1600i PSU, and its listed there.

the 4x8pin


----------



## Hulk1988

Benni231990 said:


> Guys i have great News from alphacool
> 
> they are done with my suprim x Card and send it back to me so the production has been startet for the waterblock maybe 2-3 we can buy it


Do you get a free block for providing a card?  Great news


----------



## ShadowYuna

Tideman said:


> Did you customize or just get the standard cable (for your PSU brand)?
> 
> Wondering if it's a safer bet to go with a customized cable for my specific AX1500i model or if the standard ''CableMod C-Series Pro ModMesh Sleeved 12VHPWR PCI-e Cable for Corsair'' would be fine.
> 
> EDIT: Seems AXi is supported. It's listed there.


No; since my PSU is the Corsair AX1600i, I looked at their list and chose the cable.


----------



## Zogge

I ordered the AX1600i cable, which is 2x8-pin only as input. When I tried it on my 4090 TUF OC, the red light does not turn off and there is no picture. Switched back to the Nvidia cable and it works. Strange; it was listed as compatible on the site and as the one I should use.


----------



## LunaP

ShadowYuna said:


> No since my power is corsair AX1600i , I look their list and choose the cable.


Thanks for the reminder, just ordered mine.



Zogge said:


> I ordered the AX1600i cable which is 2x8 pin only as input. When I tried it on my 4090 Tuf OC, the red light does not turn off and there is no picture. Switched back to Nvidia cable and it works. Strange. It was listed as compatible on the site and the one I should use.


I grabbed the 4x8-pin to be safe.

As for the Bykski blocks, I see they use 1.8mm pads? Anyone have any suggestions for aftermarket ones, or just leave as-is with them?


----------



## Nizzen

Tideman said:


> Honestly not comfortable running anything else at max voltage/power limit, until I get a safer cable. I was pulling over 540w through that adapter.. I just wanted an idea of how good my card is. Will be interesting to see how far I can push its memory next.
> 
> Anyone know how fast cablemod shipping in Europe is? Prob going with a cable from them seeing as Corsair has no stock.


So you think another extension cable is safer?


----------



## J7SC

mirkendargen said:


> Bykski N-GV4090AORUS-X GPU BLOCK AORUS 4090 MASTER 24G
> 
> 
> Bykski 分体式水冷官方网站
> 
> 
> 
> 
> www.bykski.com
> 
> 
> 
> 
> 
> Meat's back on the menu boys!


Thanks for the meat reminder - just ordered mine from Formulamod


----------



## ShadowYuna

LunaP said:


> Thanks for the reminder, just ordered mine.
> 
> 
> 
> I grabbed the 4x8 pin to be safe
> 
> As for the byski blocks I see they use 1.8 pads? anyone have any suggestions for aftermarket or just leave as is w/ them?


I looked into the pads; it seems Primochill has 1.8mm pads, but they are the same pads as Bykski's. So I think it should be OK to use the pads that come with the block.

I know the pads from Bykski aren't great, but I can't find 1.8mm pads from another brand...


----------



## vigorito

Why is the Strix spinning its fans at 30% all the time while not gaming? On 522 it was in zero-RPM mode in regular non-gaming use; now it's in performance mode like yours, but the fans are spinning all the time and I can't stop them in a non-gaming scenario.


alasdairvfr said:


> It idles for me, though YMMV i suppose
> 
> View attachment 2578869


----------



## N19htmare666

cheddardonkey said:


> cheddardonkey said:
> I have loaded the *600w* MSI Suprim Liquid X Bios on my *Gigabyte Aorus Waterforce 4090* which was previously set at 500w. Bios loaded fine, 600w is now available and the power slider now allows increased power. The fans on the AIO don't turn on now until GPU reaches 55 degrees. or to run constantly by choice. At stock they run constantly with no way to turn them off and I like this change, curve seems ok so far. I am also seeing some notable improvements across the board at the cost of very slightly higher temps..
> 
> Sadly, I cant get the clock up past 3045, doesnt get hot, wattage during superposition pulls a steady 510-530. Furmark will pull 585 but I'm not seeing much clock improvement. Definitely holding higher FPS counts, that 20-30watts seems to be making a difference More results coming after more tests but this looks to be the limit without crashing.
> 
> What benchmark settings would you want to see for this?
> 
> View attachment 2579007


Like this 



 with the same overlays so we can compare side by side


----------



## Placekicker19

Is there a difference between the TUF Gaming and TUF Gaming OC? I have a chance to pick up a regular TUF locally for a good price and was wondering if there's any difference between the two. It's hard to believe Asus would slap a $200 premium on the OC version just for a slightly higher boost clock.

thanks


----------



## changboy

I cant find any link to buy a Bykski for gigabyte 4090 gaming oc ?


----------



## ALSTER868

Placekicker19 said:


> Is there a difference between the tuf gaming and tuf gaming oc , I have a chance to pickup a regular tuf locally for a good price and was wondering if there’s any difference between the 2. It’s hard to believe asus would slap a $200 premium on the oc version just for a little higher boost clock .
> 
> thanks


A slightly different bios with +40 Mhz to the GPU clock, that's all.


----------



## Placekicker19

ALSTER868 said:


> A slightly different bios with +40 Mhz to the GPU clock, that's all.


200 more for that is just crazy.
thanks man


----------



## BluePaint

Placekicker19 said:


> 200 more for that is just crazy.
> thanks man


I think every manufacturer had to provide one model at $1600 MSRP, otherwise the TUF non-OC would probably cost just 50 or 100 bucks less than the OC version. So it's kind of the other way round: ASUS would like to always charge 100 or 200 more, but they had to provide one at 1600.


----------



## ZealotKi11er

changboy said:


> I cant find any link to buy a Bykski for gigabyte 4090 gaming oc ?


They don't have one. Personally I would spend a bit more and get something else.


----------



## ALSTER868

Placekicker19 said:


> 200 more for that is just crazy.
> thanks man


BTW, in the partner pricing (for ASUS distributors) the difference between these two GPUs, OC and non-OC, is only $10.


----------



## Azazil1190

Guys, how can I save my BIOS before flashing mine (Asus TUF OC)?


Placekicker19 said:


> Is there a difference between the tuf gaming and tuf gaming oc , I have a chance to pickup a regular tuf locally for a good price and was wondering if there’s any difference between the 2. It’s hard to believe asus would slap a $200 premium on the oc version just for a little higher boost clock .
> 
> thanks


No!
They're the same. The only difference is the factory OC. You can buy the TUF and flash the OC BIOS or the Strix OC BIOS. The rumor that the OC has a more binned chip is just a rumor.
For me: buy the TUF and save your pocket.


----------



## WayWayUp

Any custom loop users finding clock increases after grabbing a block? Are you getting 45MHz? 100?
People aren't sharing.


----------



## vigorito

I just uninstalled MSI Afterburner and restarted, and zero RPM is back now. My question: why is the Strix clock a constant 2610MHz at idle? Power is at 50-70% in NVCP whether Normal or Prefer Maximum Performance is selected, and the core clock stays the same while typing this.


----------



## AKBrian

KedarWolf said:


> New ASUS BIOS, my 24/7 stable settings.
> 
> 
> View attachment 2578936


Nooo, not my FS Ultra!


----------



## bmagnien

New seasonic native cables, free with accompanying photo evidence of ownership:





海韵推出12VHPWR模组线：国行在保电源可免费申请，以连接RTX 40系列 - 超能网


超能网（Expreview）专注于为主流科技产品提供全新视角的资讯，专注于100%高价值原创内容的创造，专注于最真实的体验式报道。




m.expreview.com


----------



## Panchovix

Cable burnt on a Gaming X Trio (3x8 pin adapter) (not mine)


__
https://www.reddit.com/r/nvidia/comments/yfqevh


----------



## cheddardonkey

N19htmare666 said:


> Like this
> 
> 
> 
> with the same overlays so we can compare side by side


OK, that is my video, so I'll remake it with the new BIOS!


----------



## pajonk

My new video about dlss3 gsync and LG Oled


----------



## Laithan

Isaias Angelis said:


> Furmark 4k with 8x antialising doesnt go above 95% power usage...


Try using the "stress test" option in Furmark... if you're feeling lucky... well do ya?


----------



## Laithan

mirkendargen said:


> Bykski N-GV4090AORUS-X GPU BLOCK AORUS 4090 MASTER 24G
> 
> 
> Bykski 分体式水冷官方网站
> 
> 
> 
> 
> www.bykski.com
> 
> 
> 
> 
> 
> Meat's back on the menu boys!


??? FP?

*Website blocked due to trojan*
Website blocked: www.bykski.com

*Malwarebytes Browser Guard blocked this website because it may contain malware activity.*
*We strongly recommend you do not continue.*


----------



## slayer6288

N19htmare666 said:


> It's for aio


Is there a 600W BIOS for the air-cooled Suprim out in the wild, or no?


----------



## warrior-kid

Need a little help. Just realised there is a transparent sticker of sorts on the Suprim X card (see photo just above the barcode). Is that something that needs to be removed? Asking because it is not really a flimsy thing like other stickers it had all around.


----------



## warrior-kid

slayer6288 said:


> Is there an AIR Suprim 600watt out in the wild or no?


Don't think so; MSI are playing it cool, hey. People are just flashing a Strix one to great effect. Something on my to-do list down the line.


----------



## alesk76

hi,
yesterday I upgraded drivers from 522 to 546 on my Palit 4090 OC and frames dropped to half (during this time Windows 10 Pro also updated to 22H2)...
in the Pimax 8K X it's no longer possible to get 90FPS (iRacing, Assetto Corsa... before 120+, now around 60FPS at the same settings)... I then also updated to Windows 11 Pro; same problem...
am I the only one with this problem? I thought this was solved some time ago... Which Windows 10 or 11 do I have to install to get back to normal? I read somewhere there was some issue a while ago...
please help


----------



## warrior-kid

alesk76 said:


> hi,
> yesterday I upgrade drivers from 5.22 to 5.46 on my Palit 4090 OC and Frames drop to half (during this time also has been done windows 10 Pro update to 22H2 ...
> in pimax 8K X is possible no more to get 90FPS ... (iRacing ... assetto corsa ... before 120+ ... now around 60FPS)... I have also do then update to Windows 11 pro.. the same problem..
> am I the only eith this problem... I was thinking that this was solved some time ago... Which Windows do I have to install back to solve this problem...
> please for help


Looks like the new driver 546 is pretty awful according to many reports. Try to revert back to 522?


----------



## biigshow666

bmagnien said:


> New seasonic native cables, free with accompanying photo evidence of ownership:
> 
> 
> 
> 
> 
> 海韵推出12VHPWR模组线：国行在保电源可免费申请，以连接RTX 40系列 - 超能网
> 
> 
> 超能网（Expreview）专注于为主流科技产品提供全新视角的资讯，专注于100%高价值原创内容的创造，专注于最真实的体验式报道。
> 
> 
> 
> 
> m.expreview.com


I registered for this a few days ago and was approved within a few hours. Hopefully they ship quickly.


----------



## alesk76

warrior-kid said:


> Looks like the new driver 546 is pretty awful according to many reports. Try to revert back to 522?


Hi,
I also rolled back to 522, but the problem stays... I think it also has to do with some issue in the latest Windows update; my bad that I stepped on two ****s at the same time, the Windows update and the Nvidia driver update 
I work in IT and have tried everything... but no luck. Which Windows 10 Pro build (one that works as it should) is from before the reported issues with Windows Update and Nvidia drivers?


----------



## Spicedaddy

biigshow666 said:


> I registered for this a few days ago and was approved within a few hours. hoepfully they ship quick


Nice, I'll pick one up for my PX-1000. That's a million times better than nVidia's ridiculous adapter.

Connector is small also, so it provides good clearance for the side panel.


----------



## Isaias Angelis

Laithan said:


> Try using the "stress test" option in Furmark... if you're feeling lucky... well do ya?


That is what I did...
No progress!


----------



## cheddardonkey

N19htmare666 said:


> Like this
> 
> 
> 
> with the same overlays so we can compare side by side


Working on some comparisons now and will do a video on it. The short update is that in all instances, the memory reliability voltage is the limitation resulting in overall small gains.


----------



## ArcticZero

Reddit Megathread of melting 12VHPWR cables, with cases in the double digits now:


__
https://www.reddit.com/r/nvidia/comments/ydh1mh


----------



## warrior-kid

warrior-kid said:


> Need a little help. Just realised there is a transparent sticker of sorts on the Suprim X card (see photo just above the barcode). Is that something that needs to be removed? Asking because it is not really a flimsy thing like other stickers it had all around.
> View attachment 2579081


Replying to myself. Yep, just a sticker.


----------



## mirkendargen

Laithan said:


> ??? FP?
> 
> *Website blocked due to trojan*
> Website blocked: www.bykski.com
> 
> *Malwarebytes Browser Guard blocked this website because it may contain malware activity.*
> *We strongly recommend you do not continue.*


I don't know why your browser addons block Bykski, other than I assume because China ¯\_(ツ)_/¯

You can buy it from multiple Aliexpress stores though.


----------



## vigorito

New driver is a mess; rolled back to 522.25, all perfect.


----------



## J7SC

bmagnien said:


> New seasonic native cables, free with accompanying photo evidence of ownership:
> 
> 
> 
> 
> 
> 海韵推出12VHPWR模组线：国行在保电源可免费申请，以连接RTX 40系列 - 超能网
> 
> 
> 超能网（Expreview）专注于为主流科技产品提供全新视角的资讯，专注于100%高价值原创内容的创造，专注于最真实的体验式报道。
> 
> 
> 
> 
> m.expreview.com


I contacted Seasonic via customer service, and after providing the necessary evidence (PSU serial number & purchase receipt, GPU receipt), I was just notified that it was approved . Bykski block for the Giga-G-OC has also been ordered...


----------



## sniperpowa

Mad Pistol said:


> It depends. If you're strictly comparing benchmarks like 3DMark which have a multi-threaded CPU component, CPUs like the 5950x will be better due to additional cores/threads.
> 
> However, in real-world gaming, I have yet to run into a time where the RTX 4090 is being bottlenecked by a 5800x3D at 4K. In fact, in very specific circumstances, the 5800x3D is the fastest gaming CPU on the market. Shadow of the Tomb Raider appears to be one of them. I have yet to find a CPU that can produce a higher framerate at 720p lowest settings.
> 
> View attachment 2578586


My 13900K scores higher at 720p lowest. Still a good score.


----------



## sniperpowa

I never run these weird low res benches but was curious


----------



## Vlados

I noticed that the RTX 4090 Gigabyte Windforce does not have a thermal pad on one of the four VRAM DrMOS stages and its inductor; should I be worried about this? Although there are only 4 of them, and they are not placed near the GPU VRM. I noticed the 3090 also had models where one thermal pad was missing. It is also worth considering that memory power consumption is about half that of the 3090: at 65W, one DrMOS accounts for 16.25W, or about 12A, given that it is rated for 50A. Also, this DrMOS gets some cooling from the card's own fan, as it is placed next to the PCIe slot and air bounces off the motherboard onto it.
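The per-stage estimate in the post above can be written out as a sketch using the post's own figures (65W total, 4 stages, 50A rating) plus an assumed ~1.35V memory rail voltage; the rail voltage is an assumption for illustration, not a measured value:

```python
# Per-stage load estimate for the memory VRM, using the numbers from
# the post above. The 1.35V rail voltage is an assumed typical figure,
# not a measurement from the card.
MEM_POWER_W = 65.0      # total memory power draw cited in the post
STAGES = 4              # DrMOS stages feeding the VRAM
RAIL_V = 1.35           # assumed output rail voltage
STAGE_RATING_A = 50.0   # per-stage current rating cited in the post

per_stage_w = MEM_POWER_W / STAGES       # 16.25 W per stage
per_stage_a = per_stage_w / RAIL_V       # ~12 A per stage
headroom = per_stage_a / STAGE_RATING_A  # fraction of the 50A rating used
print(f"{per_stage_w:.2f} W, {per_stage_a:.1f} A, {headroom:.0%} of rating")
```

So each stage sits at roughly a quarter of its rating, which is consistent with the post's conclusion that the missing pad is probably not critical.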


----------



## GAN77

sniperpowa said:


> I never run these weird low res benches but was curious


720p? graphics settings?


----------



## sniperpowa

GAN77 said:


> 720p? graphics settings?


720p low. I can rerun it with a real screenshot; I was randomly testing for myself last night.


----------



## cheddardonkey

N19htmare666 said:


> Sweepersc uploaded here
> 
> 
> 
> 
> 
> SuprimX4090G.rom
> 
> 
> 
> 
> 
> 
> 
> drive.google.com


With this linked BIOS, the power slider goes to 125%; a TDP of 450W would make this a 562W max, and that is close to the maximum draw I am seeing, with a general steady draw of 520-530W.

Is this considered the "600w bios"?
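The arithmetic above works out as follows; a minimal sketch using the post's own figures (450W TDP, 125% slider cap), plain arithmetic rather than anything read from the BIOS itself:

```python
# Maximum board power implied by a power-limit slider setting.
# 450W TDP and the 125% slider cap are the figures from the post above.
def max_power(tdp_watts: float, slider_percent: float) -> float:
    """Return the board power cap in watts for a given slider position."""
    return tdp_watts * slider_percent / 100

print(max_power(450, 125))  # 562.5 -> the "562W max" in the post
```

A true 600W BIOS on a 450W TDP would instead need the slider to reach about 133%.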


----------



## HAQ0901

Guys, I have a quick question and I hope somebody knows… I wanted to source a 4090 Aorus, but I've instead purchased an Aorus Xtreme. Can anybody confirm they use the same PCB? So that I can purchase an EK waterblock and fit it into my custom loop…


----------



## vigorito

Did anyone manage to turn off the round RGB strip on the Strix? I'm using Aura Sync Lighting_Control_1.07.84_v2 and there are separate options for GPU/mobo etc., but none of them can turn off the LED strip. Basically I want only the "Republic of Gamers" text to be lit. Maybe Armoury Crate can do that, but I had so many troubles with that program that I just don't want to use it, even though I have an Asus mobo/GPU.


----------



## Arizor

vigorito said:


> Did anyone manage to turn off round rgb strip from strix? im using aura sync Lighting_Control_1.07.84_v2 and there is separate options for gpu/mobo etc but non of them are unable to turn off led strip,basicly i want only republic of gamers to lit,maybe armory crate can do that,but i had so many troubles with that program,i just dont want to use it even if i have asus mobo/gpu


Sadly, at least for the TUF, you need armoury crate, so I assume it’s the same for Strix.


----------



## Jordyn

I have a spare 120mm fan, so in terms of dispelling heat from the GB Gaming OC backplate, what would be the most effective method? Blowing or pulling air at the solid metal part of the backplate, or at the section with the exposed fin stacks?


----------



## bmagnien

Jordyn said:


> I have a spare 120mm fan so in terms of disspelling heat from the GB Gaming OC backplate, what would be the most effective method? Blowing or pulling air at the metal part backplate or from section with the exposed finstacks.


Blowing fresh air on to the solid backplate closest to the I/O, directly behind the core. You could get some of those stick on aluminum heat sinks from Amazon to add surface area for the fan to cool.


----------



## J7SC

Jordyn said:


> I have a spare 120mm fan so in terms of disspelling heat from the GB Gaming OC backplate, what would be the most effective method? Blowing or pulling air at the metal part backplate or from section with the exposed finstacks.





bmagnien said:


> Blowing fresh air on to the solid backplate closest to the I/O, directly behind the core. You could get some of those stick on aluminum heat sinks from Amazon to add surface area for the fan to cool.


^^ this...and I actually angled it just slightly so that some of the air redirects back over the (in)famous connector / dongle area.


----------



## N19htmare666

cheddardonkey said:


> Working on some comparisons now and will do a video on it. The short update is that in all instances, the memory reliability voltage is the limitation resulting in overall small gains.


Assuming you are using max voltage in MSI Afterburner? (In settings, check the box for "Unlock voltage control.")


----------



## Panchovix

Got the 4090 TUF! Surprisingly, it was in stock here in Chile.

I installed it, though, and immediately limited the power to 250-300W; the TUF 4090 seems to be one of the cards that gets burnt the most, so I can't risk that until I get the cable (I'm actually thinking about using the 3080 in the meantime).

Some size comparisons vs the Zotac 2080 Ti AMP and ASUS 3080 TUF



Spoiler: Comparison pics


----------



## N19htmare666

vigorito said:


> New driver is a mess,rolled back to 522.25 all perfect.


Probably will need to wait for next card launch for a driver that works better than 522.25


----------



## LunaP

N19htmare666 said:


> Probably will need to wait for next card launch for a driver that works better than 522.25


Is 522.25 the recommended one? I'm on 526.47 atm; haven't noticed issues, but I've only gamed and done VR so far.


----------



## N19htmare666

cheddardonkey said:


> With this linked bios, the power slider goes to 125% a TDP of 450w would make this a 562w max and that is close to what I am see the maximum draw at.with a general steady draw of 520-530w.
> 
> Is this considered the "600w bios"


Can you run nvidia-smi, GPU-Z and MSI Afterburner and screenshot them like the second screenshot in this post:


sweepersc said:


> I checked the Suprim Liquid X vbioses uploaded on TPU (95.02.18.40.4C) and it does say 530W. However, that is not the vbios my card came with, mine is 95.02.18.00.CD. Here's the Nvidia SMI screenshots:
> Stock
> View attachment 2577279
> 
> 
> Max PL
> View attachment 2577280
> 
> I also tried uploading in TPU but it said that registration from certain countries are no longer accepted. I put it in GDrive, here is the link . Can someone upload this on TPU please?
> 
> Regards,
> Mark


----------



## N19htmare666

LunaP said:


> Is 522.25 the recommended? I'm on 526.47 atm havne't noticed issues but only gamed and done VR so far.


In my testing, 512.15 was fast, then every driver after was slower until 522.25. Coincidence that 522.25 came out with the launch of the 4090? I don't think so. Nvidia giving themselves an extra boost for the launch-day benchmarks, in my opinion.


----------



## yzonker

I think I'm going to delay mounting my card in the block. I really don't want to take it apart until the dust settles on connectorgate. I'm using the Corsair cable, so it's maybe a non-issue, but I don't think anyone knows what's really going wrong with them.

Buildzoid had a different opinion in his video that I tend to agree with: just poor contact in the pins themselves, not a bad solder job or something like that. Not sure why the NVIDIA adapter would be worse in this case, but possibly it's just cheap, as he mentioned.

The real issue IMO is that the connector just doesn't have enough safety margin. If everything is making good contact then it works, but if even one pin has somewhat bad contact, there isn't enough margin to handle the current.

Yes, six 12V pins are capable of carrying the current, as I well know since I ran a 2x8-pin 3090 on the KP 1kW BIOS for almost 2 years. But it is definitely pushing the limits, and I always had good cables, airflow, and never used extension cables or anything that would increase resistance.
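The margin argument can be put into rough numbers. A sketch, assuming a 600W load, the 12V rail, and the commonly cited ~9.5A per-pin terminal rating; that rating is an assumption here, not something from this thread:

```python
# Per-pin current when a 12VHPWR load is shared across the six 12V
# pins, vs. when one pin loses contact. The 9.5A per-pin rating is
# the commonly cited figure and is assumed, not measured.
PIN_RATING_A = 9.5
RAIL_V = 12.0

def per_pin_current(load_watts: float, good_pins: int) -> float:
    """Current per pin if the load splits evenly over the pins in contact."""
    return load_watts / RAIL_V / good_pins

print(per_pin_current(600, 6))  # ~8.3 A per pin: inside the rating
print(per_pin_current(600, 5))  # 10.0 A per pin: already over 9.5 A
```

With all six pins seated, each carries about 8.3A against a 9.5A rating; lose just one and the remainder are already over spec, which matches the "no margin for one bad contact" point above.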


----------



## Arizor

yzonker said:


> I think I'm going to delay mounting my card in the block. I really don't want to take it apart until the dust settles on connectorgate. I'm using the Corsair cable so maybe a non-issue but I don't think anyone knows for what's really going wrong with them.
> 
> Buildzoid had a different opinion in his video that I tend agree with. Just poor contact in the pins themselves, not a bad solder job or something like that. Not sure why the NVIDIA adapter would be worse in this case, but possibly it's just cheap as he mentioned.
> 
> The real issue IMO is the connector just doesn't have enough safety margin. If everything is making good contact then it works, but if even one has somewhat bad contact, there isn't enough margin to handle the current then.
> 
> Yes six 12v pins are capable of carrying the current as I well know since I ran a 2x8pin 3090 on the KP 1kw bios for almost 2 years. But it is definitely pushing the limits and I always had good cables, airflow, and never used extension cables or anything that would increase resistance.


Yeah I don't think anyone's conclusively proved anything yet.

As I await my Cablemod delivery, still find myself doing my daily ritual of unplugging and checking for melted plastic. So far so good, but every time I find myself thinking through the dilemma (after finding no issues) "Well there's no issue here, but now, plugging it back in again, am I _creating_ the issue?"

Schrodinger's cable...


----------



## J7SC

Arizor said:


> Yeah I don't think anyone's conclusively proved anything yet.
> 
> As I await my Cablemod delivery, still find myself doing my daily ritual of unplugging and checking for melted plastic. So far so good, but every time I find myself thinking through the dilemma (after finding no issues) "Well there's no issue here, but now, plugging it back in again, am I _creating_ the issue?"
> 
> Schrodinger's cable...


...also, wasn't there a limited number of plug-in/unplug cycles for this (and other) style of connector? But per @yzonker 's post above, it pays to be careful for now...I've got a proper replacement cable coming from Seasonic, but until then I won't do much full-bore OC'ing, just stock everything (which, frankly, is still very fast). Once I'm satisfied that it deals with the issue and the connector on the card side is fine (and my ordered waterblock has arrived in the meantime), I'll water-cool the card.


----------



## DokoBG

I am not overclocking my card and using it with 50% power limit until my cables arrive.


----------



## Jordyn

bmagnien said:


> Blowing fresh air on to the solid backplate closest to the I/O, directly behind the core. You could get some of those stick on aluminum heat sinks from Amazon to add surface area for the fan to cool.





J7SC said:


> ^^ this...and I actually angled it just slightly so that some of the air redirects back over the (in)famous connector / dongle area.


Thanks Guys, I am planning to open up my case to check my cable so will add the fan back in at the same time.



Arizor said:


> Yeah I don't think anyone's conclusively proved anything yet.
> 
> As I await my Cablemod delivery, still find myself doing my daily ritual of unplugging and checking for melted plastic. So far so good, but every time I find myself thinking through the dilemma (after finding no issues) "Well there's no issue here, but now, plugging it back in again, am I _creating_ the issue?"
> 
> Schrodinger's cable...


Based on everything I've seen so far, primarily Buildzoid's video, limiting the risk of the pins separating seems like the best bet at this stage. Still curious to see what is happening with mine, as I have been running at 133% power since day one with some fairly long gaming sessions.


----------



## Jetlain

Hello, I'm getting only a 69878 GPU score in the Fire Strike test. Why is my Fire Strike score so low?
Time Spy Extreme looks fine.


----------



## Arizor

J7SC said:


> ...also, wasn't there a limited number of plug-in / unplug cycles for this (and other style) connectors ? But per @yzonker 's post above, it pays to be careful for now...I've got a proper replacement cable coming from Seasonic, but until then, I won't do much full-bore oc'ing, just stock everything (which frankly, is still very fast). Once I am satisfied that that will deal with the issues and the connector card-side is fine (and my ordered waterblock arrived in the meantime), I'll w-cool the card.


Yep, 30 plug/unplug cycles is the guideline. Running at my 0.95V undervolt, very cautiously playing short sessions of Plague Tale until my CableMod arrives, hopefully Monday…!


----------



## ArcticZero

Has anyone tried these cables from ModDIY?









ATX 3.0 PCIe 5.0 600W Triple 8 Pin to 12VHPWR 16 Pin Power Cable


Buy ATX 3.0 PCIe 5.0 600W Triple 8 Pin to 12VHPWR 16 Pin Power Cable for $24.99 with Free Shipping Worldwide (In Stock)




www.moddiy.com





Curious if they are 16AWG since these seem like a more immediately available option here vs Cablemod. Seasonic needs proof of purchase of the GPU so until I get my order in, I won't be able to claim a free cable, and I really don't feel comfortable using the card on the provided adapter.


----------



## Antsu

Guys, chill out on the connector. Just don't bend it too much and it will be fine.



Jetlain said:


> View attachment 2579139
> 
> 
> 
> 
> View attachment 2579140
> 
> 
> Hello, I'm getting only 69878 GPU score in Fire strike test. Why is my firestrike score so low?
> timespy extreme looks fine.


Windows 10 with E-cores enabled? Could be that FS is using only E-cores... You should score more like 90K on graphics with 13th gen.


----------



## KedarWolf

About the two BIOSes:

If you check in nvidia-smi, the 133% power limit BIOS is 599W with the Afterburner power slider maxed out, and the 120% BIOS is 600W with the slider maxed out.

From what I understand, a higher power limit BIOS is the better one. So isn't the higher power limit BIOS the Performance one, and the better choice?
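For cross-checking the two BIOSes, the limits can also be read straight from `nvidia-smi -q -d POWER`. A minimal parsing sketch follows; the sample text and its exact field names are illustrative (they vary a bit between driver versions), not captured from a real card:

```python
import re

# Parse "<label> : <value> W" lines from `nvidia-smi -q -d POWER` output.
# The sample below is illustrative; on a real system you would capture the
# text with subprocess.run(["nvidia-smi", "-q", "-d", "POWER"], ...).
sample = """\
    Power Readings
        Power Draw                        : 123.45 W
        Current Power Limit               : 450.00 W
        Default Power Limit               : 450.00 W
        Max Power Limit                   : 600.00 W
"""

def power_limits(text: str) -> dict:
    """Return all '<label> : <value> W' pairs as label -> float watts."""
    pairs = re.findall(r"(\w[\w ]*?)\s*:\s*([\d.]+) W", text)
    return {label.strip(): float(value) for label, value in pairs}

limits = power_limits(sample)
print(limits["Max Power Limit"])  # 600.0
```

Whichever BIOS reports the higher "Max Power Limit" here is the one with more headroom, regardless of what percentage the slider shows.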


----------



## mirkendargen

ArcticZero said:


> Has anyone tried these cables from ModDIY?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ATX 3.0 PCIe 5.0 600W Triple 8 Pin to 12VHPWR 16 Pin Power Cable
> 
> 
> Buy ATX 3.0 PCIe 5.0 600W Triple 8 Pin to 12VHPWR 16 Pin Power Cable for $24.99 with Free Shipping Worldwide (In Stock)
> 
> 
> 
> 
> www.moddiy.com
> 
> 
> 
> 
> 
> Curious if they are 16AWG since these seem like a more immediately available option here vs Cablemod. Seasonic needs proof of purchase of the GPU so until I get my order in, I won't be able to claim a free cable, and I really don't feel comfortable using the card on the provided adapter.


The wire gauge really isn't a deal maker/breaker here unless it's something ridiculously small; 18AWG is rated for more than enough amperage. I'm using a custom ModDIY cable, not that specific one but probably the same wire. It's not even warm to the touch at high load.

No one's had a problem with the insulation on their cables burning off; it's all in the adapter.


----------



## cheddardonkey

N19htmare666 said:


> Can you run nvidia-smi and gpu-z, MSI afterburner and screenshot like in the second screenshot in this post:


Here it is, and yes, I think it is 600W: a 480W limit (up from 450W) x 1.25.
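The slider math checks out; a trivial sanity check, assuming a 480W default limit and a 125% maximum:

```python
# 4090 power-limit sanity check: a 480 W default limit with the
# slider maxed at 125% gives the full 600 W.
default_limit_w = 480
max_slider = 1.25
max_power_w = default_limit_w * max_slider
print(max_power_w)  # 600.0
```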


----------



## Panchovix

DokoBG said:


> I am not overclocking my card and using it with 50% power limit until my cables arrive.


Same here for now. Luckily it still performs really well at 250-300W, but I can't risk overclocking, benching, etc. for now.


----------



## cheddardonkey

N19htmare666 said:


> Assuming you are using Max voltage in MSI afterburner? (And settings and check the box for "Unlock voltage control.")


Yes, you'll see my afterburner settings on the screenshot, I cannot get it stable beyond these values.


----------



## KedarWolf

cheddardonkey said:


> Yes, you'll see my afterburner settings on the screenshot, I cannot get it stable beyond these values.
> View attachment 2579147


Sorry man.

People dissed me for paying so much for a Strix OC, but I run this 24/7 Time Spy, Port Royal, EZBench, Superposition and Cyberpunk totally stable with zero artefacts or crashes.

I only locked the voltages to show how high it's clocked, but I don't run it locked.

And I get max 64C core, 64C hotspot with the default fan profile.


----------



## mirkendargen

KedarWolf said:


> Sorry man.
> 
> People dissed me for paying so much for a Strix OC, but I run this 24/7 Time Spy, Port Royal, EZBench, Superposition and Cyberpunk totally stable with zero artefacts or crashes.
> 
> I only locked the voltages to show how high it's clocked, but I don't run it locked.
> 
> And I get max 64C core, 64C hotspot with the default fan profile.
> 
> View attachment 2579154


The clocks are probably luck of the draw not any kind of Strix binning, but the temps are definitely the massive Strix cooler doing work. No delta between core and hotspot is pretty remarkable (means an amazing paste job and cold plate uniformity).


----------



## ZealotKi11er

KedarWolf said:


> Sorry man.
> 
> People dissed me for paying so much for a Strix OC, but I run this 24/7 Time Spy, Port Royal, EZBench, Superposition and Cyberpunk totally stable with zero artefacts or crashes.
> 
> I only locked the voltages to show how high it's clocked, but I don't run it locked.
> 
> And I get max 64C core, 64C hotspot with the default fan profile.
> 
> View attachment 2579154


You just got lucky; nothing to do with it being a Strix OC. At the end of the day, was $500 over something like the Giga OC worth 2-3% more perf?


----------



## Antsu

KedarWolf said:


> Sorry man.
> 
> People dissed me for paying so much for a Strix OC, but I run this 24/7 Time Spy, Port Royal, EZBench, Superposition and Cyberpunk totally stable with zero artefacts or crashes.
> 
> I only locked the voltages to show how high it's clocked, but I don't run it locked.
> 
> And I get max 64C core, 64C hotspot with the default fan profile.
> 
> View attachment 2579154


Strix has never been a binned chip, maybe it's different this time around, but I don't think so. You just got lucky, and it happened to be a Strix. Very nice sample if you can actually sustain those clocks on air. My first 3090 Strix was worse than the TUF 3090 my friend bought from the same batch.


----------



## changboy

The worst thing about this 4090 isn't the cable; it's that there are no nice new games to use it on.
I've already played through Red Dead 2 and Shadow of the Tomb Raider two or three times with my 3090, plus some other games.
I checked what's coming even in 2023 and I don't see many games for this card. Benchmarks are fun for a while, but we really need new titles to use it. When I got my 3090, I remember there were many new games, but now nothing.

Usually when they launch a new card it comes with a new game to show off the new tech, like DLSS 3.0 now; when I bought the 3090 they gave away Watch Dogs: Legion with DLSS 2.0.


----------



## Sayenah

KedarWolf said:


> Sorry man.
> 
> People dissed me for paying so much for a Strix OC, but I run this 24/7 Time Spy, Port Royal, EZBench, Superposition and Cyberpunk totally stable with zero artefacts or crashes.
> 
> I only locked the voltages to show how high it's clocked, but I don't run it locked.
> 
> And I get max 64C core, 64C hotspot with the default fan profile.
> 
> View attachment 2579154


So happy to see that! I have a Strix as well. How are you “locking” the voltage? Aren’t these cards voltage locked?


----------



## KedarWolf

mirkendargen said:


> The clocks are probably luck of the draw not any kind of Strix binning, but the temps are definitely the massive Strix cooler doing work. No delta between core and hotspot is pretty remarkable (means an amazing paste job and cold plate uniformity).


Actually, I tested a 4K Port Royal custom run: temps on the core hit 61°C or so, and near the end of the run the hotspot was around 73-77°C max with a custom fan curve, fans maxing out at 66% at 61°C.

Warm ambient temps though.

brb, going to test Cyberpunk maxed out.


----------



## Arizor

changboy said:


> The worst thing about this 4090 isn't the cable; it's that there are no nice new games to use it on.
> I've already played through Red Dead 2 and Shadow of the Tomb Raider two or three times with my 3090, plus some other games.
> I checked what's coming even in 2023 and I don't see many games for this card. Benchmarks are fun for a while, but we really need new titles to use it. When I got my 3090, I remember there were many new games, but now nothing.
> 
> Usually when they launch a new card it comes with a new game to show off the new tech, like DLSS 3.0 now; when I bought the 3090 they gave away Watch Dogs: Legion with DLSS 2.0.


Plague Tale is great. I'm also looking forward to Portal RTX, Cyberpunk Overdrive mode (both November, apparently), RTX Witcher 3, and The Callisto Protocol (ETA December).


----------



## KedarWolf

Cyberpunk maxed out, core, 51C, hotspot varies from 59-69C but mostly stays around 63-64C.


----------



## KedarWolf

Sayenah said:


> So happy to see that! I have a Strix as well. How are you “locking” the voltage? Aren’t these cards voltage locked?


If you hit Ctrl+F while Afterburner has focus, it'll bring up the voltage curve; then hit Ctrl+L on a voltage point on the curve and it'll lock to that point. If you hit Ctrl+L on a voltage point higher than 1.1V, it'll lock to the clock you have at 1.1V. It'll still down-bin with temps etc. when a game is running, but with no game running it'll show the max clock you have at 1.1V.

This is what it looks like locked at 1.1v.


----------



## Laithan

Antsu said:


> Strix has never been a binned chip, maybe it's different this time around, but I don't think so. You just got lucky, and it happened to be a Strix. Very nice sample if you can actually sustain those clocks on air. My first 3090 Strix was worse than the TUF 3090 my friend bought from the same batch.


Yup... another example: EVGA was selling 3090 Ti Black, regular, and Ultra models, with a $200 price difference across the three models, yet identical boards with just a different BIOS... many people that bought Ultras were getting lower clocks than those with Blacks... there was definitely no distinguishable pattern or advantage to buying the Ultra. I think binning is certainly a real possibility and can be done... but I don't think any company exercises the opportunity to actually do it (thoroughly and properly) anymore.


----------



## Sayenah

KedarWolf said:


> If you hit Ctrl+F while Afterburner has focus, it'll bring up the voltage curve; then hit Ctrl+L on a voltage point on the curve and it'll lock to that point. If you hit Ctrl+L on a voltage point higher than 1.1V, it'll lock to the clock you have at 1.1V. It'll still down-bin with temps etc. when a game is running, but with no game running it'll show the max clock you have at 1.1V.
> 
> This is what it looks like locked at 1.1v.
> 
> View attachment 2579157


Oh, thank you! I am trying this now. I couldn't get any stability above 3075… but I think I can push it more with this.

Also, with Afterburner, no matter what I do I can't control the fan curve, and I don't know why. Even if I set the fans manually to, say, 85%, they fall back down to the default auto 30%.


----------



## Arizor

Sayenah said:


> Oh thank you! I am trying this now. I couldn’t get any stability above 3075… but I think I can push it more with this.
> 
> also, with afterburner, no matter what I do I can’t control the fan curve. I don’t know why.Even if I define them manually to, say, 85% they fall back down to default auto 30%


Afterburner can be very selective about which boards work with its fan control.

I always use this app instead for all fans, fantastic software GitHub - Rem0o/FanControl.Releases: This is the release repository for Fan Control, a highly customizable fan controlling software for Windows.


----------



## KedarWolf

Sayenah said:


> Oh thank you! I am trying this now. I couldn’t get any stability above 3075… but I think I can push it more with this.
> 
> also, with afterburner, no matter what I do I can’t control the fan curve. I don’t know why.Even if I define them manually to, say, 85% they fall back down to default auto 30%


Edit: You need to manually set a curve in Afterburner's properties and leave Auto on, my bad.


----------



## Arizor

What FPS are we getting with the Cyberpunk benchmark folks? Set to Raytracing Ultra preset and turn DLSS off?

I get 43 with my 0.95V undervolt, 46 with my OC settings.


----------



## N19htmare666

cheddardonkey said:


> Here it is, and yes, I think it is 600W: a 480W limit (up from 450W) x 1.25.
> View attachment 2579143


Just one more: GPU-Z, Advanced tab, 'NVIDIA BIOS' from the drop-down menu. Can you screenshot the power limit section? Many thanks.


----------



## Sayenah

Arizor said:


> What FPS are we getting with the Cyberpunk benchmark folks? Set to Raytracing Ultra preset and turn DLSS off?
> 
> I get 43 with my 0.95V undervolt, 46 with my OC settings.


just ran it. 47 on the dot here! My OC is +220 and +1650 on mem

can you share your Port Royal score? I am at 28098 (same username). Was hoping to be in the top 100 haha

EDIT: +1700 on mem clock. Not +1650

12900k/z690Apex/6400c32/AX1600i


----------



## J7SC

Arizor said:


> What FPS are we getting with the Cyberpunk benchmark folks? Set to Raytracing Ultra preset and turn DLSS off?
> 
> I get 43 with my 0.95V undervolt, 46 with my OC settings.


...last time I checked 46.13 fps w/Ultra, DLSS off, but that was day one w/o optimizations.


----------



## KedarWolf

Sayenah said:


> just ran it. 47 on the dot here! My OC is +220 and +1650 on mem
> 
> can you share your Port Royal score? I am at 28098 (same username). Was hoping to be in the top 100 haha
> 
> EDIT: +1700 on mem clock. Not +1650
> 
> 12900k/z690Apex/6400c32/AX1600i











I scored 28 983 in Port Royal


AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com


----------



## Arizor

Sayenah said:


> just ran it. 47 on the dot here! My OC is +220 and +1650 on mem
> 
> can you share your Port Royal score? I am at 28098 (same username). Was hoping to be in the top 100 haha
> 
> EDIT: +1700 on mem clock. Not +1650
> 
> 12900k/z690Apex/6400c32/AX1600i


Thanks, gents. Port Royal is acting really weird for me, so I don't run it often. I can run other stress tests for days and games for hours, but Port Royal has a habit of crashing instantly for me no matter what my GPU clocks are.

When it was running ok, here's my best score - I scored 28 045 in Port Royal


----------



## Sayenah

KedarWolf said:


> I scored 28 983 in Port Royal
> 
> 
> AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
> 
> 
> 
> 
> www.3dmark.com


I… need to buy and try three more Strix cards haha 😂 that’s a lovely card you have there man. Mine has trouble remaining stable above 3070 (average 3045). But at 3070 with memory at 1710, I am rock solid stable. Passed a bunch of stress tests, no crashes, no artifacting… and, no coil whine. Zero. I have to open the case and put my ear next to the damn thing to hear that buzz.

Tried a SuprimX Liquid before the Strix and the coil whine drove me nuts. Returned. The card was practically singing


----------



## Antsu

Sayenah said:


> I… need to buy and try three more Strix cards haha 😂 that’s a lovely card you have there man. Mine has trouble remaining stable above 3070 (average 3045). But at 3070 with memory at 1710, I am rock solid stable. Passed a bunch of stress tests, no crashes, no artifacting… and, no coil whine. Zero. I have to open the case and put my ear next to the damn thing to hear that buzz.
> 
> Tried a SuprimX Liquid before the Strix and the coil whine drove me nuts. Returned. The card was practically singing


Lucky you! My 3090 STRIX had insane coil whine, the worst I've heard in the 20 years I've been building PC's. It was barely audible with the stock cooler, but after slapping a WB backplate on it she turned into a real singer.


----------



## Sayenah

Antsu said:


> Lucky you! My 3090 STRIX had insane coil whine, the worst I've heard in the 20 years I've been building PC's. It was barely audible with the stock cooler, but after slapping a WB backplate on it she turned into a real singer.


Yup, the absolute lack of coil whine is the reason I haven't returned this Strix yet. Not too happy with the chip, but this seems like a good compromise.

My 3090 and 3090Ti though… those were a different story altogether.


----------



## ZealotKi11er

changboy said:


> Worst thing from this 4090 its not the cable but there is no new nice game to use it.
> I already done 2 or 3 time red dead 2 and shadow of tomb raider with my 3090 and some other game.
> I check what game is coming even in 2023 and i dont see much game to use this card, benchmark is fun for a time but we really need new title game to use it. When i got my 3090, i remember have many new game but now nothing
> 
> Most of the time they laugh a new card it come with a new game to use the new tech, like now its dlss 3.0, when i bought the 3090 they gave watch dog legion with dlss 2.0


Yep, I feel the same. At least with RTX 30 I played CP2077. The next big thing on PC is Starfield, but who knows if you need a 4090 for that. Most next-gen UE5 games are late 2023-24.


----------



## J7SC

Arizor said:


> Thanks gents. Port Royal is acting really weird for me, so I don't run it often. I can run other Stress Tests for days, games for hours, but Port Royal has the habit of crashing instantly for me no matter what my GPU clocks.
> 
> When it was running ok, here's my best score - I scored 28 045 in Port Royal


...Weird that Port Royal is weird on your system...That's the one bench I never have any problems with. Highest I subbed was 28443; there was also a low-29k run, but it seemed to have some (minor) artifacts near the end, so I didn't sub it.

...I haven't benched in almost a week because a) the 4-into-1 dongle issue, and b) while the card has more clock headroom (3.2 GHz on light 3D loads < 40°C), it just gets too hot (hotspot > 90°C at 25°C ambient). A waterblock is ordered, as is a 12VHPWR single-strand cable from Seasonic (free via their review process). Once all that is sorted, I look forward to really giving the Giga-G-OC the beans. I just hope the 'upgrade slippery slope' bug doesn't bite me in the meantime... no, no, no, no 13900K, no, no


----------



## Arizor

J7SC said:


> ...Weird that Port Royal is weird on your system  ...That's the one bench I never have any problems with. Highest I subbed was 28443, though there was a low 29k one but it seemed to have some (minor) artifacts near the end, so not subbed.
> 
> ...I haven't benched in almost a week because a) the 4-into-1 dongle issue, and b) while the card has more clock headroom (3.2 GHz on light 3D loads < 40°C), it just gets too hot (hotspot > 90°C at 25°C ambient). A waterblock is ordered, as is a 12VHPWR single-strand cable from Seasonic (free via their review process). Once all that is sorted, I look forward to really giving the Giga-G-OC the beans. I just hope the 'upgrade slippery slope' bug doesn't bite me in the meantime... no, no, no, no 13900K, no, no


Yep, same: Port Royal used to run no matter what, whilst TS Extreme would crash all the time on my 30-series card.

Definitely holding off too, trying to limit my wattage to 300 draw until my cablemod gets here.

Look forward to seeing your new 13900k rig set up soon


----------



## J7SC

Arizor said:


> Yep same, Port Royal used to run no matter what whilst TS Extreme would crash all the time for my 3 series.
> 
> Definitely holding off too, trying to limit my wattage to 300 draw until my cablemod gets here.
> 
> Look forward to seeing your new 13900k rig set up soon


I actually posted about purchasing an MSI Unify-X last spring, but it is doing something else now with Alder Lake and it would be too complex to reconfigure, so I'm slurping up all the leaks about Meteor Lake and AM5 3D V-Cache...


----------



## cheddardonkey

N19htmare666 said:


> Just one more. Gpu-z - advanced tab, 'nvidia bios' from the drop-down menu. Can you screenshot the power limit section. Many thanks





----------



## Sayenah

J7SC said:


> ...Weird that Port Royal is weird on your system  ...That's the one bench I never have any problems with. Highest I subbed was 28443, though there was a low 29k one but it seemed to have some (minor) artifacts near the end, so not subbed.
> 
> ...I haven't benched in almost a week because a) the 4-into-1 dongle issue, and b) while the card has more clock headroom (3.2 GHz on light 3D loads < 40°C), it just gets too hot (hotspot > 90°C at 25°C ambient). A waterblock is ordered, as is a 12VHPWR single-strand cable from Seasonic (free via their review process). Once all that is sorted, I look forward to really giving the Giga-G-OC the beans. I just hope the 'upgrade slippery slope' bug doesn't bite me in the meantime... no, no, no, no 13900K, no, no


Hi, which card do you have?


----------



## J7SC

Sayenah said:


> Hi, which card do you have?


Gigabyte Gaming OC


----------



## originxt

KedarWolf said:


> If you hit Ctrl+F while Afterburner has focus, it'll bring up the voltage curve; then hit Ctrl+L on a voltage point on the curve and it'll lock to that point. If you hit Ctrl+L on a voltage point higher than 1.1V, it'll lock to the clock you have at 1.1V. It'll still down-bin with temps etc. when a game is running, but with no game running it'll show the max clock you have at 1.1V.
> 
> This is what it looks like locked at 1.1v.
> 
> View attachment 2579157


At the risk of sounding daft, how did you get your y-axis values past 3k? Literally stuck lol.


----------



## Recipe7

I was able to get the EKWB block installed on my FE card the other day and had time today to do some benching. I am saddened to barely pass 25k in Port Royal.

I'm running a 9900K at 5GHz and the 4090 at a power limit of 133%. I am not able to pass 450 watts in any benchmarking or gaming.

I reinstalled the drivers and reinstalled the apps but am getting the same results.

I am using 4 dedicated/separate PCIe cables connected to the NVIDIA adapter, and I am waiting for the CableMod 12-pin to 3x8-pin to come in.

Correct me if I'm wrong, but I read about 40 pages back that no matter the cable configuration, I will be limited by the PSU (Corsair AX860) if it can't handle 600 watts? I thought that as long as I have 4 cables plugged into the adapter it would run past 450 watts. How am I able to set the power target to 133% and still be unable to pass 450 watts?

Also, MSI Afterburner will not lock my voltage to what I set. If I set it to 1.0V, it fluctuates back and forth between 1.0V and 1.05V. Anyone else having that issue?


----------



## mirkendargen

Recipe7 said:


> I was able to get my EKWB for my FE card installed the other day and had time today to do some benching. I am saddened to barely pass 25k in port royal.
> 
> I’m running a 9900k at 5ghz and the 4090 at a power limit of 133%. I am not able to pass 450 watts in any benchmarking or gaming.
> 
> I reinstalled the drivers and reinstalled the apps but getting the same results.
> 
> I am using 4 dedicated/separate pci-e cables, connected to the nvidia adapter. I am waiting for the cablemod 12 pin to 3x8 pin to come In.
> 
> Correct me if I’m wrong, but I read about 40 pages back that no matter the configuration of the cables I’m using with the psu (Corsair ax860), I will be limited by the psu if it can’t handle the 600 watts? I thought that as long as I have 4 cables plugged into the adapter it will run past 450 watts. How am I able to change the power target to 133% and still be unable to pass 450 watts?
> 
> also, msi afterburner will not lock my voltage to what I set it at. If I set it to 1.0v, it will fluctuate back and forth between 1.0v and 1.05v. Anyone else having that issue?


I remember reading from some reviewer that if you've ever run the card with only 3 connectors plugged in, you need to run DDU and clean-install the drivers after plugging in all 4 to correctly get 600W.


----------



## Sayenah

I have the Strix 4090 on a Z690 Apex board with 12900k. I can’t for the life of me figure out this mystery: why don’t I get any video during POST? This only happens with HDMI though, and not DP (tried multiple HDMI cables and multiple monitors).


----------



## EarlZ

Sayenah said:


> I have the Strix 4090 on a Z690 Apex board with 12900k. I can’t for the life of me figure out this mystery: why don’t I get any video during POST? This only happens with HDMI though, and not DP (tried multiple HDMI cables and multiple monitors).


I think JayzTwoCents also had a similar problem and is waiting for a vBIOS update.


----------



## cheddardonkey

Well, these are some interesting 500W vs 600W BIOS test results... N19htmare666

The 500W BIOS is outperforming the 600W BIOS, it would seem.

*MSI SUPRIM LIQUID 600W BIOS - AORUS WATERFORCE 4090 AIO CARD.*








*AORUS - CONTROL - FACTORY 500W BIOS*


----------



## GanMenglin

Sayenah said:


> I have the Strix 4090 on a Z690 Apex board with 12900k. I can’t for the life of me figure out this mystery: why don’t I get any video during POST? This only happens with HDMI though, and not DP (tried multiple HDMI cables and multiple monitors).


I've got the same build and it works fine. Have you tried updating the vBIOS? The latest version is V1 on the ROG website.


----------



## Sayenah

GanMenglin said:


> I got the same build, it works fine. Have you tried to update the vBios? Latest version is V1 on ROG website.


Yup, updated everything: the vBIOS and the board BIOS.

Are you running HDMI? My main monitor is a G9 Neo, and my OLED upstairs is on a fiber HDMI run. Same issue on both.


----------



## Pollomir

Antsu said:


> Lucky you! My 3090 STRIX had insane coil whine, the worst I've heard in the 20 years I've been building PC's. It was barely audible with the stock cooler, but after slapping a WB backplate on it she turned into a real singer.


That's what I'm afraid of. Mine has very slight coil whine, but with the waterblock I'm pretty sure it's going to be much worse.


----------



## mattskiiau

Gonna take 2 months to get a cablemod 12vhpwr to my door in Australia. Corsair cable about the same.

A race between stock and my house burning down. Who will win? 😢

Question: I saw someone mention they're limiting the power limit to 50%. Think that'll prevent the connector from overheating? 🤔


----------



## Arizor

mattskiiau said:


> Gonna take 2 months to get a cablemod 12vhpwr to my door in Australia. Corsair cable about the same.
> 
> A race between stock and my house burning down. Who will win? 😢
> 
> Question: I saw someone mention they are limiting power limit to 50%. Think that'll prevent overheating the connector? 🤔


Hey mate you should be able to grab one in a week or so from BPC Tech here in Oz - Corsair 600W PCIe 5.0 12VHPWR Type-4 PSU Power Cable


----------



## katates

Hello, are cards with coil whine common? I sent back my Gaming X Trio and bought a TUF OC, and the result is still the same. Should I also return this TUF? From what I can see, the power cable problem comes up much more often than coil whine threads, so I'm thinking most cards don't have coil whine?


----------



## Daemon_xd

This is how the 12VHPWR is done correctly: Crimping instead of soldering and how to supply the GeForce RTX 4090 with power properly | igor'sLAB


First there were bent cables, then annoyed buyers of the at least 2000 euro expensive NVIDIA GeForce RTX 4090, no matter which model or manufacturer. The manufacturer of the 4-pin adapter from 12VHPWR…




www.igorslab.de




The follow-up post from igor'sLAB about the 12VHPWR connector.


----------



## Tideman

katates said:


> Hello, are cards with coil whine common? I sent back my Gaming X Trio and bought a TUF OC, and the result is still the same. Should I also return this TUF? From what I can see, the power cable problem comes up much more often than coil whine threads, so I'm thinking most cards don't have coil whine?


Probably all of them have some degree of coil whine, but it seems the Asus cards are among the worst. I'm returning my TUF for this reason.

My Gigabyte Gaming OC has coil whine, but it's quieter and also more 'stable' than the TUF, which would constantly change pitch.


----------



## N19htmare666

cheddardonkey said:


> *
> View attachment 2579172
> *


Not this one; you have to select 'NVIDIA BIOS' from the drop-down first.


----------



## N19htmare666

cheddardonkey said:


> Well, these are some interesting 500w vs 600w Bios test results.. N19htmare666
> 
> 500w bios is outperforming 600w bios it would seem.
> 
> *MSI SUPRIM LIQUID 600W BIOS - AORUS WATERFORCE 4090 AIO CARD.*
> View attachment 2579189
> 
> *AORUS - CONTROL - FACTORY 500W BIOS*
> View attachment 2579188


Something doesn't add up. Hoping it will be clearer with screenshot of the gpu-z bios page.


----------



## alitayyab

originxt said:


> At the risk of sounding daft, how did you get your y-axis values past 3k? Literally stuck lol.
> 
> View attachment 2579183


Close the app, open MSIAfterburner.cfg (located in the installation folder), look for VFCurveEditorMaxFrequency, and enter a value like 3500 or 4000 to extend the frequency axis. Then restart the app.
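The same edit can be scripted; a minimal sketch, assuming the default install path and that the key appears as a plain `KEY=value` line (both taken from the post above, so verify against your own install):

```python
from pathlib import Path

# Bump Afterburner's V/F curve editor frequency axis by rewriting the
# VFCurveEditorMaxFrequency entry in MSIAfterburner.cfg. Close the app
# first; the path and key name are assumptions based on the post above.
CFG = Path(r"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.cfg")
KEY = "VFCurveEditorMaxFrequency"

def set_max_frequency(cfg_path: Path, new_max: int) -> None:
    """Replace the KEY=value line with KEY=new_max, leaving the rest as-is."""
    lines = cfg_path.read_text().splitlines()
    out = [f"{KEY}={new_max}" if line.strip().startswith(KEY) else line
           for line in lines]
    cfg_path.write_text("\n".join(out) + "\n")

# set_max_frequency(CFG, 3500)  # then restart Afterburner
```

Keep a backup of the cfg before touching it; Afterburner rewrites the file on exit, which is why the app has to be closed first.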


----------



## arieldeboca

Hi guys. Is it worth buying a Strix 4090 instead of a TUF?
I saw that the PCBs are very similar. Do the TUF and Strix not reach 600W?


----------



## Sayenah

arieldeboca said:


> Hi guys. Is it worth buying a strix 4090 instead of a tuf?
> I saw that the pcb are very similar. Tuf and strix do not reach 600w?


If you have to ask, then the Strix isn't worth it at all. You will always be better served by a Suprim X or an FE. Frankly, with the Suprim X available, I don't even think the TUF is worth it.

I bought the Strix because it is a huge card with a huge cooler and runs super quiet. Plus, I love the way it looks.


----------



## Recipe7

mirkendargen said:


> I remember reading from some reviewer that if you've ever had the card with only 3 connectors plugged in you need to run DDU and clean install the drivers after plugging 4 in to correctly be able to get 600w.


Wow, that might be my case. When I started I only had 3 PCIe cables connected to the adapter. After seeing my low Port Royal scores I ran DDU, and at that point I decided to use four dedicated cables.
I will try it out, thanks mirkendargen


----------



## arieldeboca

Sayenah said:


> If you have to ask, then the Strix isn’t worth it, at all. You will always be better served by a Suprim x or an FE. Frankly, with the Suprim X available, I don’t even think the TUF is worth it.
> 
> I bought the Strix because it is a huge card with a huge cooler and runs super quiet. Plus, I love the way it looks.


Hello. I ask because in this generation the PCBs are very similar. In the future the card will be used with a water block.


----------



## LunaP

Sayenah said:


> If you have to ask, then the Strix isn’t worth it, at all. You will always be better served by a Suprim x or an FE. Frankly, with the Suprim X available, I don’t even think the TUF is worth it.
> 
> I bought the Strix because it is a huge card with a huge cooler and runs super quiet. Plus, I love the way it looks.


The Gigabyte Gaming OC is great too: 600W out of the box, no BIOS update needed, plus a good selection of blocks now. Absolutely loving mine atm.


----------



## jl434

mattskiiau said:


> Gonna take 2 months to get a cablemod 12vhpwr to my door in Australia. Corsair cable about the same.
> 
> A race between stock and my house burning down. Who will win? 😢
> 
> Question: I saw someone mention they are limiting power limit to 50%. Think that'll prevent overheating the connector? 🤔


I also ordered the 12VHPWR cable from them; still no update or tracking. I hope it won't take 2 months to arrive in AU.
So sad the EVGA PSUs don't have an official 12VHPWR cable.


----------



## LunaP

jl434 said:


> I also ordered the 12vhpwr from them, still no update and tracking, hope won't take 2 month to arrive AU
> So sad the EVGA PSU don't have official 12vhpwr cable


Hopefully higher-end ATX 3.0 PSUs surface next year. I'm in a double-wide case with a non-vertical mount, so I'm hoping my CableMod cord will reach my PSU; I didn't think about this until after ordering it. I just know it has clearance from the door at least.


----------



## Panchovix

arieldeboca said:


> Hi guys. Is it worth buying a strix 4090 instead of a tuf?
> I saw that the pcb are very similar. Tuf and strix do not reach 600w?


I would get a Strix if you can. Based on the reddit post about burnt cables, the TUF 4090 seems to be the most affected card; there isn't a confirmed report of a Strix with a burnt cable.

*List of Known Issues*

| Date | Post | Card Brand/Model | Adapter Type |
| --- | --- | --- | --- |
| October 24 | Link Here | Gigabyte 4090 | 4x 8 pins |
| October 24 | Link Here | Asus TUF 4090 | 4x 8 pins |
| October 25 | Link Here | Asus TUF 4090 | 4x 8 pins |
| October 26 | Link Here | Asus TUF 4090 | 4x 8 pins |
| October 26 | Link Here | Galax 4090 SG | 4x 8 pins?? |
| October 27 | Link Here | MSI Suprim X 4090 | 4x 8 pins |
| October 27 | Link Here | Asus TUF 4090 | 4x 8 pins |
| October 27 | Link Here | MSI Suprim Liquid X 4090 | 4x 8 pins |
| October 28 | Link Here | Asus TUF 4090 | 4x 8 pins |
| October 28 | Link Here | MSI Gaming X Trio 4090 | 3x 8 pins |

*List of Unconfirmed Issues*

| Date | Post | Card Brand/Model | Adapter Type |
| --- | --- | --- | --- |
| October 28 | Link Here 1 | Zotac Amp Extreme Airo 4090 | 4x 8 pins |
| October 28 | Link Here 2 | Asus Strix 4090 | 4x 8 pins |

[1] - The adapter cracked but has not yet melted. The card still works; I put it on the list out of an overabundance of caution.

[2] - The image quality is low and there doesn't seem to be any sign of melting, but there appears to be some discoloration. The card still works.



Spoiler: Reddit Link



Source:

__
https://www.reddit.com/r/nvidia/comments/ydh1mh


----------



## Panchovix

Also, when looking at the power readings, should I focus on the GPU Power measurement or the GPU Rail Power one? I limited the card to 300W in Afterburner, but seeing it way above that scares me lol


----------



## ArcticZero

Ordered a ModDIY custom cable in advance while waiting for my Strix to arrive. I know the risk is low, but better safe than sorry I suppose.


----------



## Sheyster

arieldeboca said:


> Hi guys. Is it worth buying a strix 4090 instead of a tuf?
> I saw that the pcb are very similar. Tuf and strix do not reach 600w?


It depends. Some people are really into the aesthetics of their rig. They want everything to match and look great! If you have an ROG Strix motherboard, the Strix 4090 will look better in your rig.

All that aside, if you want to save money just look for the base model TUF. Don't bother with the OC version unless you have no choice; it's basically identical except for the very small core OC bump. Just flash the OC TUF BIOS and you'll essentially have the OC version for free. Both TUF cards and the Strix have a 600W BIOS.


----------



## Sayenah

Sheyster said:


> It depends. Some people are really into the aesthetics of their rig. They want everything to match and look great! If you have an ROG Strix motherboard, the Strix 4090 will look better in your rig.
> 
> All that aside, if you want to save money just look for the base model TUF. Don't bother to get the OC version unless you don't have a choice, it's basically identical except for the very small core OC bump. Just flash the OC TUF BIOS and you'll essentially have the OC version for free. Both TUF cards and the Strix have 600w BIOS.


I am waiting to pair my Strix with a Z790 Dark Kingpin or the white Apex. Yes, aesthetics matter haha


----------



## Sheyster

LunaP said:


> Gigabyte gaming OC is great too, 600W out of the box no bios update needed + good source of blocks now. Absolutely loving mine atm.


FWIW, Gigabyte released a BIOS update themselves (version F2) to "increase compatibility":









GeForce RTX™ 4090 GAMING OC 24G Support | Graphics Card - GIGABYTE Global
www.gigabyte.com


----------



## KedarWolf

KedarWolf said:


> Here's what i had to do on my 4090 to get G-Sync working.
> 
> Make sure your monitor has G-Sync, FreeSync or Adaptive Sync enabled
> 
> Enable G-Sync in Nvidia Control Panel for Fullscreen and Windowed Fullscreen.
> 
> Select the Enable Settings For The Selected Display Model.
> 
> Turn on V-Sync in the Nvidia Control Panel. Turn OFF any frame limiters in the Nvidia Control Panel, including the Background Frame Rate one, and any third-party ones.
> 
> Turn OFF V-Sync in your games though.
> 
> Reboot your PC. The G-Sync indicator enabled in Display in the Nvidia control panel should be showing now.
> 
> I can use the Background Frame Limiter built-in to Diablo 3 though.


Edit: The other thing I had to do was disable my second NOT G-Sync screen, set G-Sync ON in my main screen, and then re-enable my second screen. 

Even if I had my main screen selected and did everything with the second screen on, the G-Sync indicator wouldn't show in games.


----------



## yzonker

Panchovix said:


> I would get a Strix if you can, based on the reddit post of burnt cables, TUF 4090 seems to be the most affected card
> There isn't a Strix card reported (and confirmed) with a burnt cable.
> *List of Known Issues*
> 
> Date​Post​Card Brand/Model​Adapter Type​October 24​Link Here​Gigabyte 4090​4x 8 pins​October 24​Link Here​Asus TUF 4090​4x 8 pins​October 25​Link Here​Asus TUF 4090​4x 8 pins​October 26​Link Here​Asus TUF 4090​4x 8 pins​October 26​Link Here​Galax 4090 SG​4x 8 pins??​October 27​Link Here​MSI Suprim X 4090​4x 8 pins​October 27​Link Here​Asus TUF 4090​4x 8 pins​October 27​Link Here​MSI Suprim Liquid X 4090​4x 8 pins​October 28​Link Here​Asus TUF 4090​4x 8 pins​October 28​Link Here​MSI Gaming X Trio 4090​3x 8 pins​
> *List of Unconfirmed Issues*
> 
> Date​Post​Card Brand/Model​Adapter Type​October 28​Link Here 1​Zotac Amp Extreme Airo 4090​4x 8 pins​October 28​Link Here 2​Asus Strix 4090​4x 8 pins​
> [1] - The adapter cracked but not yet melted. Card still works. I put it on the list for overabundance of caution
> 
> [2] - The quality of the image is low and doesn't seem to be any sign of melting but it seems there's some sign of discoloration. Card still works.
> 
> 
> 
> Spoiler: Reddit Link
> 
> 
> 
> Source:
> 
> __
> https://www.reddit.com/r/nvidia/comments/ydh1mh


I'd be willing to bet that is installed base skewing the result. There were a lot of TUFs sold I think. That's how I ended up with one. Microcenter had a pile of TUFs and Gigabytes left the day after release.


----------



## KedarWolf

ArcticZero said:


> Ordered a ModDIY custom cable in advance while waiting for my Strix to arrive. I know the risk is low, but better safe than sorry I suppose.


I ordered the ModDIY silicone cable, the 2x-PSU-connector one for my AX1600i.

The silicone one because it's by far the most flexible and bendable.

And this is about the Corsair AX1600i and running 2 connectors into the PSU, from a forum post. ModDIY confirmed it's fine.

"So my concern for max power per 8-pin connection is unfounded if this is accurate. With the quality Corsair AX1600i that I am using, each 8-pin can supply 288W for a max of 576W + the PCIe slot at 75W. I’m underestimating the capacity of the cable. It appears to be able to supply much more juice than I had originally anticipated. I just wanted confirmation of this before proceeding with any change in how I’m powering the card."

I kinda wish I went with the 3 into 1 though.

Edit: I'm preordering a CableMod 90-degree adapter Monday as well.


----------



## bhav

RTX 5090 was just announced:






No point making a new thread as it's just a meme.


----------



## Sheyster

For those using G-Sync + V-Sync + FPS cap...

Are you using V-Sync set to ON or FAST in the NVCP? I don't think it will matter much because of the FPS cap; just curious what others are using.

Link for reference:









G-SYNC 101: Optimal G-SYNC Settings & Conclusion | Blur Busters
blurbusters.com


----------



## ZealotKi11er

Really not happy that EVGA does not have a cable yet. If it's their way of retaliating against Nvidia, I will no longer buy PSUs from them.


----------



## arieldeboca


Sheyster said:


> It depends. Some people are really into the aesthetics of their rig. They want everything to match and look great! If you have an ROG Strix motherboard, the Strix 4090 will look better in your rig.
> 
> All that aside, if you want to save money just look for the base model TUF. Don't bother to get the OC version unless you don't have a choice, it's basically identical except for the very small core OC bump. Just flash the OC TUF BIOS and you'll essentially have the OC version for free. Both TUF cards and the Strix have 600w BIOS.


But do the Strix or TUF actually reach 600W? I have a 3090 FTW3 with a 500W BIOS and it never consumed more than 470W.


----------






## arieldeboca

Panchovix said:


> I would get a Strix if you can, based on the reddit post of burnt cables, TUF 4090 seems to be the most affected card
> There isn't a Strix card reported (and confirmed) with a burnt cable.
> *List of Known Issues*
> 
> Date​Post​Card Brand/Model​Adapter Type​October 24​Link Here​Gigabyte 4090​4x 8 pins​October 24​Link Here​Asus TUF 4090​4x 8 pins​October 25​Link Here​Asus TUF 4090​4x 8 pins​October 26​Link Here​Asus TUF 4090​4x 8 pins​October 26​Link Here​Galax 4090 SG​4x 8 pins??​October 27​Link Here​MSI Suprim X 4090​4x 8 pins​October 27​Link Here​Asus TUF 4090​4x 8 pins​October 27​Link Here​MSI Suprim Liquid X 4090​4x 8 pins​October 28​Link Here​Asus TUF 4090​4x 8 pins​October 28​Link Here​MSI Gaming X Trio 4090​3x 8 pins​
> *List of Unconfirmed Issues*
> 
> Date​Post​Card Brand/Model​Adapter Type​October 28​Link Here 1​Zotac Amp Extreme Airo 4090​4x 8 pins​October 28​Link Here 2​Asus Strix 4090​4x 8 pins​
> [1] - The adapter cracked but not yet melted. Card still works. I put it on the list for overabundance of caution
> 
> [2] - The quality of the image is low and doesn't seem to be any sign of melting but it seems there's some sign of discoloration. Card still works.
> 
> 
> 
> Spoiler: Reddit Link
> 
> 
> 
> Source:
> 
> __
> https://www.reddit.com/r/nvidia/comments/ydh1mh


Thanks brother! Did you manage to get a 4090? Do you live in Chile?


----------



## Panchovix

arieldeboca said:


> Gracias hermano! Conseguiste alguna 4090? Vivís en Chile?


Yup! I got a TUF 4090 and I'm quite happy with it, but I'm afraid of burning the cable


----------



## Sheyster

ZealotKi11er said:


> Really not happy that EVGA does not have a cable yet. If its their way of retaliating against Nvidia I will no longer buy PSUs from them.


Cablemod...






CableMod E-Series Pro ModFlex Sleeved 12VHPWR PCI-e Cable for EVGA G7 / G6 / G5 / G3 / G2 / P2 / T2 – CableMod Global Store
store.cablemod.com





EDIT: If they're truly being vindictive it will only hurt them. If anything this is a great opportunity to offer a high-margin cable to their existing PSU customers.


----------



## Sheyster

arieldeboca said:


> but strix or tuf reach 600w? I have a 3090 FTW3 with 500w bios and it never consumed more than 470w.


If you lock voltage at 1.1V in Afterburner and run a heavy workload (Furmark is one example, but be careful with that one), the card should hit 600W or very close to it at the 133% power limit.


----------



## ArcticZero

KedarWolf said:


> I ordered the ModDIY silicon cable, the 2xPSU one for my AX1600i.
> 
> The silicon one because it's by far the most flexible and bendable one.
> 
> And this is about the Corsair AX1600i and 2 into the PSU from a forum. And ModDIY confirmed it's fine.
> 
> "So my concern for max power per 8-pin connection is unfounded if this is accurate. With the quality Corsair AX1600i that I am using, each 8-pin can supply 288W for a max of 576W + the PCIe slot at 75W. I’m underestimating the capacity of the cable. It appears to be able to supply much more juice than I had originally anticipated. I just wanted confirmation of this before proceeding with any change in how I’m powering the card."
> 
> I kinda wish I went with the 3 into 1 though.
> 
> Edit: I'm preordering a CableMod 90-degree adapter Monday as well.


I went with the sleeved one, but in hindsight maybe I should have gone for the silicone one for that exact reason. I suspect it will be fine either way. I'm also waiting for CableMod's 180-degree cables (they said they would announce them very soon), which would be a lot better for vertical mounts like mine.


----------



## mirkendargen

KedarWolf said:


> I ordered the ModDIY silicon cable, the 2xPSU one for my AX1600i.
> 
> The silicon one because it's by far the most flexible and bendable one.
> 
> And this is about the Corsair AX1600i and 2 into the PSU from a forum. And ModDIY confirmed it's fine.
> 
> "So my concern for max power per 8-pin connection is unfounded if this is accurate. With the quality Corsair AX1600i that I am using, each 8-pin can supply 288W for a max of 576W + the PCIe slot at 75W. I’m underestimating the capacity of the cable. It appears to be able to supply much more juice than I had originally anticipated. I just wanted confirmation of this before proceeding with any change in how I’m powering the card."
> 
> I kinda wish I went with the 3 into 1 though.
> 
> Edit: I'm preordering a CableMod 90-degree adapter Monday as well.


I have the 3->1 ModDIY cable; I should have gotten the 2->1, because there's no point in anything more. They have the same number of pins, the same number of wires, and the same current per pin/wire, and both are 1:1 pins-to-wires between the connectors on both sides of the cable without any bridging/splicing. The 3->1 cable just puts 2 +12V/ground wires into each PSU 8-pin, and the 2->1 cable puts 3 into each. The only difference it would make is if you had some multi-rail PSU, which isn't common at all these days, and needed to spread the load across more rails.
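The equivalence of the 2-into-1 and 3-into-1 cables can be illustrated with a quick sketch (idealized, assuming perfectly even current sharing; the function names are illustrative, not any vendor's spec):

```python
# Idealized sketch: 6 +12V wires on the 12VHPWR side, grouped into PSU-side
# 8-pin connectors. Per-wire current is the same no matter how the wires are
# grouped; only the load per PSU connector changes.

TOTAL_WATTS = 600.0
VOLTS = 12.0
NUM_12V_WIRES = 6

def amps_per_wire() -> float:
    """Current through each +12V wire, independent of PSU-side grouping."""
    return TOTAL_WATTS / VOLTS / NUM_12V_WIRES

def watts_per_psu_connector(num_psu_connectors: int) -> float:
    """Load carried by each PSU-side 8-pin connector."""
    return TOTAL_WATTS / num_psu_connectors

print(round(amps_per_wire(), 2))   # identical for 2->1 and 3->1 cables
print(watts_per_psu_connector(2))  # 300.0 W per PSU 8-pin (3 wires each)
print(watts_per_psu_connector(3))  # 200.0 W per PSU 8-pin (2 wires each)
```

So the cables differ only in how much load each PSU-side connector carries, not in what any individual wire or pin sees.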


----------



## Pollomir

ZealotKi11er said:


> Really not happy that EVGA does not have a cable yet. If its their way of retaliating against Nvidia I will no longer buy PSUs from them.


In their forums they said they are working on the cable, but didn't give a specific release date.


----------



## ZealotKi11er

Sheyster said:


> Cablemod...
> 
> 
> 
> 
> 
> 
> CableMod E-Series Pro ModFlex Sleeved 12VHPWR PCI-e Cable for EVGA G7 / G6 / G5 / G3 / G2 / P2 / T2 – CableMod Global Store
> 
> 
> 
> 
> 
> 
> 
> store.cablemod.com
> 
> 
> 
> 
> 
> EDIT: If they're truly being vindictive it will only hurt them. If anything this is a great opportunity to offer a high-margin cable to their existing PSU customers.


Already purchased the CableMod one, but I wanted an official version of the cable.


----------



## Laithan

ZealotKi11er said:


> Already purchased the CabeMod but wanted a official version of the cable.


I have a Corsair AX1500i and bought the 12VHPWR cable directly from Corsair, which runs *(2)* PSU connectors into one 12VHPWR connector. But when I saw that the CableMod cable (also made for Corsair) was using *(4)* PSU connectors into one 12VHPWR connector, I did a head scratch and bought that one too. I read that Corsair is confident 2 PSU connectors can supply enough power, but Corsair's approach seems equivalent to a standard 8-pin PCIe connector with a 2nd 8-pin _pigtail_ coming off the same cable, which many advise against using for high-power cards.


----------



## cheddardonkey

N19htmare666 said:


> Something doesn't add up. Hoping it will be clearer with screenshot of the gpu-z bios page.


Not clearing it up, LOL. The page shows exactly what I expected: 500W for the Aorus and 600W for the MSI. That still doesn't explain the results, though. I'm wondering whether there are parameters I should change to actually use the 600W capacity. Everything I've tried crashes beyond the settings shown in the other screenshots (+160 on the GPU to 3015 MHz, memory +1400). Neither can be pushed any higher just because 600W is unlocked; voltage always becomes the limiter. And it still doesn't explain why the 500W BIOS is getting higher performance marks.


----------



## Spiriva

Panchovix said:


> I would get a Strix if you can, based on the reddit post of burnt cables, TUF 4090 seems to be the most affected card
> There isn't a Strix card reported (and confirmed) with a burnt cable.
> *List of Known Issues*
> 
> Date​Post​Card Brand/Model​Adapter Type​October 24​Link Here​Gigabyte 4090​4x 8 pins​October 24​Link Here​Asus TUF 4090​4x 8 pins​October 25​Link Here​Asus TUF 4090​4x 8 pins​October 26​Link Here​Asus TUF 4090​4x 8 pins​October 26​Link Here​Galax 4090 SG​4x 8 pins??​October 27​Link Here​MSI Suprim X 4090​4x 8 pins​October 27​Link Here​Asus TUF 4090​4x 8 pins​October 27​Link Here​MSI Suprim Liquid X 4090​4x 8 pins​October 28​Link Here​Asus TUF 4090​4x 8 pins​October 28​Link Here​MSI Gaming X Trio 4090​3x 8 pins​
> *List of Unconfirmed Issues*
> 
> Date​Post​Card Brand/Model​Adapter Type​October 28​Link Here 1​Zotac Amp Extreme Airo 4090​4x 8 pins​October 28​Link Here 2​Asus Strix 4090​4x 8 pins​
> [1] - The adapter cracked but not yet melted. Card still works. I put it on the list for overabundance of caution
> 
> [2] - The quality of the image is low and doesn't seem to be any sign of melting but it seems there's some sign of discoloration. Card still works.
> 
> 
> 
> Spoiler: Reddit Link
> 
> 
> 
> Source:
> 
> __
> https://www.reddit.com/r/nvidia/comments/ydh1mh


I would guess the TUF is also the best-selling 4090 worldwide.


----------



## BeZol

I'm not following the thread any more, so apologies in advance if this has already been posted.

Inno3D X3 OC RTX 4090
3x8pin adapter
TDP 450W

It's basically one of the cheapest models you can get.

With 4x8pin adapter nothing changes.

BUT: by putting a Gigabyte Gaming OC BIOS (450W +33% = 600W TDP limit) on the GPU, coupled with the 4x8pin adapter, I managed to unlock the card.

Interestingly, I hit the voltage limit at 1.1V in my Port Royal run (28.928 score) with a peak TDP of 550W.


----------



## mirkendargen

Laithan said:


> I have a Corsair AX1500i and bought the 12VHPWR cable directly from Corsair... which has *(2)* PSU connectors into one 12VHPWR cable... but then when I saw the CableMod cable was using *(4)* PSU connectors into one 12VHPWR cable (also made for Corsair) I did a head scratch and bought that one too. I read that corsair is confident that 2 PSU connectors can supply enough power but it seems that corsair's approach would be the equivalent of a standard 8-pin PCIe connector with a 2nd 8-pin _pigtail_ coming off the same cable... which many advise against using for high power cards.
> 
> In this case I think I am going to trust the 3rd part cable vendor's cable over the PSU manufacturer's solution.


Look carefully when you talk about X number of connectors combined into Y connector. You'll find that the X connectors don't have all their pins used, and it's ultimately the same amount of current going through the same pin connections and wires. A 12VHPWR connector only has 6 +12V pins/wires and 6 ground pins/wires. Corsair PSU connectors have 4 +12V and 4 ground. The official Corsair cable uses 3 +12V/ground pins from each connector. The 3 connector Moddiy cable uses 2 +12V/ground pins from each connector. The Cablemod one will do the same thing unless it's bridging wires, which is just another connection to potentially fail and not great if not necessary. A 4 connector on the PSU side cable doesn't even divide evenly into a 12VHPWR cable.


----------



## J7SC

yzonker said:


> I'd be willing to bet that is installed base skewing the result. There were a lot of TUFs sold I think. That's how I ended up with one. Microcenter had a pile of TUFs and Gigabytes left the day after release.


The 'installed base' issue is certainly a prerequisite for the potential problem to show up. One store here had about 25 of the Gigabyte G OC but sold out briskly for its market catchment area; now there are none at the big chains, though I noticed some have raised the price for it (which they don't currently have in stock) by ~C$100. But things may change again next week.


----------



## Laithan

mirkendargen said:


> Look carefully when you talk about X number of connectors combined into Y connector. You'll find that the X connectors don't have all their pins used, and it's ultimately the same amount of current going through the same pin connections and wires. A 12VHPWR connector only has 6 +12V pins/wires and 6 ground pins/wires. Corsair PSU connectors have 4 +12V and 4 ground. The official Corsair cable uses 3 +12V/ground pins from each connector. The 3 connector Moddiy cable uses 2 +12V/ground pins from each connector. The Cablemod one will do the same thing unless it's bridging wires, which is just another connection to potentially fail and not great if not necessary. A 4 connector on the PSU side cable doesn't even divide evenly into a 12VHPWR cable.


Makes sense ty.. check this out (using your info)

A comparison between the old-style 8-pin and the new 12-pin 12VHPWR (power and ground only):
An 8-pin PCIe power connector is rated at *150W* each (it can do more, but sticking to spec).
Inside an 8-pin there are (3) 12V wires; assuming the power is drawn equally across all 3, that is 50W per 12V wire. This is considered safe, and we bend these cables any way we want...

A 12-pin 12VHPWR connector is rated at *600W* each.
Inside a 12-pin there are (6) 12V wires (of a smaller gauge); assuming the power is drawn equally across all 6, that is 100W per 12V wire.
_*So the power has doubled across a smaller-gauge wire*_.

The (4) 8-pin PCIe cables needed to power a single 12VHPWR seem related to the 150W-per-8-pin standard: if you want 600W you need (4) of them, which is likely why there is bridging going on here. But as you pointed out, technically only (2) 8-pins are needed to get (6) 12V wires, which seems to be what Corsair is doing.

Regardless, to emphasize: past the bridging point (if there is bridging), at each individual 12V wire, *the power has doubled and the gauge of the wire is smaller*. Looking at it like this, I'm not surprised we're seeing melted connectors; the issue has to be the reduction in wire gauge inside the connector itself, compounded with the doubling of power.

Will using a different cable really even matter if the connector and wire gauge is still the same? That's where it is melting..
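The per-wire arithmetic above can be sanity-checked with a quick sketch (idealized: it assumes perfectly even current sharing across wires, which real cables don't guarantee; the helper names are just for illustration):

```python
# Back-of-the-envelope check of the per-wire numbers, assuming an ideal,
# perfectly even split of current across the +12V wires.

def watts_per_12v_wire(connector_watts: float, num_12v_wires: int) -> float:
    """Power carried by each +12V wire, assuming an even split."""
    return connector_watts / num_12v_wires

def amps_per_12v_wire(connector_watts: float, num_12v_wires: int,
                      volts: float = 12.0) -> float:
    """Current through each +12V wire at the given rail voltage."""
    return watts_per_12v_wire(connector_watts, num_12v_wires) / volts

# 8-pin PCIe: 150 W spec across 3 +12V wires
print(watts_per_12v_wire(150, 3))  # 50.0 W per wire (~4.17 A at 12 V)
# 12VHPWR: 600 W spec across 6 +12V wires
print(watts_per_12v_wire(600, 6))  # 100.0 W per wire (~8.33 A at 12 V)
```

Roughly 8.3 A per pin at the full 600W is the figure the melting reports keep circling back to.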


----------



## mirkendargen

Laithan said:


> Makes sense ty.. check this out (using your info)
> 
> comparison between the old style 8-pin and the new 12-pin 12VHPWR (power and ground only).
> An 8-pin PCIe power connector is rated at *150W* each (can do more, but sticking to specs)
> Inside an 8-pin there are (3) 12V wires.. assuming the power is drawn equally across all 3 that is 50W each 12V cable. This is considered safe and we bend these cables any way we want...
> 
> A 12-pin PCIe power connector is rated at *600W* each.
> Inside a 12-pin there are (6) 12V wires (and a smaller gauge).. assuming the power is drawn equally across all 6 wires that is 100W each 12V cable.
> _*So the power has doubled across a smaller gauge wire*_.
> 
> The (4) 8-pin PCIe cables needed to power a single 12VHPWR seems to have a relation to the 150W per 8-pin standard... so if you want 600W you need (4) of them... and likely why there is bridging going on here... but as your pointed out, technically only (2) 8-pins would be needed to get (6) 12V...and seems to be what corsair is doing.
> 
> Regardless, to emphasize - _*the connector*_, past the bridging point (if there is bridging), at each individual 12V wire, *the power has doubled and the gauge of the wire is smaller*. Looking at it like this, I am not surprised we are seeing melted connectors... the issue has to be with the reduction of the wire gauge inside the connector itself compounded with the doubling of power..
> 
> Will using a different cable really even matter if the connector and wire gauge is still the same? That's where it is melting..


Saying the connector is a "smaller gauge" is misleading. The wire spec for 12VHPWR cables is actually a larger gauge (16AWG, I think). I believe the ATX spec for PCIe 8-pins is 20AWG (though cable companies and PSU manufacturers all overbuild them with 18AWG or 16AWG).

But that's the wire; on to the connector. The pins are smaller, but what matters is the cross-section in contact with the wire, and the total contact area between the pin and its receptacle in its mate when connected. If the pins are crimped on correctly, there are no issues with the connection to the wire. If a garbage solder job and an ultra-thin bus bar bridging all the wires is used, that's asking for trouble (Igor's failing-adapter theory). If the pins mate correctly, they have far more contact area than the cross-sectional area of even a 16AWG wire, so that's fine too. Problems arise when the pins don't mate snugly and only have a tiny contact area, or worse, no actual contact but are close enough to arc. This is one of the things that may be happening with the adapter (Buildzoid's theory about the pins bending too far open). The physical connector being smaller isn't a problem here; it's actually a good thing, because it forces tighter tolerances and less literal "wiggle room" for the pins to make poor contact (although poor contact is still possible).

Also, incidentally, Intel is the one that came up with the 600W 12VHPWR spec, with input from others. It's not as if Nvidia just YOLO'ed their way into something for fun all on their own and this is the first time anyone has considered whether it actually works. Nvidia being cheap and unaware or indifferent that their adapter supplier is trash is far more plausible.


----------



## Glottis

Laithan said:


> I have a Corsair AX1500i and bought the 12VHPWR cable directly from Corsair... which has *(2)* PSU connectors into one 12VHPWR cable... but then when I saw the CableMod cable was using *(4)* PSU connectors into one 12VHPWR cable (also made for Corsair) I did a head scratch and bought that one too. I read that corsair is confident that 2 PSU connectors can supply enough power but it seems that corsair's approach would be the equivalent of a standard 8-pin PCIe connector with a 2nd 8-pin _pigtail_ coming off the same cable... which many advise against using for high power cards.
> 
> In this case I think I am going to trust the 3rd part cable vendor's cable over the PSU manufacturer's solution.


Congrats, you wasted your money unnecessarily on another cable and also spread FUD and misinformation. The Corsair cable is fine. Corsair knows better what is appropriate for their own PSUs than random forum posters and clickbait youtubers (speaking of which, I don't recall a single report of something being wrong with the Corsair cable; you just made that up right here on the spot).


----------



## fitnessgrampacertest

Panchovix said:


> Does GPU Power Stage affect directly the max core clock and such? Seeing the GPU stages of some GPUs on the first page (like the PNY or the Gigabyte Windforce), damn they really have really few power stages (14x50A, 700A), way less than the FE (20x70A, 1400A); basically making paying for the AIBs instead of NVIDIA pointless in those cases
> 
> That would also mean that a MSI Suprim or an ASUS Strix (1820A/1680A) overclocks better by default? Or is all just depending of the core itself?


Yes, in a sense. The amount (and quality) of the power stages directly affects the core and memory OC potential. Typically more = better.

Out of the box, Suprim and ROG Strix 4090s are going to overclock better and higher than the PNY/Palit/Gigabyte/MSI Gaming Trio pretty much ten times out of ten.

The "cheaper" 4090s are cheaper for a reason.

The expensive 4090s are expensive for a reason.

You get what you pay for.


----------



## Laithan

Glottis said:


> Congrats, you wasted your money unnecessarily on another cable and also spread FUD and misinformation. The Corsair cable is fine. Corsair knows better what is appropriate for their own PSUs than random forum posters and clickbait youtubers (speaking of which, I don't recall a single report of something being wrong with the Corsair cable; you just made that up right here on the spot).


I'm sure you felt better after that... you're welcome! I aim to please. I'm always here for you lol.

For the record, nobody (certainly not me) said "Corsair's cable is bad"... and there was additional discussion about the pin and cable differences after that (which evolved the opinion). In fact, the discussion IS about the differences between cables <smh>. Reading is fundamental. Cheers 🍻


----------



## fitnessgrampacertest

slayer6288 said:


> How does one obtain a 600 watt MSI suprim x bios? Can you link it to me? Also to pull over 520 watt on ur tri are u using the triple adapter or something with 4 pcie 8 pins?


I wouldn't recommend flashing a 600W BIOS onto your Gaming Trio. If you use the 3x 8-pin adapter with a 600W BIOS, you are going to run a serious risk of starting a fire. *DO NOT FLASH A 600W BIOS WHILE USING A 3X 8-PIN ADAPTER.* You absolutely need the 4x 8-pin adapter for the 600W BIOS.

Even if you did flash the 600W BIOS, a BIOS with a higher TDP rating is going to give you almost no performance increase at all.

Unfortunately, you purchased one of the worst 4090s in terms of electrical components. I believe the Gaming Trio only has a measly 14 power stages, which is 16 stages fewer than the MSI Suprim. Your performance and overclocking capability is going to be limited by your GPU's lack of VRM phases, regardless of what BIOS you are running.

Please do not do this. And if you do still feel compelled to flash a 600W BIOS, don't do it unless you are using a 4x 8-pin adapter.

You get what you pay for.


----------



## mirkendargen

fitnessgrampacertest said:


> I wouldn't recommend flashing a 600W BIOS onto your Gaming Trio. If you use the 3x 8-pin adapter with a 600W BIOS, you are going to run a serious risk of starting a fire. You absolutely need the 4x 8-pin adapter for the 600W BIOS.
> 
> Even if you did flash the 600W BIOS, a BIOS with a higher TDP rating is going to give you almost no performance increase at all.
> 
> Unfortunately, you purchased one of the worst 4090s in terms of electrical components. I believe the Gaming Trio only has a measly 14 power stages, which is 16 stages fewer than the MSI Suprim. Your performance and overclocking capability is going to be limited by your GPU's lack of VRM phases, regardless of what BIOS you are running.
> 
> Please do not do this. And if you do still feel compelled to flash a 600W BIOS, don't do it unless you are using a 4x 8-pin adapter.
> 
> You get what you pay for


The card physically won't draw more than 450W, regardless of the BIOS, if the sense pins on the connector are only grounded for 450W (like the 3-connector Nvidia adapter). Not to mention plenty of people ran cards over 600W with 3x 8-pins before this connector was ever a thing.

So nothing's going to blow up/catch on fire flashing the BIOS, but there's also no point in doing so.

And your VRM breakdown there is way wrong.


----------



## fitnessgrampacertest

LuckyImperial said:


> I guess that's what I think is odd. My 12600k's IPC shouldn't be that far off from a 12900k. And I have DDR5 @ 5600MHz, but it's only gear 2. Maybe the 2800MHz memory controller is my problem. I don't know if I care enough to go find stable 1 gear timings.


You will never find stable Gear 1 timings; your memory controller is not capable of running a 1:1 frequency ratio at that speed. It's simply not possible.


----------



## yzonker

Nothin' but adapter talk. I was surprised this benchmark went up this much as it's fairly heavily CPU/mem bound. Wish I had run it on my 4090 with the 12900k so I could see if the 13900k was much faster. Obviously not that much.

13900k + 4090










12900k + 3090


----------



## Panchovix

fitnessgrampacertest said:


> Yes, in a sense. The amount (and quality) of the power stages directly affects the core and memory OC potential. Typically more = better.
> 
> Out of the box, Suprim and ROG Strix 4090s are going to overclock better and higher than the PNY/Palit/Gigabyte/MSI Gaming Trio pretty much ten times out of ten.
> 
> The "cheaper" 4090s are cheaper for a reason.
> 
> The expensive 4090s are expensive for a reason
> 
> You get what you pay for


Wow, so it was important? Didn't expect that, thanks for the info man! 

On performance/price then, the FE looks way better than everything else.


----------



## Recipe7

Is anyone having an issue with undervolting? Afterburner will not save my undervolt when using the curve editor, and it will bounce between 1.05v and the undervolt, mostly staying under 1.05v still.


----------



## yzonker

Panchovix said:


> Wow, so it was important? Didn't expect that, thanks for the info man!
> 
> On performance/price then, the FE looks way better than everything else


That was never true for 30 series. Seems unlikely to be true for 40 series either. I've seen reference 2x8pin 3090s kick butt and compete with big VRM cards like the KP. And I've seen KP's be total turds.


----------



## motivman

mirkendargen said:


> The card physically won't draw more than 450W regardless of the BIOS if the sense pins on the connector are only grounded for 450W (like the 3-connector Nvidia adapter). Not to mention plenty of people have run cards over 600w with 3x8pins before this connector was ever a thing.
> 
> So nothing's going to blow up/catch on fire flashing the BIOS, but there's also no point in doing so.
> 
> And your VRM breakdown there is way wrong.


I beg to differ. 600W bios on my 4090 trio, card draws over 570W using 3X8 cable


----------



## N19htmare666

cheddardonkey said:


> Well, these are some interesting 500w vs 600w Bios test results.. N19htmare666
> 
> 500w bios is outperforming 600w bios it would seem.
> 
> *MSI SUPRIM LIQUID 600W BIOS - AORUS WATERFORCE 4090 AIO CARD.*
> View attachment 2579189
> 
> *AORUS - CONTROL - FACTORY 500W BIOS*
> View attachment 2579188


EDIT: the 'min' temps on the Superposition screens are a lot lower on the original BIOS. What results do you get after forcing the fans on?


Over to the other experts in this forum. I'm also interested in the answer. 


So, with all things being equal, how can increasing the power limit on a non-thermally-throttled card reduce performance? What are we missing?


----------



## Jordyn

So I just checked my connection and, lo and behold:



http://imgur.com/a/FCvZENY


Added to the mega-thread on reddit. Will speak to my retailer tomorrow when they are open and organise a return.

Back to my 3090 in the meantime.


----------



## N19htmare666

motivman said:


> I beg to differ. 600W bios on my 4090 trio, card draws over 570W using 3X8 cable
> 
> View attachment 2579319


Not all implementations of the cable connection checking are equal across all 4090s or cables.


----------



## Hanks552

fitnessgrampacertest said:


> I wouldn't recommend flashing a 600W BIOS onto your Gaming Trio. If you use the 3x 8-pin adapter with a 600W BIOS, you are going to run a serious risk of starting a fire. *DO NOT FLASH A 600W BIOS WHILE USING A 3X 8-PIN ADAPTER.* You absolutely need the 4x 8-pin adapter for the 600W BIOS.
> 
> Even if you did flash the 600W BIOS, a BIOS with a higher TDP rating is going to give you almost no performance increase at all.
> 
> Unfortunately, you purchased one of the worst 4090s in terms of electrical components. I believe the Gaming Trio only has a measly 14 power stages, which is 16 stages fewer than the MSI Suprim. Your performance and overclocking capability is going to be limited by your GPU's lack of VRM phases, regardless of what BIOS you are running.
> 
> Please do not do this. And if you do still feel compelled to flash a 600W BIOS, don't do it unless you are using a 4x 8-pin adapter.
> 
> You get what you pay for


It's not the worst, but it's not that bad either; still better than the Gigabyte Windforce and others. But yeah, I'm also with a Trio and I want a Suprim.


----------



## Hanks552

motivman said:


> I beg to differ. 600W bios on my 4090 trio, card draws over 570W using 3X8 cable
> 
> View attachment 2579319


Did you flash it with suprim bios?


----------



## jcde7ago

The contrast between the continued discussion here around BIOS flashing for higher voltage/power/benchmark clocks (for very marginal gains at that, for this gen of cards at least) and Reddit/other social media being focused on the rising case count of melting adapters...is quite funny to see...

As someone that's been around for 13-14 years now on this site...never change, OCN.


----------



## N19htmare666

jcde7ago said:


> The contrast of the continued discussion around bios flashing for higher voltage/power/benchmark clocks (for very marginal gains at that for this gen of cards at least) vs Reddit/other social media that's focused on the rising case count of melting adapters...is quite funny to see...
> 
> As someone that's been around for 13-14 years now on this site...never change, OCN.


Guessing most people on here are not using the Nvidia cable


----------



## mirkendargen

motivman said:


> I beg to differ. 600W bios on my 4090 trio, card draws over 570W using 3X8 cable
> 
> View attachment 2579319


What 3x8 cable? The 3x 8-pin PCIe adapter that came with it, or some 3x 8-pin PSU modular cable? Those are different things. Like I said in my post, what matters to the card is whether both sense pins are grounded; that's the only thing it cares about. There are 3rd-party cables that ground both sense pins in all sorts of connector configs.


----------



## Panchovix

jcde7ago said:


> The contrast of the continued discussion around bios flashing for higher voltage/power/benchmark clocks (for very marginal gains at that for this gen of cards at least) vs Reddit/other social media that's focused on the rising case count of melting adapters...is quite funny to see...
> 
> As someone that's been around for 13-14 years now on this site...never change, OCN.


Probably most people on the forum are using another cable and not the NVIDIA one, or an ATX 3.0 PSU, so they don't have to worry.

But damn, the amount of burnt cards lately; definitely going back to my 3080 for now until I get a new cable.


----------



## DokoBG

The Gaming Trio has 18x50A power stages plus 4x50A for the memory. Who is the guy spewing wrong information about this card?

Also, if you buy the CableMod 3x 8-pin adapter, it is a 600W adapter. Hell, even Corsair's official adapter is 2x 8-pin for 600W.

Here it is direct from CableMod:










Yeah, you read that right: 2x 8-pin is PLENTY for the 600W power limit of these cards, but it needs to be a correctly made adapter. Not some garbage-made Nvidia adapter.


----------



## MrTOOSHORT

Well, I couldn't wait to get my CableMod 4x 8-pin to 16-pin cable; ETA November 14th for shipping. So I bought this 600W-capable extension in the meantime to get rid of the stock Nvidia death-trap cable. I can confirm no difference in OC from the stock cable; over 500W easy. Now I can tuck a single cable behind the case:









Fasgear PCI-e 5.0 Extension Cable 30cm/1ft 16 Pin(12+4) Male to PCIE 5.0 3x8Pin(6+2) Female Sleeved Extension Cable with 4 Cable Combs 12VHPWR 16AWG Cable Compatible for GPU 3090Ti & RTX 4080 4090 : Amazon.ca: Electronics





www.amazon.ca


----------



## jcde7ago

N19htmare666 said:


> Guessing most people on here are not using the Nvidia cable





Panchovix said:


> Probably on the forum, most people are using another cable and not the NVIDIA one, or an ATX 3.0 PSU, so they don't have to worry
> 
> But damn, the amount of burnt cards lately, gonna go back to my 3080 for now definitely until I get a new cable.


Absolutely - people should avoid using the Nvidia adapter like the plague.

I've removed my 4090 Suprim Liquid X from my rig and put a 2080 back in for now, until my Cablemod + Moddiy cables arrive on Wednesday...just not worth the risk. I wish I had the foresight to order the Cablemod one a week earlier, though, but I'm glad it's at least already shipped out.


----------



## Panchovix

jcde7ago said:


> Absolutely - people should avoid using the Nvidia adapter like the plague.
> 
> I've removed my 4090 Suprim Liquid X from my rig and put a 2080 back in for now, until my Cablemod + Moddiy cables arrive on Wednesday...just not worth the risk. I wish I had the foresight to have ordered the Cablemod one a week early though, but i'm glad it's at least already shipped out.


I bought the Moddiy one, but with the free shipping, so 3 weeks or so haha. 3080 time; my 4090 was installed for a whole day LOL


----------



## motivman

Hanks552 said:


> Did you flash it with suprim bios?


yes. 600w suprim X bios


----------



## motivman

mirkendargen said:


> What 3x8 cable? The 3x8PCIE adapter that came with it, or some 3x8 PSU modular cable? Those are different things. Like I said in my post, what matters to the card is whether both sense pins are grounded, that's the only thing it cares about. There are 3rd party cables that ground both sense pins in all sorts of connector configs.


Yup, the cable that came with it, connected to 3 separate 8-pin PCIe cables from my EVGA 1200 P2 PSU. I checked my connector today, no melting (knock on wood)... too bad I lost the silicon lottery on my card. Memory maxes out at +1100 in Afterburner, and the core taps out at 3000MHz.


----------



## DokoBG

MrTOOSHORT said:


> Well, I couldn't wait to get my CableMod 4x 8-pin to 16-pin cable; ETA November 14th for shipping. So I bought this 600W-capable extension in the meantime to get rid of the stock Nvidia death-trap cable. I can confirm no difference in OC from the stock cable; over 500W easy. Now I can tuck a single cable behind the case:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Fasgear PCI-e 5.0 Extension Cable 30cm/1ft 16 Pin(12+4) Male to PCIE 5.0 3x8Pin(6+2) Female Sleeved Extension Cable with 4 Cable Combs 12VHPWR 16AWG Cable Compatible for GPU 3090Ti & RTX 4080 4090 : Amazon.ca: Electronics
> 
> 
> 
> 
> 
> www.amazon.ca


Thanks I just ordered one of these in white while waiting for my CableMod adapter. It is arriving tomorrow - Hurray for Amazon 1 day shipping !


----------



## AvengedRobix

For those who have the Zotac AIRO AMP: is it normal for the red LED between the power connector and the backplate to be on?


----------



## J7SC

This whole situation was avoidable with a bit sturdier engineering and execution (QC) for expensive cards carrying up to 600W... For now, I'm not doing any 3D at all (much less w/ OC) on that system. I am getting the Seasonic replacement and also ordered another from Cablemod for 'future reference' / different projects.

All that said, 'movement' of any pin, even in a PCIe 8-pin, can melt things... below is my 3090 Strix (see arrow) w/ 520W; the PCIe cables had a fairly sharp bend (using an older PSU, now replaced with a Seasonic PX1300). FYI, the card still works great...


----------



## LunaP

Sheyster said:


> FWIW, Gigabyte released a BIOS update themselves (version F2) to "increase compatibility":
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GeForce RTX™ 4090 GAMING OC 24G Support | Graphics Card - GIGABYTE Global
> 
> 
> Discover AORUS premium graphics cards, ft. WINDFORCE cooling, RGB lighting, PCB protection, and VR friendly features for the best gaming and VR experience!
> 
> 
> 
> 
> www.gigabyte.com


I'm guessing for the newer boards or something?



yzonker said:


> Nothin' but adapter talk. I was surprised this benchmark went up this much as it's fairly heavily CPU/mem bound. Wish I had run it on my 4090 with the 12900k so I could see if the 13900k was much faster. Obviously not that much.
> [spoiler]
> 13900k + 4090
> 
> View attachment 2579301
> 
> 
> 12900k + 3090
> 
> View attachment 2579300
> 
> [/spoiler]


Nice, another FFXIV player! Coming from SLI 2080 Tis here, which FFXIV absolutely loved, piled with GShade (really good shaders, 16+), keeping 85-90 fps minimum (dungeons and big areas) in game and loving it @ 4K.

I'm barely seeing a diff GPU-side, so those CPUs are definitely showing a diff. Same CPU for both; the lower one (since I didn't have the screen itself saved) was 2080 Tis in SLI, so barely a 1400-point diff. I'm almost tempted to actually upgrade now, holy crap lol.











*edit*

Tried it at your settings and it's DEF CPU-bound lol; the 13900K looks like a beast, holy crap. If it wasn't for the lack of AVX I'd upgrade today...


----------



## J7SC

...meanwhile, back at the engineering department for RTX 5090 Ti...


----------



## Sheyster

MrTOOSHORT said:


> Well, I couldn't wait to get my CableMod 4x 8-pin to 16-pin cable; ETA November 14th for shipping. So I bought this 600W-capable extension in the meantime to get rid of the stock Nvidia death-trap cable. I can confirm no difference in OC from the stock cable; over 500W easy.





DokoBG said:


> Thanks I just ordered one of these in white while waiting for my CableMod adapter. It is arriving tomorrow - Hurray for Amazon 1 day shipping !


Apparently these guys are also offering a 3 x 8-pin cable (not an extension cable) that is compatible with EVGA PSUs:









Amazon.com: Fasgear PCIe 5.0 GPU Power Cable Sleeved 70cm | 16pin (12+4) 12VHPWR Connector for RTX 3090 Ti 4080 4090 | 3x8pin (6+2) PCI-e Male Plugs Compatible for ASUS EVGA Seasonic Modular Power Supply : Electronics





www.amazon.com





"16AWG Copper Wire & Up to 600W Output"


----------



## jcde7ago

J7SC said:


> ...meanwhile, back at the engineering department for RTX 5090 Ti...
> View attachment 2579356


The other end goes into the GPU while those prongs will require one of these:


----------



## AdamK47

motivman said:


> I beg to differ. 600W bios on my 4090 trio, card draws over 570W using 3X8 cable
> 
> View attachment 2579319


I feel like people choose to ignore facts. They've been presented this evidence before and yet continue to believe untrue information.


----------



## MrTOOSHORT

Sheyster said:


> Apparently these guys are also offering a 3 x 8-pin cable (not an extension cable) that is compatible with EVGA PSUs:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Amazon.com: Fasgear PCIe 5.0 GPU Power Cable Sleeved 70cm | 16pin (12+4) 12VHPWR Connector for RTX 3090 Ti 4080 4090 | 3x8pin (6+2) PCI-e Male Plugs Compatible for ASUS EVGA Seasonic Modular Power Supply : Electronics
> 
> 
> 
> 
> 
> www.amazon.com
> 
> 
> 
> 
> 
> "16AWG Copper Wire & Up to 600W Output"


Thanks but unfortunately they won't ship to my address.


----------



## DokoBG

I don't want to lock the adapter to EVGA PSUs only. I am planning to give this adapter away later on when I sell the card, so people can use it with any PSU.


----------



## Alelau18

This is nothing new, but yes, you can draw 600W out of a 3x 8-pin adapter (even 2x 8-pin, if the 3 12V wires per 8-pin are wired to the 6 12V pins on the 12VHPWR connector) as long as the required sense wires are grounded. The NV adapter has an IC that senses if only 3 are connected and won't ground those sense pins, locking you at 450W.
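For reference, the sense-pin scheme described above works roughly like this: a sketch based on the commonly cited 12VHPWR SENSE0/SENSE1 table. Treat the exact pin-state-to-wattage mapping (and which physical pin is SENSE0 vs SENSE1) as an assumption to verify against the PCIe CEM 5.0 spec.

```python
# Decode the 12VHPWR SENSE0/SENSE1 sideband pins into a max sustained power
# limit. Mapping follows the commonly cited PCIe CEM 5.0 table; verify
# against the spec before relying on it. True = pin tied to ground.

POWER_LIMITS_W = {
    (True, True): 600,    # both sense pins grounded
    (True, False): 450,   # one grounded, one open (e.g. the NV 3x 8-pin adapter)
    (False, True): 300,
    (False, False): 150,  # both open: minimum power
}

def max_power_w(sense0_grounded: bool, sense1_grounded: bool) -> int:
    """Return the advertised power cap for a given sense-pin state."""
    return POWER_LIMITS_W[(sense0_grounded, sense1_grounded)]

# The Nvidia 3-connector adapter leaves one sense pin open, so the card
# caps itself at 450W; a cable grounding both unlocks the full 600W.
print(max_power_w(True, False))   # 450
print(max_power_w(True, True))    # 600
```

This is why the BIOS alone doesn't decide the draw: the card reads these two pins at the connector and clamps its power limit accordingly.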


----------



## mattskiiau

Is Fasgear reputable? I can get one in a few days here as well.


----------



## jcde7ago

mattskiiau said:


> Is Fasgear reputable? I can get one in a few days here as well.


Reputable in the same way that CableMod and Moddiy are? No, not even close... most people had probably never heard of that or any of the other no-name brands on Amazon until this whole melting-adapter debacle.

Now, are Fasgear and these other no-name brands a worse option than the Nvidia adapter? Probably not, and it looks like at least one or two people in this thread are having no issues with the Fasgear one.

I'm looking at the "Sirlyr"-branded one myself; it looks pretty much the same as the Fasgear, and it's the only one available with next-day shipping in the US on Amazon... but I'm unsure yet, since I've got a week or less of waiting for my CableMod one.


----------



## MrTOOSHORT

mattskiiau said:


> Is Fasgear reputable? I can get one in a few days here as well.


Feels and looks like quality to me.


----------



## mirkendargen

mattskiiau said:


> Is Fasgear reputable? I can get one in a few days here as well.


It's tough to screw up 1:1 pins crimped onto some wire; it would be abundantly obvious if a wire were hanging off, a pin were falling out, etc. It's when you do weird stuff like the Nvidia adapter does that the room for error increases, and then they hide it under a bunch of heat shrink.

The random 90/180-degree power adapters are another place where quality concerns come up, because they generally involve soldering to a PCB that may not be easy to inspect, but there's none of that in these simple cables/adapters.


----------



## mattskiiau

Thanks for the replies. 
I've ordered one, fairly cheap off here, will come tomorrow. 

Will use the Fasgear until the CableMod or Corsair one is delivered in a few weeks/months!


----------



## DokoBG

Can I pitch a Halloween idea?

Can someone make a Halloween costume of melted "4090 cables"? Maybe not exactly 4090 cables, but molex connectors all melted, with a sign reading "4090 melted cables". That would be sick. Send us pictures.


----------



## slayer6288

fitnessgrampacertest said:


> I wouldn't recommend flashing a 600W BIOS onto your Gaming Trio. If you use the 3x 8-pin adapter with a 600W BIOS, you are going to run a serious risk of starting a fire. *DO NOT FLASH A 600W BIOS WHILE USING A 3X 8-PIN ADAPTER.* You absolutely need the 4x 8-pin adapter for the 600W BIOS.
> 
> Even if you did flash the 600W BIOS, a BIOS with a higher TDP rating is going to give you almost no performance increase at all.
> 
> Unfortunately, you purchased one of the worst 4090s in terms of electrical components. I believe the Gaming Trio only has a measly 14 power stages, which is 16 stages fewer than the MSI Suprim. Your performance and overclocking capability is going to be limited by your GPU's lack of VRM phases, regardless of what BIOS you are running.
> 
> Please do not do this. And if you do still feel compelled to flash a 600W BIOS, don't do it unless you are using a 4x 8-pin adapter.
> 
> You get what you pay for


You don't know what you are talking about, which is apparent from your lack of knowledge of the VRMs the card has.


----------



## Hanks552

motivman said:


> yup, the cable that came with it, connected to 3 separate 8 pin pcie cables from my evga 1200 p2 PSU. I checked my connector today, no melting (knock on wood)... too bad I lost the silicon lottery on my card. Memory maxes out at +1100 in afterburner, and core taps out at 3000mhz


I was able to run 3025 mhz and +1550 memory with mine, stock bios


----------



## motivman

Hanks552 said:


> I was able to run 3025 mhz and +1550 memory with mine, stock bios


Most likely returning my card, it's GARBAGE... lol


----------



## Mad Pistol

motivman said:


> most likely returning my card, its GARBAGE... lol


Which one did you get?


----------



## Mad Pistol

I'm currently running my Windforce 4090 on a stress-test loop with Port Royal, as that seems to be the one thing that crashes this card.

+180 core
+1500 memory
1.1V
106% power <—ugh

The core seems to be sitting between 2850-2880MHz at around 68-70C. We will see if it completes this.


----------



## motivman

Mad Pistol said:


> Which one did you get?


4090 trio


----------



## mattskiiau

motivman said:


> 4090 trio


Damn that's a shame. My Trio is able to bench at 3100Mhz at stock.


----------



## motivman

mattskiiau said:


> Damn that's a shame. My Trio is able to bench at 3100Mhz at stock.


wow. I think I must have gotten the WORST 4090 ever, lol


----------



## Hanks552

mattskiiau said:


> Damn that's a shame. My Trio is able to bench at 3100Mhz at stock.


Nice, I still need to tweak mine. I might just keep it and water-cool it. I wanted a TUF, but it seems like I'm going to have to settle for this Trio.


----------



## Hanks552

motivman said:


> wow. I think I must have gotten the WORST 4090 ever, lol


Did you overclock it before doing the bios flash? Just to have a baseline?


----------



## mattskiiau

Hanks552 said:


> Nice, I still need to tweak mine, I might just keep it and water cool it, I wanted a TUF but seems like I’m going to have to settle for this trio


For now I've settled on keeping it at 200W while I wait for a new adaptor.



motivman said:


> wow. I think I must have gotten the WORST 4090 ever, lol


Damn, that sucks; seems I got the opposite from you!
Before OC, it naturally boosts to ~2900MHz.
It's not too stable at 3100MHz but able to bench on it. Will keep it undervolted @ 3000MHz when the new adaptor arrives. 
Not gonna bother with any 600W BIOS either.


----------



## Arizor

Yep, playing a lot of games undervolted at 0.9V at the minute to keep things safely below 300W  

Kind of ridiculous, really, that such a QC oversight occurred at such an obvious point of failure; interested to see NVIDIA's response.

Ah well, the Cablemod should arrive tomorrow with any luck.


----------



## mattskiiau

Arizor said:


> Yep playing a lot of games undervolted at 0.9v at the minute to keep things safely below 300w
> 
> Kind of ridiculous really such a QC oversight occurred in such an obvious point of failure, interested to see NVIDIA's response.
> 
> Ah well, Cablemod should arrive tomorrow with any luck.


I wonder which produces more heat, or which is easier on the adaptor to run? I have no idea about that sort of thing.
For example, 300W @ 0.9V vs 200W @ 1.1V. 

I'm just being overly paranoid lol.


----------



## Arizor

mattskiiau said:


> I wonder what produces more heat or easier on the adaptor to run? I have no idea about that sorta thing.
> For example, 300w @ 0.9v vs 200w @ 1.1v.
> 
> I'm just being overly paranoid lol.


I would assume wattage is the biggest factor, but I honestly have no clue, lots of OGs here that might have insight.

Yep, I'm super paranoid. I've stopped doing the daily checks out of fear of buggering it up by checking (Schrödinger's cable), but I do still reach over and give the connector a feel while I play sometimes...


----------



## mattskiiau

Arizor said:


> Yep, I'm super paranoid. I've stopped doing the daily checks out of fear of buggering it up by checking (Schrodinger cable), but I do reach over and give a feel of the connector as I play sometimes still...


Are you ME? I think you are me.


----------



## Arizor

mattskiiau said:


> Are you ME? I think you are me.


I think there are a lot of us out there! Speaking of which, odd thing: there's a load of BIOSes available, but no one has uploaded the TUF OC one?

Not that it matters much; my "standard" TUF overclocks like a champ, but I am intrigued by the TUF OC BIOS nonetheless...


----------



## LunaP

Looking through but I missed it: how are you guys undervolting? I've been running stock + temp OCs during gaming since install. Haven't received a shipment notice from CableMod yet, though. Hoping the cable's long enough. Anyone know if the preorders for the 90-degree one are up yet, and whether those are deemed safe? I saw warnings somewhere on that.


----------



## mattskiiau

LunaP said:


> Looking through but missed it, how are u guys undervolting, I've been running stock + temp oc's during gaming since install. Haven't received shipment notice from cablemods yet though. Hoping the cables long enough. Anyone know if the preorders for the 90 degree are up yet and if those are deemed safe , saw warnings somewhere on that.


You can easily just slide the power limit to ~50%, which will limit you to around 220W.
If you want to specifically undervolt, you can lock the voltage in the AB curve editor by pressing Ctrl+L on the respective voltage point you want to lock at.


----------



## Arizor

LunaP said:


> Looking through but missed it, how are u guys undervolting, I've been running stock + temp oc's during gaming since install. Haven't received shipment notice from cablemods yet though. Hoping the cables long enough. Anyone know if the preorders for the 90 degree are up yet and if those are deemed safe , saw warnings somewhere on that.


What I do: find a stable overclock. For me, on core, it's +250, but of course YMMV.

So add +250 on core, your usual +MEM, and leave power at 100%.

Press Ctrl+F. This will bring up the Curve Editor. Click where you want the undervolt limit (for me, it's 0.925V).

Hold Shift + click and drag all the way to the right, from one node onwards from your desired undervolt.

Once everything is highlighted, pull it all DOWN _below_ your desired undervolt limit. Hit Apply.


----------



## alitayyab

AvengedRobix said:


> For Who have Zotac AIRO AMP.. Is normal the Red led on between Power connector and the backplate?


Red = normal (Amplify) BIOS
Green = quiet BIOS

From what I understand, it's not possible to turn the light off as of yet.


----------



## mirkendargen

mattskiiau said:


> I wonder what produces more heat or easier on the adaptor to run? I have no idea about that sorta thing.
> For example, 300w @ 0.9v vs 200w @ 1.1v.
> 
> I'm just being overly paranoid lol.


12V is going into the card at the connector regardless of what you set in Afterburner, because that's what your PSU is supplying; the VRMs do the work of bringing the voltage down to whatever you have set. Current is what matters for the connector, and it's based entirely on the power load, given the fixed voltage.
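A quick back-of-the-envelope check of that point, using the illustrative numbers from the question above: connector current depends only on the wattage, since the input rail is fixed at 12V; the core voltage the VRMs regulate to never enters the calculation.

```python
# Connector current is simply power / rail voltage. GPU core voltage is
# irrelevant to the connector, because the VRMs convert 12V down to the
# core voltage downstream of it.

RAIL_VOLTAGE = 12.0

def connector_current_a(card_power_w: float) -> float:
    """Current through the 12VHPWR connector for a given card power draw."""
    return card_power_w / RAIL_VOLTAGE

# The "300W @ 0.9V core" vs "200W @ 1.1V core" comparison from the thread:
print(connector_current_a(300))  # 25.0 A -> more stress on the adapter
print(connector_current_a(200))  # ~16.7 A -> less, despite the higher core voltage
```

So of the two scenarios asked about, the 300W one is harder on the adapter regardless of the undervolt, which matches the "wattage is the biggest factor" guess.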


----------



## N19htmare666

Does anyone have the new modern warfare 2? If yes could you try running the benchmark at 4k and post the results?


----------



## J7SC

Arizor said:


> I would assume wattage is the biggest factor, but I honestly have no clue, lots of OGs here that might have insight.
> 
> Yep, I'm super paranoid. I've stopped doing the daily checks out of fear of buggering it up by checking (Schrodinger cable), but I do reach over and give a feel of the connector as I play sometimes still...


..._"just because you're paranoid doesn't mean there isn't a cable out to get you burn you"_

I have yet to unplug mine - that will happen when I get the replacement cables - but I do reach over and feel the connector - warmer than I'd like but not to the point that I have to let go. Then again, plastic isn't that good at heat transfer anyway, and I do have a fan pointing at the backplate that also cools the connector. 

I will be really happy when both the new cables and the water-block arrive...nobody say nuthin about supply chain issues


----------



## mirkendargen

J7SC said:


> ..._"just because you're paranoid doesn't mean there isn't someone a cable out to get you burn you"_
> 
> I have yet to unplug mine - that will happen when I get the replacement cables - but I do reach over and feel the connector - warmer than I'd like but not to the point that I have to let go. Then again, plastic isn't that good at heat transfer anyway, and I do have a fan pointing at the backplate that also cools the connector.
> 
> I will be really happy when both the new cables and the water-block arrive...nobody say nuthin about supply chain issues


How are you really getting anything from feeling the connector? On mine it's warm... but everything around there is warm, because it's surrounded by an ungodly huge heatsink radiating heat, with fans blowing the hot air on everything. I'd expect anything in that vicinity to be warm whether it's producing any heat on its own or not. Once I have a waterblock it should be easier to tell.

Has your block from Formulamod shipped yet? Mine hasn't, I'm wondering if they actually dropship from Bykski or something and the supply isn't real yet.


----------



## mattskiiau

mirkendargen said:


> 12V is going into the card at the connector regardless of what you set in Afterburner because that's what your PSU is supplying, the VRM's are doing the work to bring the voltage to whatever you have set. Current is what matters for the connector, which is going to be based entirely on power load given the fixed voltage.


So limiting power and wattage would be the best way to lower current/heat to the connector? Sorry if I misunderstand.


----------



## J7SC

mirkendargen said:


> How are you really getting anything fron feeling the connector? On mine it's warm....but everything around there is warm cause it's surrounded by an ungodly huge heatsink radiating heat with fans blowing the hot air on everything. I'd expect anything in that vicinity to be warm whether it's producing any heat on it's own or not.. Once I have a waterblock it should be easier to tell.
> 
> Has your block from Formulamod shipped yet? Mine hasn't, I'm wondering if they actually dropship from Bykski or something and the supply isn't real yet.


...my 4090 is vertically mounted and not _in _the case but in front of it (PCIe 4.0 riser), and just touching the connector shows that it gets warm (not scalding hot, though). As to my Formulamod water block order, mine went from 'processing' to 'waiting for shipping' earlier today.


----------



## cheddardonkey

N19htmare666 said:


> EDIT: the 'min' temps on the superposition screens are a lot lower in the original bios. What results do you get after forcing the fans on?
> 
> 
> Over to the other experts in this forum. I'm also interested in the answer.
> 
> 
> So with all things being equal how can increasing the power limit on a non thermally throttled card reduce performance? What are we missing?


Forcing the fans up will drop temps by roughly 4°C with the Suprim BIOS. On the Aorus BIOS the fans stay running, but at low speed for most of the test duration.

After some in-game testing, I am also seeing the Aorus BIOS perform better in temps (similar to Superposition) and slightly better FPS (+5-7) while holding a slightly lower clock. I'd like to get a better analysis than what I've been able to put together, but it seems a mod to the Aorus BIOS allowing more power/voltage control would be the start.


----------



## mirkendargen

mattskiiau said:


> So limiting power and wattage would be the best way to lower current/heat to the connector? Sorry if I misunderstand.


They're the same thing. Lowering the power limit will set a wattage ceiling. From there you could also lower the voltage to try and get better performance within that wattage ceiling.


----------



## wheelsun87

Mad Pistol said:


> Im currently running my Windforce 4090 on a stress test loop with Port Royal, as that seems to be the one thing that crashes this card.
> 
> +180 core
> +1500 memory
> 1.1V
> 106% power <—ugh
> 
> Core seems to be sitting between 2850-2880 around 68-70c. We will see if it completes this.


Flash the bios to the Gigabyte Gaming OC, you'll be able to run 133% power limit using MSI Afterburner!
I'm able to +300 core clock and +1675 memory clock completely stable, +1700 starts artifacting

Edit: Even with the power slider moved up to 133%, I've only been able to get the card to draw 540W max. Maybe I should try some different benchmarks...? Although from what I'm seeing in this thread, it's pointless to try and reach 600W in terms of efficiency and performance.


----------



## RaMsiTo

N19htmare666 said:


> Does anyone have the new modern warfare 2? If yes could you try running the benchmark at 4k and post the results?


gpu stock, preset extreme, dlss off


----------



## Azazil1190

Guys, I just flashed my TUF OC with the Strix OC BIOS.
Everything is OK, but in Afterburner the max I can go is +120% PL. Is this normal? On the TUF OC BIOS the max is 133%.


----------



## Nico67

Azazil1190 said:


> Guys just flash my tuf oc with strix oc bios.
> Everything is ok but in afterburner i can go max +120pl .Is this normal ? To tuf oc bios the max is 133


Yes, base is 500W, so 500W x 120% = 600W.


----------



## jl434

mattskiiau said:


> Thanks for the replies.
> I've ordered one, fairly cheap of here, will come tomorrow.
> 
> Will use the Fasgear until the Cablemod or Corsair one is delivered in a few weeks/months!


Did u check your PSU in the support list?
I notice they have two types of 12VHPWR cables, which support different brands of PSU:

ASUS / EVGA / Seasonic








Fasgear PCI-e Gen 5.0 Power Cable for EVGA/ASUS/Seasonic Power Supply - 70cm 12VHPWR 16pin (12+4) to PCIE 3x8pin (6+2) Male Cable - for RTX 3090 Ti & 4080 4090 Graphics Card



www.fasgear.com





Corsair / Great Wall / Thermaltake / NZXT








Fasgear PCIe 5.0 GPU Power Cable Only for Corsair/NZXT/Great Wall/Thermaltake Modular Power Supply - 70cm | 16pin(12+4) 12VHPWR Connector for RTX 3090 Ti 4080 4090 |



www.fasgear.com





And the one selling in Amazon AU is for ASUS/EVGA/Seasonic


----------



## Arizor

Great vid from GN (as usual) on the adaptor issue


----------



## LunaP

Arizor said:


> Great vid from GN (as usual) on the adaptor issue


The 150 vs 300 cable part was probably the most interesting find. I'm tempted to check mine.


----------



## narrn2761

I have available instock to me locally to purchase:

TUF OC
Zotac Extreme AMP
iChill X3

I don't like the looks of the TUF OC and Extreme AMP, but I dig the RGB on the iChill X3. Should I get it, or should I really pay the extra few pennies for the TUF OC? AMP Extreme is the cheapest out of these 3 at my local shops.


----------



## GAN77




----------



## Tideman

Taken out my Gaming OC for the moment until my custom cable arrives. No damage so far, but I only got it 3 days ago. My TUF was used for nearly 2 weeks and has no signs of any issue either.

Don't want to take a chance on it being one of the weaker adapters.

I'd much rather have the official corsair cable but no chance of that..


----------



## LunaP

What's the normal turnaround time for shipment once you order the cable? Figured it's the weekend, so I won't see anything till Monday since I ordered Thursday morning.


----------



## mattskiiau

jl434 said:


> Did u check your PSU in the support list?
> I notice they have two types of 12VHPWR cables, which support different brands of PSU:



I just purchased the adaptor to hold me over while I wait for the official corsair version.








Fasgear PCI-e 5.0 Extension Cable 30cm/1ft 16 Pin(12+4) Male to PCIE 5.0 3x8Pin(6+2) Female Sleeved Extension Cable with 4 Cable Combs 12VHPWR 16AWG Cable Compatible for GPU 3090Ti & RTX 4080 4090 : Amazon.com.au: Computers



www.amazon.com.au


----------



## Pollomir

narrn2761 said:


> I have available instock to me locally to purchase:
> 
> TUF OC
> Zotac Extreme AMP
> iChill X3
> 
> I don't like the looks of the TUF OC and Extreme AMP, but I dig the RGB on the iChill X3. Should I get it, or should I really pay the extra few pennies for the TUF OC? AMP Extreme is the cheapest out of these 3 at my local shops.


The Inno3D 4090s are the most basic in terms of PCB: 14 + 3 power stages.


----------



## biigshow666

N19htmare666 said:


> Does anyone have the new modern warfare 2? If yes could you try running the benchmark at 4k and post the results?


here's one more for ya. extreme preset + 2.25x dldsr + ultra image scaling = render resolution of 4432x1856.


----------



## newls1

Please help me out with a simple question... is there a preferred brand/model of 4090 to get this go around? I feel naked without EVGA now, so I'm looking for knowledgeable input here. My local MC has the MSI Gaming Trio on the shelf, but I'm not liking the low power limit (yes, I can flash the BIOS to something else, but I don't feel I want to when spending this amount of money). I'd appreciate any input here on which cards are the preferred ones for the 40xx series GPUs. Thank you.


----------



## Sheyster

Did a quick n dirty Superposition 4K run with my Giga-G-OC, 3030 core, +1000 mem, stock voltage. I'm done OC'ing it until my CableMod gets here in 2 weeks, running the card at stock settings until then.

I'm not publishing this, but it's good enough for #9 behind Kedarwolf. FWIW, this 13700K is running stock ASUS optimized defaults and RAM is set to the default XMP profile (6400C32). Seems like even at stock settings the 13700K is a nice chip!


----------



## Mad Pistol

newls1 said:


> please help me out with a simple question.... IS there a preferred brand/model of 4090 to get this go around? I feel naked with out eVGA now, so im looking for knowledgable input here. My local MC has MSI gaming trio on the shelf, but not liking the low power limit (yes I can flash bios to something else, but dont feel i want to when spending this amount of money) . Id appreciate any input here on what card/s are the preferred ones for the 40xx series gpus. thank you


Everyone seems to want the RTX 4090 FE, so I would say that's the preferred model to get.


----------



## Sheyster

Mad Pistol said:


> Everyone seems to want the RTX 4090 FE, so I would say that's the preferred model to get.


I would have bought one for sure, if one were available. I would also suggest the Gigabyte Gaming OC and ASUS TUF (non OC) as good choices. All of these have a 600w PL BIOS.


----------



## Mad Pistol

LunaP said:


> The 150 vs 300 cable part was probably the most interesting find. I'm tempted to check mine.


I'm very curious now if the burned cables were all 150V rated. If so, problem solved. Otherwise, the mystery continues.


----------



## Sayenah

All, Corsair has this cable in stock https://www.corsair.com/us/en/Categ...-Supplies/12-pin-GPU-Power-Cable/p/CP-8920274

Is it drastically different than the VH12PWR one which they have under a different SKU?


----------



## mirkendargen

Sayenah said:


> All, Corsair has this cable in stock https://www.corsair.com/us/en/Categories/Products/Accessories-|-Parts/PC-Components/Power-Supplies/12-pin-GPU-Power-Cable/p/CP-8920274
> 
> Is it drastically different than the VH12PWR one which they have under a different SKU?


Yes, that cable will not work, different connector. It's for 3090/3080 FE cards only.


----------



## ZealotKi11er

Mad Pistol said:


> Everyone seems to want the RTX 4090 FE, so I would say that's the preferred model to get.


Mainly because it can fit in more cases and it's usually cheaper. Here in Canada the FE is at least $150 CAD cheaper than the cheapest AIB model. It also has a 600W BIOS and WB support.


----------



## Pollomir

LunaP said:


> The 150 vs 300 cable part was probably the most interesting find. I'm tempted to check mine.


I already did, couldn't resist. 300V 14AWG 105C. Asus Strix OC.


----------



## yzonker

Pollomir said:


> I already did, couldnt't resist. 300V 14AWG 105C. Asus Strix OC.


Does that change anything other than the insulation thickness? For example,






Comparison Chart For UL Wires & Cables - Standard Wire & Cable Co.


Standard Wire & Cable Co. design, engineer, and manufacture commercial, military, and industrial electronic and electrical wire and cable.



www.standard-wire.com


----------



## mirkendargen

Pollomir said:


> I already did, couldnt't resist. 300V 14AWG 105C. Asus Strix OC.


Oh what the hell, I never used the thing and almost certainly never will anyway. Also 300V 14AWG 105C (I actually peeled it back at one of the 8-pin connectors rather than the 12VHPWR connector, since there's less going on there).

I don't think the 150V vs. 300V wire changes anything functionally for carrying 12V, but it could be an indication of a different source/factory that's also doing something else wrong in the construction that *does* change the functionality.


----------



## narrn2761

Pollomir said:


> The Inno 3D 4090s are the most basics in terms of PCB. 14 + 3 power stages.


What about the Zotac AMP Extreme?


----------



## Pollomir

yzonker said:


> Does that change anything other than the insulation thickness? For example,
> 
> 
> 
> 
> 
> 
> Comparison Chart For UL Wires & Cables - Standard Wire & Cable Co.
> 
> 
> Standard Wire & Cable Co. design, engineer, and manufacture commercial, military, and industrial electronic and electrical wire and cable.
> 
> 
> 
> www.standard-wire.com





mirkendargen said:


> Oh what the hell, I never used the thing and almost certainly never will anyway. Also 300V 14AWG 105C (I actually pealed it back at one of the 8pin connectors rather than the 12VHPWR connector since there's less going on there)
> 
> I don't think the different 150V vs. 300V wire changes anything functionally for carrying 12V, but it could be an indication of a different source/factory that's also doing something else wrong in the construction that *does* change the functionality.


In the GN video they mention that the cable should be 300V; at least that is what they have been told by some Nvidia partners. Just wanted to check if my cable was within the "specs" or was one of the odd ones.

Maybe the 150V ones were presenting problems and were recalled before launch and some got out into the wild, or it was a specific batch of those cables that had soldering problems. The sooner more info is released by users, the earlier they can find a pattern.



narrn2761 said:


> What about the Zotac AMP Extreme?
> 
> Zotac AMP Extreme AIRO


24+4.


----------



## originxt

I scored 26 653 in Port Royal


Intel Core i9-10980XE Extreme Edition Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com





#2 on 10980xe's with 4090s lol, though it doesn't mean much.

I don't know how some of you guys are breaking 27-28k.

Quick and dirty OC. Definitely capping out memory though.















4090 FE. Running off of corsair's cable. No ReBar.

Also, someone please tell me how to get past the 3k barrier on voltage control lol.


----------



## Pollomir

originxt said:


> I scored 26 653 in Port Royal
> 
> 
> Intel Core i9-10980XE Extreme Edition Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> #2 on 10980xe's with 4090s lol, though it doesn't mean much.
> 
> I don't know how some of you guys are breaking 27-28k.
> 
> Quick and dirty OC. Definitely capping out memory though.
> 
> View attachment 2579574
> View attachment 2579575
> 
> 
> 4090 FE. Running off of corsair's cable. No ReBar.
> 
> Also, someone please tell me how to get past the 3k barrier on voltage control lol.
> 
> View attachment 2579576


ReBAR on. You'll gain several points. Also max performance in power profile in the drivers.

About the 3k graph.



alitayyab said:


> Close the App. Open "MSIAfterburner.cfg" (located in the installation folder). Look for "VFCurveEditorMaxFrequency" and enter a value like 4000 or 3500 to extend the frequency axis. Restart App.
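

For reference, the edited line in the config file looks roughly like this (a sketch; the `3500` is just an example value, and where the key sits in the file may vary by Afterburner version):

```ini
; MSIAfterburner.cfg - close Afterburner before editing.
; Raises the curve editor's frequency axis ceiling from the default 3000 MHz.
VFCurveEditorMaxFrequency=3500
```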


----------



## DokoBG

Got my cable that i ordered yesterday from Amazon. Just installed it. The cable feels MUCH more robust than the Nvidia one.















Another thing I noticed is that the pins of the connector on the Amazon cable are solid squares with no slits, whereas the pins of the Nvidia adapter have two slits at the top and bottom of each pin.


----------



## originxt

Pollomir said:


> ReBAR on. You'll gain several points. Also max performance in power profile in the drivers.
> 
> About the 3k graph.


Cool, thanks for the ReBar tip. Hopped onto my standard Win10 install and got a much improved score. Silicon limited at this point I think, or maybe a power issue. Only hitting about 510 watts, but I know this can go 600+ via the fuzzy donut.









I scored 27 338 in Port Royal


Intel Core i9-10980XE Extreme Edition Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com





I am unsure how I missed the reply for my 3k limit, changed and fixed it.


----------



## vigorito

How good is this cable vs. this one?









ATX 3.0 PCIe 5.0 600W Triple 8 Pin to 12VHPWR 16 Pin Power Cable


Buy ATX 3.0 PCIe 5.0 600W Triple 8 Pin to 12VHPWR 16 Pin Power Cable for $24.99 with Free Shipping Worldwide (In Stock)




www.moddiy.com





Vs.

https://www.etsy.com/listing/124820...ick_sum=f04f5b5d&ref=shop_home_active_2&crt=1


----------



## KedarWolf

vigorito said:


> How this cable is good vs. this one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ATX 3.0 PCIe 5.0 600W Triple 8 Pin to 12VHPWR 16 Pin Power Cable
> 
> 
> Buy ATX 3.0 PCIe 5.0 600W Triple 8 Pin to 12VHPWR 16 Pin Power Cable for $24.99 with Free Shipping Worldwide (In Stock)
> 
> 
> 
> 
> www.moddiy.com
> 
> 
> 
> 
> 
> Vs.
> 
> https://www.etsy.com/listing/124820...ick_sum=f04f5b5d&ref=shop_home_active_2&crt=1


That one on Etsy is a 12-pin cable and won't work on 4090s. See the picture. He may have the wrong picture though; the listing says 12+4-pin cable, which is 16-pin.

17 gauge is better than 18 gauge (thicker), and silicone is great because it's much more flexible.


----------



## DokoBG

I also ordered this ModDIY one and also the CableMod... The ModDIY looks like it is similar quality to the Amazon one, but I can't tell until I get it to compare.


----------



## newls1

Jesus, I can already tell it's gonna be forever before I find a Gigabyte or FE card in stock anywhere anytime soon... this is so damn frustrating. Took 6 months for me to get a 3090, but it was easy to get a 3090 Ti; now getting a 4090 is as hard as getting a 3090 was last year. Damn this industry and all the bastards that scalp.


----------



## Jordyn

DokoBG said:


> I also ordered this moddiy and also the CableMod...  The moddiy looks like it is similar quality as the Amazon one but i cant tell until i get it to compare.


How hot does the connection get on the Amazon cable under load?


----------



## vigorito

KedarWolf said:


> That one on Etsy is a 12-pin cable and won't work on 4090s. See the picture. He may have the wrong picture though, it says 12x4-pin cable which is 16-pin.
> 
> 17 gauge is better than 18 gauge, thicker and silicon is great because it's much more flexible.


The one on ModDIY is using 16AWG. Not sure about the difference in terminals, but I think the one from ModDIY is good too.


----------



## DokoBG

Just barely warm, my card is at 480W bios. I haven't flashed any of that 600W stuff.


----------



## vigorito

Does anyone have a Strix with the power slider at 133%? I can only do 120% in MSI Afterburner 4.6.5 beta 2, using the Seasonic OEM 16-pin to 2x8-pin EPS cable.
And which is the real GPU power usage: GPU power or rail power?


----------



## mirkendargen

vigorito said:


> Does anyone have strix with power slider to 133%,i can only do 120% in msi afterburner 465 beta 2,using seasonic oem cable 16 to 2x8 eps pin
> and what is real gpu power usage gpu usage or rail power usage


Strix 100% is 500w, 120% of that is 600w, it's correct.


----------



## vigorito

I'm seeing 450W in the info, not 500W.


----------



## LunaP

newls1 said:


> jesus i can already tell its gonna be forever before i find a gigabyte or FE card in stock anywhere anytime soon.. this is so damn frusturating. Took 6 months for me to get a 3090, but was easy to get a 3090ti, now getting a 4090 is as hard as getting a 3090 like last year. Damn this industry and all the bastards that scalp


Join the Discord; they have a BB SKU locator bot that will inform you if your local Best Buys + Newegg/Amazon get them in stock (offline for BB). People have been getting theirs thanks to this; it's how I got mine.

I linked it earlier, will relink when back at my PC.


----------



## originxt

If someone is in the oc, CA area, I can cancel my Fe order at a specific store to pick up. It's ready for pick up right now but I already bought a previously canceled FE.


----------



## AdamK47

newls1 said:


> jesus i can already tell its gonna be forever before i find a gigabyte or FE card in stock anywhere anytime soon.. this is so damn frusturating. Took 6 months for me to get a 3090, but was easy to get a 3090ti, now getting a 4090 is as hard as getting a 3090 like last year. Damn this industry and all the bastards that scalp


It doesn't help that Nvidia is limiting the 40 series release in order to move existing 30 series stock.


----------



## Mad Pistol

It appears this is the 24/7 OC settings for my 4090 Windforce.


----------



## N19htmare666

Mad Pistol said:


> It appears this is the 24/7 OC settings for my 4090 Windforce.
> 
> View attachment 2579632


Too late to send it back?


----------



## Mad Pistol

N19htmare666 said:


> Too late to send it back?


I kind of figured it wouldn't be great, so I'm ok with it. I'm not planning to overclock it to the moon. In fact, I normally kept my 3080 FE at stock. The only reason I'm not doing that here is because these cards (even the crappy Windforce) have monster coolers that keep these GPUs under 70c.


----------



## DokoBG

Also, what would you gain with a different card? 50MHz core and 300MHz memory if you are lucky? That's like 3 FPS - who cares.


----------



## Mad Pistol

DokoBG said:


> Also what would you gain with a different card ? 50mhz core and 300mhz memory if you are lucky ?? That's like 3FPS - who cares.


Exactly. 30 Mhz on a card that can boost to 2900+ is... 1-2% max? For the amount of money I would have to spend to secure a better card + silicon lottery, it wouldn't even be close to worth it.

I thought about selling it for a 4090 FE, but that's honestly a stupid idea since I'm not planning on putting it on water or doing OC competitions. I thought I would be disappointed getting a lowly Windforce model, but no. It's a 4090, and it's pretty good.


----------



## yzonker

Mad Pistol said:


> Exactly. 30 Mhz on a card that can boost to 2900+ is... 1-2% max? For the amount of money I would have to spend to secure a better card + silicon lottery, it wouldn't even be close to worth it.
> 
> I thought about selling it for a 4090 FE, but that's honestly a stupid idea since I'm not planning on putting it on water or doing OC competitions. I thought I would be disappointed getting a lowly Windforce model, but no. It's a 4090, and it's pretty good.


I tested mine in CP2077, 4K, max settings. It wasn't much. I should have taken screenshots, but anyway I tested memory first, then core with the memory OC still set:

Mem +1800: 3.3% over stock
Core +195: 1.7% over stock with +1800 mem

Yes, adding almost 200mhz to the core was less than 2%.


----------



## N19htmare666

yzonker said:


> I tested mine in CP2077. 4k, max settings. It wasn't much. I should have taken screenshots, but anyway I tested memory first, then core with the memory OC still set,
> 
> Mem +1800: 3.3% over stock
> Core +195: 1.7% over stock with +1800 mem
> 
> Yes, adding almost 200mhz to the core was less than 2%.


And think of all the money you could save in energy bills if you have a good chip and decide to undervolt without degrading performance. You could argue that buying a good 4090 is really about saving money.


----------



## Mad Pistol

N19htmare666 said:


> And think of all the money you could save in energy bills of you have a good chip and decide to under volt without degrading performance. You could argue that buying a good 4090 is really about saving money


Buying a $1600 GPU to save money.


----------



## DokoBG

I am curious to see what kind of undervolt you would need to "save money" off the $500 price premium of a top-bin Strix, for example, that clocks like a GOD... 

Anyways, this generation is just not clocking that great. Gamers Nexus did an LN2 overclock of a Suprim X and the poor thing scored like 10-12% faster than a regular air-cooled card. Underwhelming for such extreme cooling.


----------



## N19htmare666

Mad Pistol said:


> Buying a $1600 GPU to save money.


200w saving could result in a £600 saving per year. Only problem is I would need to be gaming all the time. It seems the best thing I could do for the environment is start gaming 24 hours a day for 365 days a year. Hmmm 🧐 3 years of 'work' ahead
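
For what it's worth, the £600 figure only works out under those 24/7 assumptions. A rough check (the ~£0.34/kWh tariff is my assumption, not from the post):

```python
# Back-of-envelope yearly cost of a constant power saving.
# Assumes 24/7 load and a UK-ish tariff of £0.34/kWh (both assumptions).
def yearly_cost(watts_saved, price_per_kwh=0.34, hours_per_year=24 * 365):
    kwh_saved = watts_saved / 1000 * hours_per_year
    return kwh_saved * price_per_kwh

print(f"£{yearly_cost(200):.0f} per year")  # roughly £600 at that tariff
```

At a more realistic 4 hours of gaming a day, the same 200W saving is about one sixth of that.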


----------



## Mad Pistol

N19htmare666 said:


> 200w saving could result in a £600 saving per year. Only problem is I would need to be gaming all the time. It seems the best thing I could do for the environment is start gaming 24 hours a day for 365 days a year. Hmmm 🧐 3 years of 'work' ahead


If you're going to do this, at least Twitch stream it. We all deserve to see you suffer... I mean, enjoy gaming!


----------



## BattlePhenom

Mad Pistol said:


> It appears this is the 24/7 OC settings for my 4090 Windforce.
> 
> View attachment 2579632













Looks like this is 24/7 for me on Gigabyte Gaming OC.


----------



## Mad Pistol

BattlePhenom said:


> View attachment 2579648
> 
> 
> 
> Looks like this is 24/7 for me on Gigabyte Gaming OC.


Have you tested those settings on Port Royal? That seems to be the great barrier to a 24/7 GPU overclock. If those settings pass on Port Royal, you've got a good card.

Also, you've got your fan to 90% manually? Shame! That's cheating.


----------



## DokoBG

Mad Pistol said:


> If you're going to do this, at least Twitch stream it. We all deserve to see you suffer... I mean, enjoy gaming!


Yeah, but remember 4090s crash/reset the driver when you watch Twitch unless you have "Prefer Maximum Performance" set in the NV driver....


----------



## Arizor

BattlePhenom said:


> View attachment 2579648
> 
> 
> 
> Looks like this is 24/7 for me on Gigabyte Gaming OC.


Yeah this is pretty much my TUF. I _can_ get the memory to go 1800 stable, but frankly it worries me, and the hot spot can climb to 90C, so 1500 does me fine...!


----------



## LunaP

Mad Pistol said:


> Buying a $1600 GPU to save money.


coming from 2080ti's in SLI at a high OC I'm technically saving money here lol


----------



## Arizor

yzonker said:


> I tested mine in CP2077. 4k, max settings. It wasn't much. I should have taken screenshots, but anyway I tested memory first, then core with the memory OC still set,
> 
> Mem +1800: 3.3% over stock
> Core +195: 1.7% over stock with +1800 mem
> 
> Yes, adding almost 200mhz to the core was less than 2%.


Yep I get pretty much the exact same result.

Basically, everything set to RT Ultra preset, DLSS off, with max OC enabled (+1800 mem, +280 on the GPU for 3030mhz), I get 46.something frames. This draws at max, around 515W.

Alternatively, I have my 0.95 undervolt, clock set to 2730, mem +1500. I get 43.something frames. This draws, max, 370W.

I know which one I'm going for...
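
The trade-off in numbers, a quick sketch using the figures above (46 fps at ~515W max OC vs 43 fps at ~370W undervolted; the wattages are maximums I observed, not averages):

```python
# Perf-per-watt comparison of the two profiles described above.
def fps_per_watt(fps, watts):
    return fps / watts

oc = fps_per_watt(46, 515)   # max OC profile
uv = fps_per_watt(43, 370)   # 0.95V undervolt profile

print(f"max OC: {oc:.4f} fps/W, undervolt: {uv:.4f} fps/W")
print(f"undervolt efficiency gain: {uv / oc - 1:.0%}")  # ~30% better fps/W
```

About 7% fewer frames for roughly 30% better efficiency, which is why the undervolt wins for me.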


----------



## Demonkevy666

LunaP said:


> coming from 2080ti's in SLI at a high OC I'm technically saving money here lol


what cpu were those RTX 2080 ti's running on?
I'm betting there weren't many games you played that used SLI or mGPU?


----------



## yzonker

Arizor said:


> Yep I get pretty much the exact same result.
> 
> Basically, everything set to RT Ultra preset, DLSS off, with max OC enabled (+1800 mem, +280 on the GPU for 3030mhz), I get 46.something frames. This draws at max, around 515W.
> 
> Alternatively, I have my 0.95 undervolt, clock set to 2730, mem +1500. I get 43.something frames. This draws, max, 370W.
> 
> I know which one I'm going for...


This is the OCN forum, so you've abandoned the undervolt for another 3fps. Got it.


----------



## LunaP

Demonkevy666 said:


> what cpu were those RTX 2080 ti's running on?
> I'm betting there weren't many games you played that used SLI or mGPU?


10980XE, played a lot of games, all SLI hungry, and others just had to add the flags in profiler, majority of the ones I played long term ( and still do ) loved it. Was the major reason why I went SLI. the 100% scaling due to nvlink was the greatest part of that series. Was a major reason why I didn't upgrade to ampere since they removed backwards SLI support for older games and anything really ( minus benchmarking) since mine still outperformed friends 3090's by a decent amount in the games I played. ESPECIALLY with 3rd party plugins like Gshade involved at 4k.


----------



## J7SC

LunaP said:


> 10980XE, played a lot of games, all SLI hungry, and others just had to add the flags in profiler, majority of the ones I played long term ( and still do ) loved it. Was the major reason why I went SLI. the 100% scaling due to nvlink was the greatest part of that series.


I also still have my 2x 2080 Ti system (w/ Threadripper 2950X, all w-cooled)...GPUs alone will swallow a combined 760 W max...4090 is an efficiency upgrade...yeah, that's why I did it.


----------



## Arizor

Cablemod just arrived and installed. Can’t have it _exactly_ straight in a line from socket, but hopefully this tiny horizontal lean isn’t an issue…!


----------



## yzonker

Arizor said:


> Cablemod just arrived and installed. Can’t have it _exactly_ straight in a line from socket, but hopefully this tiny horizontal lean isn’t an issue…!


If it is then we're screwed probably. Can't expect to have cables running dead straight in a PC.


----------



## jl434

mattskiiau said:


> I just purchased the adaptor to hold me over while I wait for the official corsair version.
> 
> 
> Fasgear PCI-e 5.0 Extension Cable 30cm/1ft 16 Pin(12+4) Male to PCIE 5.0 3x8Pin(6+2) Female Sleeved Extension Cable with 4 Cable Combs 12VHPWR 16AWG Cable Compatible for GPU 3090Ti & RTX 4080 4090 : Amazon.com.au: Computers
> 
> www.amazon.com.au


I see; I thought you were going to order a cable rather than the converter.


----------



## jl434

Arizor said:


> Cablemod just arrived and installed. Can’t have it _exactly_ straight in a line from socket, but hopefully this tiny horizontal lean isn’t an issue…!


When did you order? And how long did it take to arrive in AU?


----------



## Demonkevy666

LunaP said:


> 10980XE, played a lot of games, all SLI hungry, and others just had to add the flags in profiler, majority of the ones I played long term ( and still do ) loved it. Was the major reason why I went SLI. the 100% scaling due to nvlink was the greatest part of that series. Was a major reason why I didn't upgrade to ampere since they removed backwards SLI support for older games and anything really ( minus benchmarking) since mine still outperformed friends 3090's by a decent amount in the games I played. ESPECIALLY with 3rd party plugins like Gshade involved at 4k.


Did you find any newer ray-tracing games that worked with them?
I play some old games too, and a few new ones here and there, just not at launch. I've played Control, The Medium, and Deliver Us the Moon. The Medium's dual reality is brutal on GPUs, with ray tracing running on both sides of the split. It claims to support SLI/mGPU too.


----------



## Arizor

jl434 said:


> When did you order? and how long it take to arrive AU?


Ordered October 11th, arrived today. It was a custom order so that takes a lot longer, and I didn't pay the silly "CONGRATULATIONS YOU'VE BEEN SELECTED FOR EXPRESS FOR $50 EXTRA!" fee, so you can get them much quicker if you pay the ridiculous upcharge.


----------



## MrTOOSHORT

Arizor said:


> Cablemod just arrived and installed. Can’t have it _exactly_ straight in a line from socket, but hopefully this tiny horizontal lean isn’t an issue…!


That's fine there.

The Nvidia adapter has wires soldered to a foil plate just before the connector, which can break if you bend the cable sideways too much. The CableMod cable has the wires run directly to the pins.


----------



## Arizor

MrTOOSHORT said:


> That's fine there.
> 
> The Nvidia adapter has wires soldered to a foil plate just before the connector, which can break if you bend the cable sideways too much. The CableMod cable has the wires run directly to the pins.


Thanks for the insight mate. Funny how even now, still paranoid about these bloody connectors


----------



## mattskiiau

jl434 said:


> I see, I thought you are going to order a cable rather than the converter.


Yeah, no Corsair version on Amazon AU, so I settled for the adapter. Anything to get rid of the Nvidia one at this point.


----------



## sugi0lover

Here is a PUBG QHD Ultra test with the 4090, 13900K, and 8000 CL32 RAM OC.


sugi0lover said:


> My friend asked me to do this bench with my 13900K, 8000 c32 & 4090.
> Here is the replay file if you want to test: PUBG Replay.zip (drive.google.com)
> 
> [Testing Method and Option]
> ○ warming up from 15:00 to 16:00 / benching from 16:00 to 23:52 per my friend's request
> 
> Resolution : 2560 x 1440 (QHD)
> Option : Ultra
> Result : Avg 411.1 / Min 263.9 / Max 506.5 / 1% Low 285.3 / 0.1% Low 220.0
> 
> [PC setup]
> ○ CPU : 13900K / P Cores 6.0Ghz / E Cores off / Cache 5.0Ghz
> ○ VGA : RTX 4090 (watercooled & liquid metal applied)
> ○ Ram OC : 8000-32-45-45-30-470-2T
> ○ MB : Z790 Apex
> ○ Cooling : MO-RA3 420 PRO + Noctua NF-A14 Industrial / only mora out on balcony


----------



## EarlZ

I am still wondering why there is no data for the Aorus Master 4090.


----------



## bmagnien

Reply from FormulaNod about GigaOC Bykski blocks:
Hi,
Sorry for the late reply, we rest on the weekend.

Since this product N-GV4090AORUS-X Bykski was released on the 28th, they are still rushing to produce the first few batches, so it will take some time to wait.

Since the time you submit your order is at the forefront of this product, when we receive the product, we will give you priority to send it.
When the package is sent, we will have an automatic email to notify you about the tracking.

Sorry for the inconvenience.


----------



## changboy

I just checked at CableMod, and a 4x 8-pin to 12-pin cable is around 70 USD, so this cable could cost me $100 CAD??? ***

I bought a complete RED set cable for my EVGA 1600P 2 for around 125$ !


----------



## mirkendargen

bmagnien said:


> Reply from FormulaNod about GigaOC Bykski blocks:
> Hi,
> Sorry for the late reply, we rest on the weekend.
> 
> Since this product N-GV4090AORUS-X Bykski was released on the 28th, they are still rushing to produce the first few batches, so it will take some time to wait.
> 
> Since the time you submit your order is at the forefront of this product, when we receive the product, we will give you priority to send it.
> When the package is sent, we will have an automatic email to notify you about the tracking.
> 
> Sorry for the inconvenience.


Aliexpress claims I automatically get refunded if my block isn't delivered by the 8th so....maybe I'm gonna get a free block? Mine hasn't shipped yet either.


----------



## BattlePhenom

Mad Pistol said:


> Have you tested those settings on Port Royal? That seems to be the great barrier to a 24/7 GPU overclock. If those settings pass on Port Royal, you've got a good card.
> 
> Also, you've got your fan to 90% manually? Shame! That's cheating.


Funnily enough I was able to get +240 on core and +1575 to pass on PR, but now it seems I'm losing ground and can only get +225 core and +1525 to pass. I guess it's "broken" in now.


----------



## Sayenah

BattlePhenom said:


> Funnily enough I was able to get +240 on core and +1575 to pass on PR, but now it seems I'm losing ground and can only get +225 core and +1525 to pass. I guess it's "broken" in now.


I have +200 on core and +1700 on memory stable. Asus Strix. Yet my best PR score has been 28,098 in Port Royal.

What gives? I should be doing better than that. Or I wish I was hah

Which card is yours? EDIT: Giga OC


----------



## J7SC

I just checked the supply situation across three major chains here in Canada (incl. Newegg.ca > 'sold by', only to skip the scalpers); no new 4090s offered / all sold out (apart from one open box return). Also, the price of new Giga-G-OC seems to have moved up a bit, but of course is still sold out.

All I need now are the peripherals (cables, block) to arrive...


----------



## LunaP

J7SC said:


> I also still have my 2x 2080 Ti system (w/ Threadripper 2950X, all w-cooled)...GPUs alone will swallow a combined 760 W max...4090 is an efficiency upgrade...yeah, that's why I did it.


Totally yeah...why would there be any other reason 



Demonkevy666 said:


> Did you find any newer ray-tracing games that worked with them?
> I play some old games too, and a few new ones here and there, just not at launch. I've played Control, The Medium, and Deliver Us the Moon. The Medium's dual reality is brutal on GPUs, with ray tracing running on both sides of the split. It claims to support SLI/mGPU too.


A few, especially when DLSS was the big new thing: SOTR, and I forget some of the others. I mainly play MMOs, but I keep a few games around to show off when guests come over asking questions (if they see the room). Shame VR can't, though it's understandable why; it could, but devs aren't going to put that amount of time in lol.



changboy said:


> I just check at cablemod and for a 4x 8pins to 12 pin its around 70 USD so it can cost me 100$ CAD for this cable ??? ***
> 
> I bought a complete RED set cable for my EVGA 1600P 2 for around 125$ !


***? I ordered a Corsair one from them and it was $48 after the $15 shipping fee. Are they all not the same cost? Or did you do the custom one and select 1000mm for the length (that came out to $64 for me w/ shipping)?


----------



## changboy

LunaP said:


> Totally yeah...why would there be any other reason
> 
> 
> 
> A few especially when the whole DLSS was a big thing, SOTR, and for got some of the others, I mainly play MMO's but have a few games + like to showoff others when guests come over asking questions ( if they see the room ) shame VR can't but understandable as to why. I mean it can but dev's aren't gonna put that amount of time in lol.
> 
> 
> 
> ***? I ordered a corsiar one from them and it was 48$ after the 15$ shipping fee, are they all not the same cost? Or did u do the custom and select 1000mm for length ( that came out to 64$ for me w/ shipping )


I chose the Pro one with the red cable, 600mm I think, plus a plastic bracket to seat the cable; the price is 69.40 USD with standard flat-rate shipping (the cable alone is 54.40). I haven't ordered it yet, I'm thinking about it hehe. I find it pricey for one cable.

I was looking at games and saw UNCHARTED™: Legacy of Thieves Collection, which seems great. I like the Tomb Raider games; do you think this one is a good buy?


----------



## LunaP

changboy said:


> I chose the Pro one with red cable, and 600mm i think, add a plastic thing to sit the cable and price is 69.40 usd with standard flatrate shipping the cable only is 54.40. I didn't order it yet, i think about it hehe. I found its pricey for 1 cable.
> 
> I looking at game and saw UNCHARTED™: Legacy of Thieves Collection seam great, i like tomb raider game, do you think this one is a good one to buy ?


Ah, that's the custom one then; yeah, those are a little more pricey. The standard 23-inch one is $29 though, plus $15 shipping, and they have them premade. Unsure on the game though.


----------



## Roacoe717

What is the process if your connector melts and the port on the card is damaged? Will the AIB send you a new one or just refund you?


----------



## Htp187

There has to be a way to beat these scalpers at their own game; someone needs to come up with something that loses these people money, big time.


----------



## Sayenah

Htp187 said:


> There has to be a way to burn these scalpers at their own game, someone needs to come up with something to lose these people money big time


Buy on eBay and return. Rinse and repeat. But… why would one even do that? I kind of feel pity for the poor bastards. They don’t even make that much and spend so much time doing the juggle. It is all so classless and stupid.


----------



## Htp187

Sayenah said:


> Buy on eBay and return. Rinse and repeat. But… why would one even do that? I kind of feel pity for the poor bastards. They don’t even make that much and spend so much time doing the juggle. It is all so classless and stupid.


It sucks because on launch day I woke up 3 hours early and had everything set up ready to go; I had it in my cart, and got the error while it was processing.

I really wanted to get one so I could review it on my YT channel, but no luck at all. And then I see it on websites like eBay and Kijiji, which pisses me off; every time, people like this ruin it for everyone, same as with the PS5 stock.


----------



## AvengedRobix

In my case the latest driver does not resolve the G-Sync problem with the LG CX/C1/C2 🙁


----------



## changboy

I bought Uncharted; on my Steam it was $59.99, maybe before tax, but I paid $42 tax in CAD. I will tell you if I like it hehe; it seems to have some nice graphics.


----------



## Arizor

changboy said:


> I bought Uncharted, in my steam was 59.99$ maybe before tax but i paid 42$ tax in cad, i will tell you if i like it hehe, seam have some nice graphics.


very pretty games, I just don’t really enjoy the mechanics, but it’ll definitely make you appreciate the 4090!

loving Plague Tale at the moment, such a gorgeous showcase of DLSS3.


----------



## Madness11

Guys, is Fasgear a good company for cables? I want to purchase a 3x8-pin to 16-pin one, but I haven't heard of this company before.


----------



## Nico67

vigorito said:


> im seeing 450 info not 500


It only does 90% PL by default in AB, which is really weird, but 90% of 500 W is 450 W. Not sure why they didn't just make 100% = 450 W and 133% = 600 W, unless they are trying to suggest 500 W is safe on the Strix; in which case I don't know why it doesn't default to 100%.
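The percent-to-watts mapping being puzzled over here is just scaling off a reference TDP; a minimal sketch, assuming the 500 W and 450 W reference values quoted in this post (they are not read from any BIOS):

```python
# Illustrative sketch of how an Afterburner power-limit percentage maps to
# watts, using the reference TDPs discussed above (assumed values).
def slider_to_watts(reference_tdp_w: float, slider_pct: float) -> float:
    """Convert a power-limit slider percentage into a wattage cap."""
    return reference_tdp_w * slider_pct / 100.0

print(slider_to_watts(500, 90))   # 450.0 -> the odd 90% default on a 500 W base
print(slider_to_watts(450, 133))  # 598.5 -> roughly the 600 W ceiling on a 450 W base
```

So the same 450 W cap can show up as either "90%" or "100%" depending on which reference TDP the vendor baked into the BIOS.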


----------



## jl434

Received my Fasgear 12VHPWR cable (3x8 to 16) today. So far so good with my EVGA PSU, and I can run the 133% power limit on my Gigabyte Gaming OC.

Still no update for my CableMod order -_-


----------



## LunaP

CableMod won't respond over the weekend since they're closed; today is the earliest they'll start responding to emails. Also, happy Halloween all!


----------



## vigorito

I ran OCCT on the GPU, and the Strix pulled 596 W in HWiNFO, so it's okay.


----------



## long2905

BeZol said:


> I am not following the thread any more, so if it has already been, then sorry in advance.
> 
> Inno3D X3 OC RTX 4090
> 3x8pin adapter
> TDP 450W
> 
> It's basically one of the cheapest models you can get.
> 
> With 4x8pin adapter nothing changes.
> 
> BUT
> if you put a Gigabyte Gaming OC BIOS (450W +33% = 600W TDP limit) on the GPU,
> coupled with the 4x8pin adapter I managed to unlock the graphics card.
> 
> Interestingly I had voltage limit at 1.1V at my Port Royal run (28.928 score) with peak TDP of 550W.
> 
> View attachment 2579265


Thanks for this. You give me hope for the iChill Frostbite version, whenever that becomes available at a more reasonable price.


----------



## mattskiiau

Madness11 said:


> Guys , fasgear it's good company for cables ?? I want purchase one 3x8 to 16 , but not hear about this company


A few people have ordered these and are using them. Some reviews on Amazon show users having had them installed for 2+ weeks with no issues.

Who knows... is the answer.


----------



## sneida

long2905 said:


> thanks for this. you give me hope for the ichill frostbite version when that become available and at a more reasonable price


review: 



-> 450W limit.


----------



## KedarWolf

AvengedRobix said:


> In amy case the latest driver not resolv the g-sync problema.with lg CX-C1-C2 🙁











[Official] NVIDIA RTX 4090 Owner's Club


Hi guys. Is it worth buying a strix 4090 instead of a tuf? I saw that the pcb are very similar. Tuf and strix do not reach 600w? If you have to ask, then the Strix isn’t worth it, at all. You will always be better served by a Suprim x or an FE. Frankly, with the Suprim X available, I don’t...




www.overclock.net


----------



## BigMack70

So has anyone on this forum yet had issues with the cable melting? 

I haven't gotten my IR gun out yet but I've done long gaming sessions in games like cyberpunk and the connector is just a bit warm to the touch during that time. 

I don't doubt that there is a legitimate problem somewhere, but I also wonder how much user error might be going on. People can be real foolish/careless with how they install and use hardware.


----------



## matrix17

Hello guys, one question please. Anyone with a Suprim X that has flashed it to the Strix BIOS for the 600 W PL? The BIOS is working fine, but for the life of me I can't figure out how to force the RGB to work. The LEDs start, but after the initial boot sequence they stop. They need MSI Center, and MSI Center obviously doesn't recognise my card after flashing it.

I've also tried flashing the 600 W BIOS from the Liquid X, but for some reason I'm capped at 560 W; the RGB is working fine though.


----------



## RaMsiTo

matrix17 said:


> Hello guys,one question please.Anyone with a suprim x that has flashed it to strix for 600w pl? Bios is working fine but for the life of me i cant figure how to force rgb to work...They start but after the initial boot sequence they stop...It needs msi center, but mci center obviously doesnt recognise my card after flashing it.
> 
> Ive also tried flashing the 600 bios from liquid x but for some reason im capped at 560w,rgb is working fine though..


Don't waste time, I tried several 600w bios and in all of them the rgb turns off when entering windows.


----------



## matrix17

RaMsiTo said:


> Don't waste time, I tried several 600w bios and in all of them the rgb turns off when entering windows.


Have you tried the Liquid X one as well? AB lets me go to 125%, but 100% is 450 W, so it allows up to 560 W.


----------



## long2905

sneida said:


> review:
> 
> 
> 
> -> 450W limit.


I'm hoping someone will purchase this version and flash a 600 W vBIOS onto it to test it out. It wasn't my first rodeo with Inno3D.


----------



## dboom

Afterburner OC scan. TUF


----------



## AvengedRobix

KedarWolf said:


> [Official] NVIDIA RTX 4090 Owner's Club
> 
> 
> Hi guys. Is it worth buying a strix 4090 instead of a tuf? I saw that the pcb are very similar. Tuf and strix do not reach 600w? If you have to ask, then the Strix isn’t worth it, at all. You will always be better served by a Suprim x or an FE. Frankly, with the Suprim X available, I don’t...
> 
> 
> 
> 
> www.overclock.net


Enable only vrr not g-sync


----------



## warrior-kid

A 'stupid' question: I see a lot of 100% voltage-boost settings in screenshots here. Is this something you are confident using for benchmarks? How does it play together with the power limit? Does it result in artifacts? Does it cause benchmarks to abort if the only change is the voltage increase? Do you use it for games or machine learning? Any words of wisdom at all?


----------



## TheNaitsyrk

For anyone interested:

In UK in stock:



https://www.cclonline.com/cp-8920284-corsair-600w-pcie-5-0-12vhpwr-type-4-psu-power-cable-397544/


----------



## Panchovix

dboom said:


> Afterburner OC scan. TUF


Pretty nice, I got 2925 MHz on my TUF non-OC lol.

At 3000 MHz I crash in most games, so welp, lost the silicon lottery this time.

EDIT: Went into curve editing + more voltage and it seems to hold 3045-3060 MHz fine, but damn, at 1.1 V it may be too much lol.


----------



## mirkendargen

BigMack70 said:


> So has anyone on this forum yet had issues with the cable melting?
> 
> I haven't gotten my IR gun out yet but I've done long gaming sessions in games like cyberpunk and the connector is just a bit warm to the touch during that time.
> 
> I don't doubt that there is a legitimate problem somewhere, but I also wonder how much user error might be going on. People can be real foolish/careless with how they install and use hardware.


One so far, using the adapter (like all the other reports on reddit).


----------



## jcde7ago

Guy on Reddit reviewing his Fasgear cable:


__
https://www.reddit.com/r/nvidia/comments/yifwjk

I posted on there that I received the "SIRLYR" no-name brand cable as well, and it looks to be the EXACT same as the Fasgear that he received, down to the box, lol:



http://imgur.com/a/qiUugsO


I've not used the Sirlyr cable yet since my CableMod arrives Thursday or Friday and I can wait, but there are folks that have had success with it supposedly:

__
https://www.reddit.com/r/nvidia/comments/yhhkwo/_/iugsemb


----------



## vdbmario

Which 4090's have the best results in regards to coil whine? I hear FE and Gigabyte are the best options. Anyone can confirm for me please?


----------



## Nizzen

vdbmario said:


> Which 4090's have the best results in regards to coil whine? I hear FE and Gigabyte are the best options. Anyone can confirm for me please?


random


----------



## ALSTER868

TheNaitsyrk said:


> In UK in stock:


Can't decide which cable to get. Would this Corsair one be a better choice than others like CableMod? I like the Corsair more for having only 2x 8-pin and thus being more compact. And will it work fine with a Seasonic PX-1000 PSU? I only see compatible Corsair PSUs in its description.


----------



## motivman

if you guys had a choice, which 4090 would you get between these two: 4090 FE or gigabyte gaming OC?


----------



## Panchovix

motivman said:


> if you guys had a choice, which 4090 would you get between these two: 4090 FE or gigabyte gaming OC?


I would get the FE because:

- Better GPU Power Stages/VRAM Power Stages, VRMs and such

- Cheaper


----------



## vigorito

Seasonic OEM cable (Prime PX-1600): loose wires after only 3 insertion cycles. I swapped it for the second one from the box; nothing melted, but it still doesn't look good.


----------



## cheddardonkey

Panchovix said:


> Pretty nice, I got 2925Mhz on my TUF non-OC lol
> 
> At 3000Mhz I crash at most games, so welp, lost the silicon lottery this time
> 
> EDIT: Went into curve editing + more voltage and it seems to hold 3045-3060Mhz fine, but damn, at 1.1V it may be too much lol


 What did you change on the curve?


----------



## Krzych04650

vdbmario said:


> Which 4090's have the best results in regards to coil whine? I hear FE and Gigabyte are the best options. Anyone can confirm for me please?


Don't have any yet but from what I've gathered Asus and MSI are getting pretty much universally negative feedback with really bad whine. This is also in line with previous generations. Gigabyte on the other hand is almost universally good, for Gaming models at least, for Windforce I've seen some complaints. Inno3D and Zotac are getting pretty good reports too. Don't know about Palit/Gainward or Colorful.

Out of the ones with less whine, Gigabyte is the only one that has a native 600 W BIOS and has some chance of getting wide waterblock compatibility, so something like the Gaming OC seems like the best option overall.


----------



## Sheyster

Krzych04650 said:


> Don't have any yet but from what I've gathered Asus and MSI are getting pretty much universally negative feedback with really bad whine. Gigabyte on the other hand is almost universally good, for Gaming models at least, for Windforce I've seen some complaints. Inno3D and Zotac are getting pretty good reports too. Don't know about Palit/Gainward or Colorful.
> 
> Out of the ones with less whine, Gigabyte is the only one that has a native 600 W BIOS and has some chance of getting wide waterblock compatibility, so something like the Gaming OC seems like the best option overall.


I run an open bench and there is no coil whine from the Gigabyte OC that I can hear... (getting old so who knows, high frequency hearing loss happens first!)

I'm actually very happy with this card and I'm keeping it. I would have bought an FE or a Strix over this card if one was available, but now that I have it I see no reason to switch later.


----------



## jcde7ago

Krzych04650 said:


> Don't have any yet but from what I've gathered Asus and MSI are getting pretty much universally negative feedback with really bad whine. Gigabyte on the other hand is almost universally good, for Gaming models at least, for Windforce I've seen some complaints. Inno3D and Zotac are getting pretty good reports too. Don't know about Palit/Gainward or Colorful.
> 
> Out of the ones with less whine Gigabyte is the only that has native 600W BIOS and has come chance of getting wide waterblock compatibility, so something like Gaming OC seems like the best option overall.


No coil whine whatsoever from my MSI Suprim Liquid X, running the 530 W BIOS undervolted to 0.950 V @ 2760 MHz with a +1000 mem OC.


----------



## LunaP

motivman said:


> if you guys had a choice, which 4090 would you get between these two: 4090 FE or gigabyte gaming OC?


My initial intent was to get the FE like most were saying, but after a couple weeks of no luck I ended up nabbing the Gigabyte Gaming OC at BB, despite them also having an FE there. I'd heard a lot of good things about it and haven't had any issues; it OCs insanely well (YMMV obviously) and has zero coil whine on mine, but as others stated, that stuff is RNG.




Panchovix said:


> I would get the FE because:
> 
> - Better GPU Power Stages/VRAM Power Stages, VRMs and such
> 
> - Cheaper


Are there any sources for the power stages/VRMs etc. on this? I remember some cards were actually worse, but the Gaming OC + Asus ones should be similar if not slightly improved per what's been said here. Could be wrong, but I definitely want to confirm so we don't give out misinfo. Obviously the FE has some of the widest block support at least, with Asus/Gigabyte coming in 2nd/3rd.


----------



## Krzych04650

jcde7ago said:


> No coil whine whatsoever from my MSI Suprim Liquid X, running the 530 W BIOS undervolted to 0.950 V @ 2760 MHz with a +1000 mem OC.


This card has a lot of complaints and even videos about massive coil whine, so it is much more likely that you are just not sensitive to it. Which is great; I cannot overstate how much easier my life would be if I were not so sensitive to such sounds. It is not just GPUs; almost all electronics have it to varying degrees.


----------



## Xavier233

I have 2 Gigabyte 4090 Gaming OCs. Both have coil whine above 300-400 FPS, though the specific power supply used changes the loudness of the coil whine. I will rule out my electrical wiring by running the entire PC from a large battery supply unit rated at 2500 watts.

If it's not the electrical wiring/grid, then it's the GPU and PSU that both contribute to coil whine. From my testing, once the GPU has coil whine, it can cause the PSU to also make noises (electrical buzz, coil whine, or similar), which just makes things worse.


----------



## jcde7ago

Krzych04650 said:


> This card has a lot of complaints and even videos about massive coil whine, so it is much more likely that you are just not sensitive to it. Which is great; I cannot overstate how much easier my life would be if I were not so sensitive to such sounds. It is not just GPUs; almost all electronics have it to varying degrees.


It's possible, i've seen some of those videos as well. But i've also chatted with folks who have a Suprim Liquid X and they can't hear any relevant coil whine either. 

It comes down to sensitivity + whatever else sounds one has going on in their case/room I suppose. I have a closed side panel (011 Dynamic XL) and all 10 case fans run at 1k RPM, and i've tried and tried but don't notice any coil whine from my card.


----------



## mirkendargen

motivman said:


> if you guys had a choice, which 4090 would you get between these two: 4090 FE or gigabyte gaming OC?


Really depends on what matters to you:

If you're keeping the stock cooler and space is an issue, FE. If space isn't an issue, Gigabyte will cool somewhat better.

If you're getting a block, FE has 4+ blocks available/on the way, Gigabyte has 1 available and 2 on the way.

If you're potentially interested in some XOC BIOS down the road, it's more likely to work on Gigabyte than FE.

If you're worried about who will take better care of you if your connector melts, my gut says Nvidia but I think the jury is still out on that.

Clocks/power/etc, all red herrings that are luck of the draw and/or won't matter.


----------



## Panchovix

cheddardonkey said:


> What did you change on the curve?


Basically I used the auto scanner at 100% voltage as the base of a curve in MSI AB (up to 1.1 V), then edited some dots at given frequencies off that base (for example, setting 2925 MHz on some 1.1 V dots and below, then going down a bin, etc.).

Then I saved that and applied an offset (holding Alt and moving a dot up or down moves the whole curve) until the difference at any given dot matched (3060 MHz - 2925 MHz = 135 MHz offset), and tested. (Had to do this since you can't see the curve above 3000 MHz lol.)
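The whole-curve shift described above boils down to simple arithmetic; a minimal sketch using the clocks from this post (the variable names are made up for illustration):

```python
# Sketch of the offset arithmetic described above. The idea: take the clock the
# auto scanner assigned at your chosen voltage point, decide the clock you
# actually want there, and apply the difference as a whole-curve offset in AB.
scanner_clock_mhz = 2925   # what the auto scanner gave at the 1.1 V dot
target_clock_mhz = 3060    # what the card seems able to hold at that voltage

offset_mhz = target_clock_mhz - scanner_clock_mhz
print(offset_mhz)  # 135 -> the offset applied to the entire curve (Alt+drag)
```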


----------



## Panchovix

LunaP said:


> Are there any sources for the power stages/vrm's etc on this? I remember some cards were actually worse but the gaming OC + asus ones should be similar if not slightly improved on per what's been said here, could be wrong but definitely wanna confirm so we don't give out misinfo. Obv FE has some of the highest Block support at least, w/ Asus/Gigabyte coming in 2nd/3rd


I got it from the first post (VRM/power-stage screenshots for the FE, Gigabyte, and ASUS, for example).


----------



## Madness11

jcde7ago said:


> No coil whine whatsoever from my MSI Suprim Liquid X, running 530w bios undervolted to .950mv @2760mhz and +1000mem OC.


Can you make a small video of the coil whine? I want to hear how it sounds. I got the same card and have some coil whine (not huge).


----------



## GAN77

vigorito said:


> Seasonic OEM cable (Prime PX 1600) loose wires only 3 cycle ,i swap it for a second one from the box,nothing melted but still doesent look good.
> 
> View attachment 2579803


Pins not fully inserted?


----------



## J7SC

vdbmario said:


> Which 4090's have the best results in regards to coil whine? I hear FE and Gigabyte are the best options. Anyone can confirm for me please?


I ended up with the Gigabyte Gaming OC as the only and cheapest option in my market region - and am thrilled with it....that is apart from general 'the cable / dongle' concern 4090 owners seem to have. It oc's really well, no coil whine, dual vbios switch (up to 600W). Unless you are running it full blast (1.1v, 133%, 500W+), temps are also very good (the Gigabyte Gaming OC seems to have the best VRAM temps in various tests).


----------



## jcde7ago

LunaP said:


> Are there any sources for the power stages/vrm's etc on this?


Here's another table made by kirkle8 on Reddit:


----------



## Stumpfl111

long2905 said:


> im hoping someone would purchase this version and flash 600w vbios onto it to test it out. wasnt my first rodeo with inno3d


I did exactly that...

I've had this card (Inno3D X3) with an Alphacool block for one week now (I fitted the block myself because I don't like the Inno3D branding on the OEM version), running the 600 W BIOS with no problems so far.

All good, no issues. Temps are cool and I am happy.


----------



## vigorito

GAN77 said:


> Pins not fully inserted?


They did, but after only several cycles they no longer insert well. Anyway, I'm waiting for the cable from moddi.


----------



## Xavier233

Xavier233 said:


> I have 2 Gigabyte 4090 Gaming OCs. Both have coil whine above 300-400 FPS, though the specific power supply used changes the loudness of the coil whine. I will rule out my electrical wiring by running the entire PC from a large battery supply unit rated at 2500 watts.
> 
> If it's not the electrical wiring/grid, then it's the GPU and PSU that both contribute to coil whine. From my testing, once the GPU has coil whine, it can cause the PSU to also make noises (electrical buzz, coil whine, or similar), which just makes things worse.


Just tested with the whole PC on AC power, same results, both cards do coil whine.


----------



## mirkendargen

vigorito said:


> they did,but after only several cycle they are not insert good,anyway im wainting for cable from moddi.


Yeah, this terrifies me, because given the size and placement of the connector, I'm betting most people insert it by pushing on the wires and disconnect it by pulling on the wires, which is asking for this to happen over time. It's really hard to be sure you get it in snugly while only pushing on the sides of the connector.


----------



## yzonker

Got my miracle run in Speedway. This is so screwed up.

www.3dmark.com

Hit run again immediately, back to below 11000.

www.3dmark.com

This is my best legit run (IMO):

www.3dmark.com

I'd delete it, but everyone at the top must be doing the same thing. No way you can get well north of 11k legit.


----------



## GosuPl

Result: www.3dmark.com

Result: www.3dmark.com

Comparison: RTX 3090 STRIX PT 480 and OC +100/+1000 vs RTX 4090 STRIX PT 600 (real 480-500 W under load) and +100/+1000.

Holy ****, 85%


----------



## motivman

how are you guys getting such great overclocking cards? I just got an FE, and guess what? it overclocks just about the same as my gaming trio... SMH. If anyone wants an FE, let me know... haha.


----------



## Arizor

motivman said:


> how are you guys getting such great overclocking cards? I just got an FE, and guess what? it overclocks just about the same as my gaming trio... SMH. If anyone wants an FE, let me know... haha.


What clocks are you getting? How are you setting up Afterburner?


----------



## Panchovix

motivman said:


> how are you guys getting such great overclocking cards? I just got an FE, and guess what? it overclocks just about the same as my gaming trio... SMH. If anyone wants an FE, let me know... haha.


Which clocks? Wondering since, I find getting 2 silicon losers in a row on a 4090 seems really hard to achieve to be honest lol


----------



## Recipe7

Panchovix said:


> Basically used auto scanner with 100% voltage to use as a base of a curve on MSI AB (up to 1.1V), then edited some dots given a frequency with that base (for example 2925Mhz to some 1.1V dots and below, and then went down a bin, etc)
> 
> Then, saved that, and applied an offset (holding alt and moving a dot up or down, it moves all the curve basically) until the difference of any given dot was (3060Mhz - 2925Mhz = 135Mhz offset), and tested. (Had to do this, since can't see the curve above 3000Mhz lol)


You can edit the curve in msiafterburner.cfg from 3000 to 3500
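Panchovix's uniform-offset step quoted above can be sketched in a few lines. This is purely illustrative: the curve is modeled as a plain dict, and `apply_offset` is a hypothetical helper, not anything Afterburner actually exposes.

```python
# Illustrative model of the "shift the whole curve" step described above.
# A V/F curve is represented as {voltage_mV: frequency_MHz}; in reality the
# curve is dragged in Afterburner's GUI (hold Alt and move a point).

def apply_offset(curve, offset_mhz):
    """Shift every point of the curve up by the same frequency offset."""
    return {mv: mhz + offset_mhz for mv, mhz in curve.items()}

# Hypothetical base curve from the auto scanner, pinned at 2925 MHz for 1.1 V
base = {1050: 2880, 1075: 2910, 1100: 2925}

# Target 3060 MHz at 1.1 V, so the offset is 3060 - 2925 = 135 MHz
offset = 3060 - base[1100]
tuned = apply_offset(base, offset)
print(tuned)  # every point moved up by 135 MHz
```

The point of the uniform offset is that you only have to find one stable target (the 1.1V bin) and the rest of the curve follows.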


----------



## motivman

Arizor said:


> What clocks are you getting? How are you setting up Afterburner?


On both cards, max core clock is around 3000 MHz, max memory around +1100 in Afterburner... anything more crashes. This is the best Port Royal score I can get on my FE:

I scored 27 674 in Port Royal
Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
www.3dmark.com

HOW DID I GET UNLUCKY TWICE??? LOL


----------



## mirkendargen

motivman said:


> On both cards, max core clock is around 3000 MHz, max memory around +1100 in Afterburner... anything more crashes. This is the best Port Royal score I can get on my FE:
> 
> I scored 27 674 in Port Royal
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> www.3dmark.com
> 
> HOW DID I GET UNLUCKY TWICE??? LOL


Is that at 1.05V or 1.1V? 3ghz at 1.05V isn't so bad, you'd probably get like 3045-3060 at 1.1V which is what most people are comparing.


----------



## motivman

mirkendargen said:


> Is that at 1.05V or 1.1V? 3ghz at 1.05V isn't so bad, you'd probably get like 3045-3060 at 1.1V which is what most people are comparing.


afterburner shows 1.1v


----------



## motivman

Recipe7 said:


> You can edit the curve in msiafterburner.cfg from 3000 to 3500


ok, so I did that, what do I do next? lol


----------



## yzonker

Ok, at least I broke 11k legit.

(3DMark result link: www.3dmark.com)


----------



## yzonker

Was running when I posted. Finally a bit more.

I scored 11 034 in Speed Way
Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
www.3dmark.com


----------



## bmagnien

yzonker said:


> Ok, at least I broke 11k legit.
> 
> (3DMark result link: www.3dmark.com)


Nice! I took a break from score runs until I get the block and a safe cable. Got just under you in the first couple days. Should be over 11k with temps controlled. You're using a block, right? Anything you can share re: temp deltas, boost behavior before and after?

I scored 10 979 in Speed Way
AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
www.3dmark.com

Also - what's a 'legit' run? If it artifacted once but you missed seeing it, is that still legit?


----------



## yzonker

bmagnien said:


> Nice! I took a break from score runs until I get the block and a safe cable. Got just under you in the first couple days. Should be over 11k with temps controlled. You're using a block, right? Anything you can share re: temp deltas, boost behavior before and after?
> 
> I scored 10 979 in Speed Way
> AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> www.3dmark.com
> 
> Also - what's a 'legit' run? If it artifacted once but you missed seeing it, is that still legit?


Yea, I dunno. Seems like the score always jumps a bunch if it artifacts, though, and I ran several in a row in the 11015-11020 range. Bumped the mem and massaged the curve a little more to hit the 11034. So I don't think it was.

No still on air. The block is sitting here taunting me though.


----------



## bmagnien

yzonker said:


> Yea I dunno. Seems like the score always jumps a bunch though if it artifacts. I ran several in a row in the 11015-11020 range though. Bumped the mem and massaged the curve a little more to hit the 11034. So I don't think it was.
> 
> No still on air. The block is sitting here taunting me though.


What’re you waiting for???


----------



## Panchovix

yzonker said:


> Was running when I posted. Finally a bit more.
> 
> I scored 11 034 in Speed Way
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> www.3dmark.com


+1800Mhz on the VRAM? Damn that's a lucky memory there, good job man!


----------



## J7SC

Happy Halloween...



Spoiler: Trick or Treat


----------



## yzonker

bmagnien said:


> What’re you waiting for???


Waiting to see how connectorgate settles out and also I spent all of my free time last weekend tuning that new 13900k setup. RAM tuning sucks.


----------



## yzonker

Panchovix said:


> +1800Mhz on the VRAM? Damn that's a lucky memory there, good job man!


Yea about time I got the good mem. My 3090 was terrible. Wouldn't go much more than +1000 with the chiller. My 3080ti was a little better at +1300-1400.

It's interesting though. I'll say it again, those 2gb chips do not like to be cool. If I run fans on auto on the card, I can do +1850-1875. At 100% fans I have to drop back to +1800-1825. Still worried how that will work with 3C water.


----------



## Panchovix

Recipe7 said:


> You can edit the curve in msiafterburner.cfg from 3000 to 3500


Sorry to bother, but which option do I have to change? I searched the .cfg and couldn't find anything related to frequency, or to 3000.


----------



## yzonker

Panchovix said:


> Sorry to bother, but which option do I have to change? I searched the .cfg and couldn't find anything related to frequency, or to 3000.











[Official] NVIDIA RTX 4090 Owner's Club


...Weird that Port Royal is weird on your system :D ...That's the one bench I never have any problems with. Highest I subbed was 28443, though there was a low 29k one but it seemed to have some (minor) artifacts near the end, so not subbed. ...I haven't benched in almost a week because a.)...




www.overclock.net


----------



## motivman

Panchovix said:


> Sorry to bother, but which option do I have to change? I searched the .cfg and couldn't find anything related to frequency, or to 3000.


Ctrl+F, search for 3000, second result, change it to 3500, take ownership of the file (full control), save, and restart Afterburner. Done. Didn't help with either of my crap cards though, SMH.
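If you'd rather script that edit than do it by hand, here's a rough sketch. The install path is an assumption about a default install; adjust it for your system, and keep the backup it writes.

```python
# Hedged sketch of the tweak above: bump the second occurrence of "3000"
# in MSIAfterburner.cfg to "3500" so the curve editor can show higher clocks.
# The path below is an assumption about a default install location.
from pathlib import Path

def raise_curve_limit(text, old="3000", new="3500"):
    """Replace only the second occurrence of `old`, as the post describes."""
    first = text.find(old)
    second = text.find(old, first + len(old)) if first != -1 else -1
    if second == -1:
        return text  # pattern not found twice; leave the text untouched
    return text[:second] + new + text[second + len(old):]

cfg = Path(r"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.cfg")
if cfg.exists():
    original = cfg.read_text()
    cfg.with_suffix(".cfg.bak").write_text(original)  # backup first
    cfg.write_text(raise_curve_limit(original))
```

You'd still need write permission on the file (the "take ownership" step) and a restart of Afterburner afterwards.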


----------



## motivman

I still cannot believe I got an FE and it overclocks just as badly as my Trio... like, what are the odds?? lol


----------



## KedarWolf

I scored 11 174 in Speed Way
AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com


My best Speed Way.


----------



## mattskiiau

Received the Fasgear extender. It's working.

Noticed I'm not going over 450W in 3DMark even though my Trio X has a 480W maximum. Could be the cable, or maybe I don't understand exactly how the BIOS works.

Anyway, glad to get rid of the Nvidia cable.


----------



## arvinz

Does anyone see a direct product page for the Cablemod right angle adapter? Supposed to go live at 9pm EST.


----------



## Spiriva

arvinz said:


> Does anyone see a direct product page for the Cablemod right angle adapter? Supposed to go live at 9pm EST.


Nope, was supposed to be 02.00 CET but there's nothing on their page atm.


----------



## Xavier233

The crucial info is: 1) their site cannot handle the traffic, and 2) they are out of stock.


----------



## KedarWolf

Spiriva said:


> Nope, was supposed to be 02.00 CET but there's nothing on their page atm.
> 
> 
> 
> 
> 
> 
> 
> 
> View attachment 2579910


The timer for the 90-degree adapter, for me in Toronto, Canada, says 54 minutes until it goes on sale.






CableMod 12VHPWR Angled Adapter – CableMod Global Store
store.cablemod.com


----------



## Spiriva

Xavier233 said:


> the crucial info is: 1) their site cannot handle the traffic, and 2) they are out of stock


The new timer on their page says 03.00 CET. Awesome to sit here like a fool in the middle of the night waiting to order an adapter.


----------



## ZealotKi11er

Got my cable. Pretty happy with it.


----------



## KedarWolf

ZealotKi11er said:


> Got my cable. Pretty happy with it.
> 
> View attachment 2579913


Looks far too bent to be safe.


----------



## ZealotKi11er

KedarWolf said:


> Looks far too bent to be safe.


Actually, that is without any force, just gravity.


----------



## MrTOOSHORT

KedarWolf said:


> Looks far too bent to be safe.


That's direct wire-to-pin. Safe to bend like a regular 8-pin on the older cards.


----------



## inedenimadam

Hey guys. How hot does naked GDDR6X get with no heatsink? I have a few of the old EK supremacy EVO blocks kicking around and was debating throwing one on to play around with.



mattskiiau said:


> Received the Fasgear extender. It's working.
> 
> Noticed I'm not going over 450w in 3dmark even when my Trio X has a 480w maximum. Could be the cable or I don't understand how the BIOS works exactly.
> 
> Anyway, glad to get rid of the Nvidia cable.


The Gaming X Trio has a 480W BIOS; it doesn't seem to matter whether it's 3x8 or 4x8. I had no issues flashing a 600W BIOS on mine, but it really didn't gain much other than not power-throttling down a few bins under heavy load. Basically no gaming performance gains, but a few more Port Royal points.


----------



## mirkendargen

inedenimadam said:


> Hey guys. How hot does naked GDDR6X get with no heatsink? I have a few of the old EK supremacy EVO blocks kicking around and was debating throwing one on to play around with.


Hot. The stuff uses like 35W or so on its own, so you're looking at >5W/IC. It absolutely needs cooling.

A backplate couldn't adequately cool the rear-side GDDR6X on 3090s.


----------



## inedenimadam

mirkendargen said:


> Hot, the stuff uses like 35w or something like that on it's own, so you're looking at >5w/IC. Absolutely needs cooling.


I think I have a stash of copper heat sinks and some thermal grease. May try it anyway...or maybe just wait for a proper block.


----------



## Xavier233

"CableMode to the rescue"... Yet they are out of stock completely 

They have detailed adapter options and info on the website, yet they say "we think it would be better at this time to first confirm compatibilities with more graphics cards on the market, bla bla bla" 

Who approved this Bullshit?


----------



## mirkendargen

inedenimadam said:


> I think I have a stash of copper heat sinks and some thermal grease. May try it anyway...or maybe just wait for a proper block.


That plus a fan pointed at it would probably be good enough, and you can check the temp in hwinfo.




Xavier233 said:


> "CableMode to the rescue"... Yet they are out of stock completely
> 
> They have detailed adapter options and info on the website, yet they say "we think it would be better at this time to first confirm compatibilities with more graphics cards on the market, bla bla bla"
> 
> Who approved this Bullshit?


They're terrified one of their cables will melt and Nvidia will throw them under the bus and want to see what Nvidia does first.


----------



## Xavier233

mirkendargen said:


> That plus a fan pointed at it would probably be good enough, and you can check the temp in hwinfo.
> 
> 
> 
> 
> They're terrified one of their cables will melt and Nvidia will throw them under the bus and want to see what Nvidia does first.


Its just a 90 degree adapter...


----------



## mirkendargen

Xavier233 said:


> Its just a 90 degree adapter...


Which is some connectors soldered on to something stiff.

Kinda like the adapter that's melting lol.

Will it fail? Probably not, although there will be manufacturing defects in anything; that's totally normal. But if even one of those melts and it gets posted on Reddit right now, even if the design and normal quality are fine, they'd get trashed.


----------



## Panchovix

Some benches with my TUF 4090. I got a below-average or average sample (based on some benches here in the thread); my max core clock is 3075-3090 MHz at 1.1V, though in Port Royal my absolute max is 3045 MHz, can't go above that. At 1.05V the clocks can barely reach 3 GHz.

Also, on VRAM, my max is +1150 or +1200; more than that and it crashes (depending on the game), though I haven't seen a crash at +1150 MHz.

Can't complain though, even at stock it is like 2x faster than my overclocked 3080 at 4K lol, but I REALLY need to change my 5800X, it is bottlenecking me in some games 



Spoiler: Benches with images and links


I scored 15 046 in Time Spy Extreme
AMD Ryzen 7 5800X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com

I scored 29 196 in Time Spy
AMD Ryzen 7 5800X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com

I scored 27 832 in Port Royal
AMD Ryzen 7 5800X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com

I scored 52 397 in Fire Strike
AMD Ryzen 7 5800X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com

I scored 25 728 in Fire Strike Ultra
AMD Ryzen 7 5800X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com

I scored 10 539 in Speed Way
AMD Ryzen 7 5800X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com


----------



## Xavier233

mirkendargen said:


> Which is some connectors soldered on to something stiff.
> 
> Kinda like the adapter that's melting lol.
> 
> Will it fail? Probably not although there will be manufacturing defects in anything, that's totally normal. If they have even one of those melt and it get posted on Reddit right now, even if the design and normal quality is fine, they'd get trashed.


Sure, but if you do need more time designing or improving your adapters, you don't say so 4 minutes before the big sale actually happens. You had plenty of time to do that in the last 2 weeks.


----------



## J7SC

yzonker said:


> Yea about time I got the good mem. My 3090 was terrible. Wouldn't go much more than +1000 with the chiller. My 3080ti was a little better at +1300-1400.
> 
> It's interesting though. I'll say it again, those 2gb chips do not like to be cool. If I run fans on auto on the card, I can do +1850-1875. At 100% fans I have to drop back to +1800-1825. Still worried how that will work with 3C water.


VRAM on my card needs further testing once the connector and w-block have arrived and are installed. For some weird reason, I can run 1650+ error-free, with efficiency gains and thorough testing (all 3DMs, Cyberpunk '77, other) as long as I'm not maxing the GPU w/ 133% full GPU OC - then best VRAM falls to mid- +1400s. I think it is heat (as in memory controller, rather than VRAM itself) as my card has a hotspot issue at full tilt. W-cooling it should help to get to the bottom of it, noting that I'm not running a chiller w/ 4C temps. That said, the 2 GB GDDR6X chips not liking to get too cold has been shown a couple of times with the sub-zero folks.


----------



## Arizor

J7SC said:


> VRAM on my card needs further testing once the connector and w-block have arrived and are installed. For some weird reason, I can run 1650+ error-free, with efficiency gains and thorough testing (all 3DMs, Cyberpunk '77, other) as long as I'm not maxing the GPU w/ 133% full GPU OC - then best VRAM falls to mid- +1400s. I think it is heat (as in memory controller, rather than VRAM itself) as my card has a hotspot issue at full tilt. W-cooling it should help to get to the bottom of it, noting that I'm not running a chiller w/ 4C temps. That said, the 2 GB GDDR6X chips not liking to get too cold has been shown a couple of times with the sub-zero folks.


Yep, finding I can comfortably run my VRAM at 1700+ on my 0.95V undervolt (though I stay at 1600 for my own comfort). Interested to see how high you can go under water, and how the RAM reacts.


----------



## mattskiiau

Playing MW2, I'm getting some 600W spikes from NVVDD output power.
Is that normal, or is something wrong? Anyone care to educate me on what that is? Would appreciate it!


----------



## cheddardonkey

Panchovix said:


> Basically used auto scanner with 100% voltage to use as a base of a curve on MSI AB (up to 1.1V), then edited some dots given a frequency with that base (for example 2925Mhz to some 1.1V dots and below, and then went down a bin, etc)
> 
> Then, saved that, and applied an offset (holding alt and moving a dot up or down, it moves all the curve basically) until the difference of any given dot was (3060Mhz - 2925Mhz = 135Mhz offset), and tested. (Had to do this, since can't see the curve above 3000Mhz lol)


Thanks! FYI, you can change the max value on the curve graph (to 3500, for instance) via the .cfg file in the Afterburner folder.


----------



## Arizor

Does anyone have an upload of the TUF BIOS? 

Swear I backed mine up a week or two ago before switching to STRIX, but alas cannot find it... Not that it's too much of an issue, STRIX works fine, but still.


----------



## sniperpowa

Krzych04650 said:


> This card has a lot of complaints and even videos about massive coil whine, so it is much more likely that you are just no sensitive to it. Which is great, I cannot overstate how much easier my life would be if I was not so sensitive to such sounds. It is not just GPUs, almost all electronics have it to varying degrees.


My suprim liquid x has no coil whine. I’ve had cards that had bad coil whine. Last one was an msi 1080ti. Coil whine can happen with any gpu. Not every card is the same


----------



## AKBrian

sniperpowa said:


> My suprim liquid x has no coil whine. I’ve had cards that had bad coil whine. Last one was an msi 1080ti. Coil whine can happen with any gpu. Not every card is the same


No discernable coil whine on my MSI Suprim X (Air), either, thankfully.


----------



## Azazil1190

Arizor said:


> Does anyone have an upload of the TUF BIOS?
> 
> Swear I backed mine up a week or two ago before switching to STRIX, but alas cannot find it... Not that it's too much of an issue, STRIX works fine, but still.


Do you need the regular TUF or the TUF OC BIOS?
I have the TUF OC.


----------



## Arizor

Azazil1190 said:


> Do you need the regular tuf or the tuf oc bios?
> I have the tuf oc


TUF OC would work thanks mate!


----------



## Azazil1190

Arizor said:


> TUF OC would work thanks mate!


I will send it to you.


----------



## Arizor

Azazil1190 said:


> I will send you


appreciate it bud!


----------



## Azazil1190

How can I upload the BIOS?


----------



## TheNaitsyrk

ALSTER868 said:


> Can't decide which cable to get. Would this Corsair be a better choice than any other like Cablemod? I like Corsair more for having only 2x8 pins and thus being more compact. And.. will it work fine with a Seasonic PX-1000 PSU? I can see compatible Corsair PSUs in its description.


Only Gen 4 Corsair PSUs. If you plug it into another one, it will burn and melt.


----------



## pt0x-

Did some tinkering on my Strix OC a few days ago; just yeeted the sliders and this was the result.









I scored 10 825 in Speed Way
Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32510 MB, 64-bit Windows 11
www.3dmark.com

I scored 34 223 in Time Spy
Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32510 MB, 64-bit Windows 11
www.3dmark.com





Waiting to push harder; hopefully Watercool will release their block soon. And I need a new PSU as well (waiting for the Seasonic Vertex).
I'm running a Seasonic 850W Focus Plat, but with my system I don't want to add my Strix to that. So I'm running the Strix on a dedicated 600W PSU... I'm power-limited, so to speak. 
Good thing is, the 600W unit is a be quiet!, and I got the be quiet! 12VHPWR adapter, which is way better than the fiasco adapter we got with our cards.

To clarify, I run a 12900K OC'd, 2x D5 pumps, and 22 (yes, twenty-two) 120mm fans (2x 360 rads and a MO-RA3). I think for gaming the Seasonic should hold, but I'd rather not take any chances.


----------



## long2905

pt0x- said:


> Did some tinkering on my Strix OC a few days ago; just yeeted the sliders and this was the result.
> 
> I scored 10 825 in Speed Way
> Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32510 MB, 64-bit Windows 11
> www.3dmark.com
> 
> I scored 34 223 in Time Spy
> Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32510 MB, 64-bit Windows 11
> www.3dmark.com
> 
> Waiting to push harder; hopefully Watercool will release their block soon. And I need a new PSU as well (waiting for the Seasonic Vertex).
> I'm running a Seasonic 850W Focus Plat, but with my system I don't want to add my Strix to that. So I'm running the Strix on a dedicated 600W PSU... I'm power-limited, so to speak.
> Good thing is, the 600W unit is a be quiet!, and I got the be quiet! 12VHPWR adapter, which is way better than the fiasco adapter we got with our cards.
> 
> To clarify, I run a 12900K OC'd, 2x D5 pumps, and 22 (yes, twenty-two) 120mm fans (2x 360 rads and a MO-RA3). I think for gaming the Seasonic should hold, but I'd rather not take any chances.


Don't count on it. Expect a Heatkiller block for it to be at least a year out.


----------



## pt0x-

long2905 said:


> dont count on it. expect a heatkiler block for it at least a year out


I think by the end of 2022 they will have something. Follow this topic on a different forum for more info if you want:

WATERCOOL --> Produktinfo
www.hardwareluxx.de

Translate to English and look for posts from Rico[Watercool].


----------



## AvengedRobix

pt0x- said:


> Did some tinkering on my Strix OC a few days ago; just yeeted the sliders and this was the result.
> 
> I scored 10 825 in Speed Way
> Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32510 MB, 64-bit Windows 11
> www.3dmark.com
> 
> I scored 34 223 in Time Spy
> Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32510 MB, 64-bit Windows 11
> www.3dmark.com
> 
> Waiting to push harder; hopefully Watercool will release their block soon. And I need a new PSU as well (waiting for the Seasonic Vertex).
> I'm running a Seasonic 850W Focus Plat, but with my system I don't want to add my Strix to that. So I'm running the Strix on a dedicated 600W PSU... I'm power-limited, so to speak.
> Good thing is, the 600W unit is a be quiet!, and I got the be quiet! 12VHPWR adapter, which is way better than the fiasco adapter we got with our cards.
> 
> To clarify, I run a 12900K OC'd, 2x D5 pumps, and 22 (yes, twenty-two) 120mm fans (2x 360 rads and a MO-RA3). I think for gaming the Seasonic should hold, but I'd rather not take any chances.


Where did you get the be quiet! cable? I can't find it in stock.


----------



## EEE-RAY

Is anyone else having instability with modern warfare 2 on 526.47? I started having lots of artefacts and display driver crashes just on that game after I updated. I thought it was my overclock but I've done heaps of validation and found nothing. Previously I've gone 10+ hours on the game with same overclock with no issues. Now I crash every 10 minutes.


----------



## mattskiiau

EEE-RAY said:


> Is anyone else having instability with modern warfare 2 on 526.47? I started having lots of artefacts and display driver crashes just on that game after I updated. I thought it was my overclock but I've done heaps of validation and found nothing. Previously I've gone 10+ hours on the game with same overclock with no issues. Now I crash every 10 minutes.


Devs said not to use 526.47 afaik. I rolled back to 522.25, which was recommended, and haven't had any driver issues yet. Only getting the usual COD-related crashes now.


----------



## yzonker

J7SC said:


> VRAM on my card needs further testing once the connector and w-block have arrived and are installed. For some weird reason, I can run 1650+ error-free, with efficiency gains and thorough testing (all 3DMs, Cyberpunk '77, other) as long as I'm not maxing the GPU w/ 133% full GPU OC - then best VRAM falls to mid- +1400s. I think it is heat (as in memory controller, rather than VRAM itself) as my card has a hotspot issue at full tilt. W-cooling it should help to get to the bottom of it, noting that I'm not running a chiller w/ 4C temps. That said, the 2 GB GDDR6X chips not liking to get too cold has been shown a couple of times with the sub-zero folks.


That is interesting.  I haven't really done any testing at less than 133%, so I don't know if my card does that also. And yes I have watched Lummi's videos where he talks about the 2Gb chips falling off at low temps. I just didn't realize it started at such a high temp. I'd say around 30C-40C given that just 100% fan on an air cooler can hurt stability.


----------



## AvengedRobix

whoooooooah


----------



## dante`afk

motivman said:


> how are you guys getting such great overclocking cards? I just got an FE, and guess what? it overclocks just about the same as my gaming trio... SMH. If anyone wants an FE, let me know... haha.


All cards this gen OC more or less the same.

Binning this time around is obsolete and a waste of time.


----------



## kairi_zeroblade

AvengedRobix said:


> whoooooooah
> 
> View attachment 2579985


literally "when 1 12pin is not enough"


----------



## marc0053

Thanks for linking amazon link to the Fasgear cable @MrTOOSHORT 
Got it today and seems great so far.

I noticed the little red LED by the power connector on my ASUS TUF OC 4090 stays off now when the PC is shut down; it was lit with the original 12-pin connector.
I peeled back the mesh around the 4 strands, and the original adapter was marked 300V; no melting occurred even after about 1-2 hours of 3DMark Port Royal.

Originally it had loud coil whine, but after a few sessions of Port Royal it seems to have gone away.

Hall of Fame shows that I'm from the USA for some reason and not Canada.
NVIDIA GeForce RTX 4090 video card benchmark result - Intel Core i9-12900KF Processor,ASRock Z690 AQUA OC (3dmark.com) 

Some pics:


----------



## Panchovix

marc0053 said:


> Hall of Fame shows that I'm from the USA for some reason and not Canada.
> NVIDIA GeForce RTX 4090 video card benchmark result - Intel Core i9-12900KF Processor,ASRock Z690 AQUA OC (3dmark.com)


I'm noticing on the scores above 28K that the VRAM overclock is the most important. Pretty nice score there, mate.
Bummer that I can't go higher than 1200, and I doubt WC or anything would help with that lol


----------



## Nizzen

Panchovix said:


> I'm noticing on the scores above 28K, that the VRAM overclock is the most important. Pretty nice score there mate.
> Bummer that I can't go higher than 1200, and I'm doubting if WC or anything would help to that lol


Points over 28k in Port Royal scale very strangely. This is due to the VRAM OC bug. If you look at the GPU frequency and memory frequency, the scaling does not make sense. Hope they fix the bug.
Example:
GPU core clock 3105 MHz / GPU memory clock 1550 MHz = 29650 points.
Clearly it's wrong......


----------



## yzonker

Nizzen said:


> Points over 28k in Port Royal scales very strange. This is due to the vram OC bug. If you look at the gpu frequency and memory frequency, the scaling does not make sense. Hope they are fixing the bug.
> Example:
> GPU core clock 3105 Mhz / GPU memory clock 1550 Mhz = 29650 points.
> Clearly it's wrong......


Yes, an air cooled card tops out in the 28500-29000 range. Anything over 29k that isn't at least sub ambient is probably a bugged run.


----------



## KedarWolf

yzonker said:


> Yes, an air cooled card tops out in the 28500-29000 range. Anything over 29k that isn't at least sub ambient is probably a bugged run.


Yeah, if the memory artifacts and the run still finishes, it can be a bugged high score.

Only runs that have zero artifacts are actually valid.


----------



## Xavier233

A windowed panel on a case is important for a 4090: you want to know if the GPU is on fire, smoking, or smelling bad.


----------



## motivman

dante`afk said:


> all cards this gen OC more or less the same.
> 
> binning this time around is obsolete and a waste of time


Correct, based on my experience, but seeing people in forums get cards capable of over 3100 MHz and over +1500 on the RAM clearly shows that there are really good overclocking cards out there. I am just unlucky that both my cards were duds. I ended up keeping the FE and returning the Trio; the FE ran cooler in my system.


----------



## Jacinto1023

My 4090 is running at PCIe 4.0 x8 because of my SSD in M.2 slot 1, but it's a Gen 4x4 SSD.

Should I move the SSD to the chipset slot to get x16 on the GPU?

It's kind of a hassle; I need to remove the GPU and such to get access to the M.2 covers. Is there a difference in performance between x8 and x16 4.0? I think I saw a benchmark showing it was about a 1-2% loss of performance.
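For context on the x8 vs x16 question, the raw link bandwidth is easy to estimate: PCIe 4.0 runs 16 GT/s per lane with 128b/130b encoding, so halving the lanes halves the bandwidth, though games rarely saturate either. A quick back-of-the-envelope sketch:

```python
# Rough one-direction PCIe 4.0 link bandwidth from first principles.

def pcie4_bandwidth_gbs(lanes):
    """Raw bandwidth in GB/s: 16 GT/s per lane, 128b/130b encoding."""
    transfers_per_s = 16e9          # 16 GT/s per lane
    efficiency = 128 / 130          # 128b/130b line encoding
    return transfers_per_s * efficiency / 8 * lanes / 1e9

print(f"x8  ~ {pcie4_bandwidth_gbs(8):.1f} GB/s")   # ~15.8 GB/s
print(f"x16 ~ {pcie4_bandwidth_gbs(16):.1f} GB/s")  # ~31.5 GB/s
```

With games only using a fraction of that bandwidth in steady state, the roughly 1-2% hit at x8 that benchmarks show is what you'd expect.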


----------



## J7SC

Jacinto1023 said:


> My 4090 is running in PCIE 4.0 X8 because of my SSD in m.2 slot 1 but its a gen 4x4 ssd.
> 
> Should i move the SSD to the chipset slot to get x16 on the GPU?
> 
> Its kind of a hassle, need to remove the gpu and stuff to get access to the m.2 covers. Is there a difference in performance between x8 and x16 4.0? i think i saw a benchmark that it was like a 1-2% loss of performance.







---


AvengedRobix said:


> whoooooooah


Yup - Galax HoF PCBs are super yummy - and you get _two_ dongles to play with. Superflower 2kW PSU recommended? I am also still checking various 4090 PCB pics, since DerBauer pointed out that some of the Gigabyte cards actually have the onboard circuits for NVLink SLI... wondering whether those will be for the 4090 Ti, since the rumour is that they're using the same PCBs (but w/ more powerful phases). The HoF doesn't seem to have those NVLink circuits, though... interesting.


----------



## Recipe7

motivman said:


> ok, so I did that, what do I do next? lol


That allows you to select clocks past 3000 and up to 3500. Does the curve editor reflect this change?
Edit: never mind, you got it going.

My FE isn't really great either. I undervolted to 1.0V at 2880; it runs great and cool there.


----------



## Sheyster

mattskiiau said:


> Devs said not to use 526.47 afaik. I rolled back to 522.25 which was recommended and haven't had any issues yet. Only getting more COD related crashes now.


The 522 is the driver to use for now. The newer 526 is awful; stay away.


----------



## Jacinto1023

Damn, is 1-3% worth the hassle of reinstalling the GPU and SSDs lol


----------



## Nizzen

Sheyster said:


> The 522 is the driver to use for now. the newer 526 is awful, stay away.


I have no problem with other games, so it ain't awful....


----------



## mirkendargen

Nizzen said:


> I hav no problem with other games, so it ain't awful....


526 makes my second monitor frequently not work when just sitting on the desktop, it's pretty awful lol.


----------



## Sheyster

Jacinto1023 said:


> My 4090 is running in PCIE 4.0 X8 because of my SSD in m.2 slot 1 but its a gen 4x4 ssd.
> 
> Should i move the SSD to the chipset slot to get x16 on the GPU?
> 
> Its kind of a hassle, need to remove the gpu and stuff to get access to the m.2 covers. Is there a difference in performance between x8 and x16 4.0? i think i saw a benchmark that it was like a 1-2% loss of performance.


If it's a GEN4 M.2 you're using in slot 1, you should still have 16 CPU lanes left for the graphics card. Mine seems to work fine with this same config (Z790-F). I show PCI-E 4.0 x16 in GPU-Z.

Have a look at the PCI-E BIOS settings, it's possible the AUTO setting isn't working the way it should.


----------



## Sheyster

Xavier233 said:


> Windowed panel on cases are important for a 4090: you would want to know if the GPU is on fire, smoking, or smelling bad


Open bench is ideal, you can hear and smell every snap, crackle and pop.


----------



## Jacinto1023

Sheyster said:


> If it's a GEN4 M.2 you're using in slot 1, you should still have 16 CPU lanes left for the graphics card. Mine seems to work fine with this same config (Z790-F). I show PCI-E 4.0 x16 in GPU-Z.
> 
> Have a look at the PCI-E BIOS settings, it's possible the AUTO setting isn't working the way it should.


Yeah i just tried and still the same. I'll just have to move the SSD eventually. Once my Sleeved 4090 cable comes in.


----------



## Xavier233

Sheyster said:


> Open bench is ideal, you can hear and smell every snap, crackle and pop.


But it's also the noisiest


----------



## Sheyster

Jacinto1023 said:


> Yeah i just tried and still the same. I'll just have to move the SSD eventually. Once my Sleeved 4090 cable comes in.


Before you move it I would suggest checking with ASUS support and/or their forum. Do you have the latest BIOS installed? It should work, there are 20 CPU PCI lanes...


----------



## Nizzen

Xavier233 said:


> But its also the most noisy


Why? Everything watercooled with a MoRa + a big enough PSU = silent heaven, max overclocked


----------



## Sheyster

marc0053 said:


> View attachment 2579997


You need a bigger monitor...


----------



## Jacinto1023

Sheyster said:


> Before you move it I would suggest checking with ASUS support and/or their forum. Do you have the latest BIOS installed? It should work, there are 20 CPU PCI lanes...


Yeah, I'll try contacting them. I'm on the latest BIOS.

Says this on the Z690 motherboard specs section:
_When M.2_1 is occupied with SSD, PCIEX16(G5) will run x8 mode only. _


----------



## Nizzen

Sheyster said:


> You need a bigger monitor...
> 
> View attachment 2580075


"monitor"


----------



## Sheyster

Nizzen said:


> "monitor"


Hey, don't knock it until you've tried it!  Once you go OLED you don't go back.

I may actually downsize to an ASUS PG42UQ OLED. My only reservation is the anti-glare coating they use on the panel. TFT Central compared them side-by-side in a video and you really lose the color pop with the ASUS coating, sadly.


----------



## Sheyster

Jacinto1023 said:


> Yeah i'll try contacting them. I'm on the latest BIOS
> 
> Says this on the Z690 motherboard specs section:
> _When M.2_1 is occupied with SSD, PCIEX16(G5) will run x8 mode only. _


Interesting, this could possibly be another difference between Z690 and Z790. Does the Z690 support the yet-to-be-released Gen5 M.2 or not?


----------



## Jacinto1023

Sheyster said:


> Interesting, this could possibly be another difference between Z690 and Z790. Does the Z690 support the yet to be released GEN5 M.2 or not?


Looks like it does:
M.2_1 slot (Key M), type 2242/2260/2280/22110 (supports PCIe 5.0 x4 mode) 

But apparently it can't run that and a GPU at 4.0 x16 at the same time.


----------



## Nizzen

Sheyster said:


> Hey, don't knock it until you've tried it!  Once you go OLED you don't go back.
> 
> I may actually downsize to an ASUS PG42UQ OLED. My only reservation is the anti-glare coating they use on the panel. TFT Central compared them side-by-side in a video and you really lose the color pop with the ASUS coating, sadly.


Too big and too close for my use. My wife uses the TV; I'm using monitors 
I use 1x 34" 3440x1440 and 1x 4K 24" on my main PC. 

A 34" UW OLED over any OLED in the world that's over 38", in my opinion.


----------



## J7SC

Sheyster said:


> Hey, don't knock it until you've tried it!  Once you go OLED you don't go back.
> 
> I may actually downsize to an ASUS PG42UQ OLED. My only reservation is the anti-glare coating they use on the panel. TFT Central compared them side-by-side in a video and you really lose the color pop with the ASUS coating, sadly.





Nizzen said:


> Too big and too close for my use. My wife use TV, I'm using monitors
> I use 1x 34" 3440x1440 and 1x 4k 24" on my main pc.
> 
> 34" UW Oled over any OLED in the world if the size is over 38", in my opinion.


...funny how one gets used to things - prior to the 48 inch OLED, my biggest monitor was/is a 40 inch Philips VA workstation panel - and I thought, wow, that's big. Now that it sits right next to the 48 inch OLED (along w/ some 27 inch Samsungs), they all look so 'small' compared to the LG...


----------



## Madness11

Guys, where can I check for VBIOS updates from MSI? Please tell me


----------



## HAQ0901

Madness11 said:


> Guys where I can check vbios updates from MSI ?? Please tell me


Use MSI Dragon Center


----------



## Panchovix

Is there a way to backup the VBIOS? I'm trying to backup my VBIOS (TUF 4090 NON-OC) and GPU-Z doesn't let me lol

Mostly to update to the newer VBIOS while having a backup in case it goes wrong; also to submit it to the VBIOS database.


----------



## Azazil1190

Panchovix said:


> Is there a way to backup the VBIOS? I'm trying to backup my VBIOS (TUF 4090 NON-OC) and GPU-Z doesn't let me lol
> 
> Mostly, to update to the newer VBIOS and having the backup in the case it goes wrong; also to submit it to the VBIOS database.


Yes nvflash
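For reference, the usual workflow is roughly the sketch below - the filenames are hypothetical and exact flags can vary between nvflash builds, so check `nvflash --help` for your version first:

```shell
# Run from an elevated prompt; "tuf4090_backup.rom" is just an example name.
nvflash --list                     # confirm which adapter nvflash sees
nvflash --save tuf4090_backup.rom  # dump the current VBIOS to a file
# Later, to flash a different VBIOS (cross-flashing may need these):
# nvflash --protectoff
# nvflash -6 new_vbios.rom         # -6 overrides the subsystem ID mismatch check
```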


----------



## Panchovix

Azazil1190 said:


> Yes nvflash


Thanks, I forgot it lol


----------



## bigfootnz

edit


----------



## dr/owned

Back from vacation and I think I missed a pretty wild ride with my 4090 sitting in the box. Could use some help getting the TLDR version:

Is the exploding connector only an adapter problem? I have the CableMod native cable doodad so I shouldn't need the adapter.
Any blocks yet for the MSI cards besides EK? Quick look seems like no.
Best BIOS right now to flash for 600W? I saw a couple posts about the fan control going wonky.


----------



## J7SC

...seeing all these vbios posts, has anyone updated their RTX 4090 Gigabyte Gaming OC vbios? Any difference in performance before / after? Their release notes only say something about compatibility


----------



## Panchovix

dr/owned said:


> Is the exploding connector only an adapter problem? I have the CableMod native cable doodad so I shouldn't need the adapter.


It seems to be an only adapter problem, so in your case you won't have issues.



dr/owned said:


> Best BIOS right now to flash for 600W? I saw a couple posts about the fan control going wonky.


What's your 4090 model? If you have a Gaming X Trio/Gaming Trio, and your CableMod is 4x8-pin (or native 12VHPWR), the Gaming OC VBIOS or the ASUS TUF one should probably be fine (both support 600W; IDK which of them has faster fan speeds tho, on my TUF my max is 3000 RPM I think). Can't comment on the Suprim X Liquid/Suprim Liquid since I'm not sure how the fans would work on an air-cooled card.



J7SC said:


> ...seeing all these vbios posts, has anyone updated their RTX 4090 Gigabyte Gaming OC vbios ? Any difference in performance before / after ? Their release notes only say s.th. about compatibility


Wondering the same on my TUF, gonna flash tonight probably and see if it makes any difference (the only changelog entry is "improved compatibility")


----------



## KedarWolf

Galax showcases their RTX 4090 Hall of Fame PCB - Dual 12VHPWR connectors!


Galax's RTX 4090 Hall of Fame has a huge 28+4+4 VRM




www.overclock3d.net


----------



## BigMack70

Has anyone had a crash where their PC goes to black screen and spins the GPU fans to 100%? Had this happen twice. Happened to me once while gaming, once while watching youtube. Freaked me out at first, but temps were fine both times, and the connector wasn't particularly warm or hot to the touch, so I'm guessing driver bug/crash?


----------



## Arizor

BigMack70 said:


> Has anyone had a crash where their PC goes to black screen and spins the GPU fans to 100%? Had this happen twice. Happened to me once while gaming, once while watching youtube. Freaked me out at first, but temps were fine both times, and the connector wasn't particularly warm or hot to the touch, so I'm guessing driver bug/crash?


Any info in Windows' Event Viewer?


----------



## ArcticZero

BigMack70 said:


> Has anyone had a crash where their PC goes to black screen and spins the GPU fans to 100%? Had this happen twice. Happened to me once while gaming, once while watching youtube. Freaked me out at first, but temps were fine both times, and the connector wasn't particularly warm or hot to the touch, so I'm guessing driver bug/crash?


These (black screen + fans 100+%) have always been hardware/power related for me on my 3090. It was usually either a loose power connector, some silver paint traces I had left on the PCB (long story), or a burnt cable somewhere. But yeah Event Viewer might have something, though usually driver crashes for me were either momentary black screens, BSOD's, or hard restarts.


----------



## Benni231990

My Suprim X arrived today from Alphacool, currently with the stock air cooler 









No coil whine, nothing 

and it boosts to 3000-3015 MHz at stock voltage (1.050 V)


----------



## Sheyster

J7SC said:


> ...seeing all these vbios posts, has anyone updated their RTX 4090 Gigabyte Gaming OC vbios ? Any difference in performance before / after ? Their release notes only say s.th. about compatibility


No difference. Make sure you back up the original VBIOS before you flash it, just in case.


----------



## narrn2761

Damn, that's nice.

Just got my TUF OC.

Played a few games.

I got to 2760 MHz at stock speeds/voltage.
MSI AB reports 405W power draw.
Max 60°C.

Is this good?

Few more questions.
How do I get higher clock speeds if I have more thermal head room?

Do I just use MSI AB, and just add frequency until my games/stress test crashes, then back down frequency?

Do I need to add voltage? How do I do that? Can I still use MSI AB or should I use an Asus program since I have the TUF OC?


----------



## Arizor

narrn2761 said:


> Damm thats nice.
> 
> Just got my TUF OC.
> 
> Played a few games.
> 
> I got to 2760MHz stock speeds/voltage.
> MSI AB report 405W power draw.
> Max 60°C.
> 
> Is this good?
> 
> Few more questions.
> How do I get higher clock speeds if I have more thermal head room?
> 
> Do I just use MSI AB, and just add frequency until my games/stress test crashes, then back down frequency?
> 
> Do I need to add voltage? How do I do that? Can I still use MSI AB or should I use an Asus program since I have the TUF OC?


Yep that stock performance is what most should expect.

Use MSI AB, but make sure you get the latest beta as it has voltage control for 4090.

Massively diminishing returns adding voltage, but you can. Usual cycle of increasing frequencies and testing for crashes, yes.

With maxed power limit and voltage, you should expect to hit somewhere around 3 GHz on the core, and most are getting +1500 on the memory, some of us a nice chunk more, some less.


----------



## Arizor

But for what it's worth, I think the ticket here is undervolting. On my TUF I've got it at 0.95 V @ 2760 MHz, +1600 mem. Runs cool with lower power consumption.
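A rough way to see why the undervolt pays off: to first order, dynamic GPU power scales with V² × f. The baseline point below (1.05 V / 2820 MHz) is just an illustrative stock-ish operating point, not a measured one:

```python
# First-order estimate of undervolting savings: dynamic power ~ V^2 * f.
# All numbers are illustrative, not measurements from a real card.
def relative_power(v: float, f: float, v0: float = 1.05, f0: float = 2820.0) -> float:
    """Power at (v, f) relative to the baseline (v0, f0), assuming P ~ V^2 * f."""
    return (v / v0) ** 2 * (f / f0)

# 0.95 V @ 2760 MHz vs the assumed 1.05 V @ 2820 MHz baseline:
ratio = relative_power(0.95, 2760.0)
print(f"~{(1 - ratio) * 100:.0f}% lower dynamic power")  # -> ~20% lower dynamic power
```

Real cards won't match this exactly (static leakage, workload variation), but it shows why a small voltage drop cuts power much faster than it cuts clocks.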


----------



## EEE-RAY

I am crashing on Windows at idle (just browsing). Hard lock requiring a restart. System info states a display driver crash. Is my card dying?


----------



## BigMack70

ArcticZero said:


> These (black screen + fans 100+%) have always been hardware/power related for me on my 3090. It was usually either a loose power connector, some silver paint traces I had left on the PCB (long story), or a burnt cable somewhere. But yeah Event Viewer might have something, though usually driver crashes for me were either momentary black screens, BSOD's, or hard restarts.


Best I can figure is it was these two nvlddmkm errors from event viewer.... This was this afternoon while watching a youtube video. Not sure if relevant, but it was after an hour of playing modern warfare 2 which FILLED my event viewer with DistributedCOM warnings about application-specific permission settings.

First time it happened was several days ago and I can't identify that far back which error it was.



Spoiler






> The description for Event ID 0 from source nvlddmkm cannot be found. Either the component that raises this event is not installed on your local computer or the installation is corrupted. You can install or repair the component on the local computer.
> 
> If the event originated on another computer, the display information had to be saved with the event.
> 
> The following information was included with the event:
> 
> \Device\000000eb
> UCodeReset TDR occurred on GPUID:100
> 
> The message resource is present but the message was not found in the message table





> The description for Event ID 14 from source nvlddmkm cannot be found. Either the component that raises this event is not installed on your local computer or the installation is corrupted. You can install or repair the component on the local computer.
> 
> If the event originated on another computer, the display information had to be saved with the event.
> 
> The following information was included with the event:
> 
> \Device\000000eb
> ffffffff(ffffffff) ffffffff ffffffff
> 
> The message resource is present but the message was not found in the message table


----------



## Pollomir

EEE-RAY said:


> I am crashing on windows at idle (just browsing). Hard lock requiring restart. system info states a display driver crash. Is my card dying?


Are you using a riser cable? I had several problems with mine, freezes and crashes similar to yours. Event viewer also pointed to the driver. Was the riser cable.


----------



## EEE-RAY

No riser cable.

My crash was somewhat similar to BigMack's - black screen while watching YouTube. The computer then restarted by itself about 2 minutes later. The only difference is the fans did not spin to 100%.


----------



## Arizor

BigMack70 said:


> Best I can figure is it was these two nvlddmkm errors from event viewer.... This was this afternoon while watching a youtube video. Not sure if relevant, but it was after an hour of playing modern warfare 2 which FILLED my event viewer with DistributedCOM warnings about application-specific permission settings.
> 
> First time it happened was several days ago and I can't identify that far back which error it was.


Ah, the likeliest culprit for this error, in my experience (assuming you've already DDU'd and installed fresh drivers), is unstable overclocks. What's your settings in Afterburner? Have you tried an undervolt?


----------



## Zero989

Hello rich people. My Suprim Liquid X has arrived.


----------



## BigMack70

Arizor said:


> Ah, the likeliest culprit for this error, in my experience (assuming you've already DDU'd and installed fresh drivers), is unstable overclocks. What's your settings in Afterburner? Have you tried an undervolt?


I was at +180/+1500 in afterburner, which should be on the edge of stable (+200/+1500 crashes in some games). Probably a bad OC then. First time I've seen a bad OC display this behavior at desktop just doing video decode. 

I've not yet messed with UV with this card.


----------



## EEE-RAY

Damn. Well, I used the Afterburner auto-overclock utility. I worked my memory up manually, incrementing 100 MHz at a time and running PR. I had positive increments in score with each VRAM step up to +1750. I stupidly tried +2000 and that hard locked.

I decided to settle for +1500 as a safety margin, since I knew I could pass PR there with a positive score increment. 

Based on this, what is the most likely culprit: VRAM or core? Could I have permanently damaged something with my +2000 mem causing the hard lock?


----------



## Panchovix

BigMack70 said:


> I was at +180/+1500 in afterburner, which should be on the edge of stable (+200/+1500 crashes in some games). Probably a bad OC then. First time I've seen a bad OC display this behavior at desktop just doing video decode.
> 
> I've not yet messed with UV with this card.


From what I've tested, unstable VRAM overclocks on the 4090 crash my PC entirely lol, meanwhile core overclocks just close the app that is rendering with a DX/Vulkan/OGL error.

BTW, updated my TUF VBIOS, don't see much difference, managed to backup the Performance VBIOS, was trying to backup the Quiet one, but the VBIOS update tool somehow updates both VBIOS at the same time, so now I have newer Performance/Quiet VBIOS.


----------



## EEE-RAY

Damn, sounds like an unstable memory overclock for me then.

But it was doing so bloody well, and no crashes anywhere except that one time on the desktop >.<

Is there a program to validate memory overclocks independent of core?


----------



## Arizor

BigMack70 said:


> I was at +180/+1500 in afterburner, which should be on the edge of stable (+200/+1500 crashes in some games). Probably a bad OC then. First time I've seen a bad OC display this behavior at desktop just doing video decode.
> 
> I've not yet messed with UV with this card.


Yep that will do it. From my own experimenting, it's more likely the VRAM OC that is the issue, much more likely to cause that kind of hard lock / restart than core.



EEE-RAY said:


> Damn. Well I used the afterburner auto overclock utility. I clocked my memory to +1500 manually by incrementing 100Mhz at a time and running PR. I had positive increments in score with each VRAM up to +1750. I stupidly tried +2000 and that hard locked.
> 
> I decided to settle for 1500 for safety margin as I know I could pass PR with positive score increment.
> 
> Based on this what is the most likely culprit - VRAM or core? Could I have permanently damaged something with my +2000 mem causing the hard lock.


As mentioned, more likely to be the VRAM. Unlikely you caused any permanent damage, it would be much more conspicuous, fingers crossed... But er, yeah, don't try ludicrous overclocks in the future


----------



## warbucks

Zero989 said:


> Hello rich people. My Suprim Liquid X has arrived.


Nice! I picked up mine today as well.


----------



## Zero989

warbucks said:


> Nice! I picked up mine today as well.


What are you doing for the 12+4PIN, using the stock?


----------



## ArcticZero

BigMack70 said:


> Best I can figure is it was these two nvlddmkm errors from event viewer.... This was this afternoon while watching a youtube video. Not sure if relevant, but it was after an hour of playing modern warfare 2 which FILLED my event viewer with DistributedCOM warnings about application-specific permission settings.
> 
> First time it happened was several days ago and I can't identify that far back which error it was.


Yeah TDR errors point to an unstable OC/UV if either are applied whether manually or by custom BIOS. Might have found your culprit then. If you were running stock I'd be more worried since that's usually a faulty GPU, assuming drivers and stuff are properly installed.


----------



## EEE-RAY

What's a TDR error?


----------



## Arizor

EEE-RAY said:


> Whats a TDR error?


Timeout Detection and Recovery: Windows resets the display driver when the GPU stops responding for too long.
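To add some detail: TDR is Windows' watchdog that resets the display driver when the GPU doesn't respond within a timeout (2 seconds by default). The timeout is adjustable through the documented `TdrDelay` registry value; treat it as a diagnostic knob only, since raising it just masks an unstable OC rather than fixing it:

```reg
Windows Registry Editor Version 5.00

; Raise the GPU watchdog timeout from the default 2 s to 10 s (reboot to apply).
; Diagnostic use only - it hides TDR symptoms, it does not fix instability.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\GraphicsDrivers]
"TdrDelay"=dword:0000000a
```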


----------



## EEE-RAY

Is there much gain in performance in real games (not 3DMark benches) with a mem OC? It didn't do too much in previous generations, and if it's just one or two frames maybe I won't bother fine-tuning and will just slap on an arbitrary +1000 MHz or something


----------



## yzonker

Yea, mine blanked out yesterday when I was running Speedway with a ragged edge mem OC. Had to hold the power button down to turn it off. I didn't think anything about it. I've crashed my 3090 hard like that too with a mem OC. Nothing new.


----------



## Arizor

EEE-RAY said:


> Is there much gain in performance in real games (not 3dmark benches) with mem OC? It didn't do too much in previous generations and if its just one or two frames maybe I will just not bother fine tuning and just lap an arbitery +1000mhz or something


I think @yzonker posted stats earlier on this, but mem OC gives about 2-3% performance.


----------



## yzonker

EEE-RAY said:


> Is there much gain in performance in real games (not 3dmark benches) with mem OC? It didn't do too much in previous generations and if its just one or two frames maybe I will just not bother fine tuning and just lap an arbitery +1000mhz or something


Yes, there's more to gain from the mem OC than core this time around. I posted this once before, but when I tested it in CP2077, mem OC was 3.3%, core was only 1.7%. Nothing huge in either case, but the mem OC gained more.


----------



## EEE-RAY

Wow! That's quite significant. It used to not do a whole lot.


----------



## Panchovix

EEE-RAY said:


> Is there much gain in performance in real games (not 3dmark benches) with mem OC? It didn't do too much in previous generations and if its just one or two frames maybe I will just not bother fine tuning and just lap an arbitery +1000mhz or something


In the 4090's case it seems the VRAM overclock gives more performance.

On my best stable overclock, I get 10-11% more performance than stock, but 6-7% comes from the VRAM overclock alone; the rest is the core overclock. (Tested them separately, and then both at the same time.)

Seems that an undervolt + VRAM OC is the best way to go
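As a side note on the arithmetic: separate speedups like these compound multiplicatively rather than adding, which is why combined numbers can look slightly off from the sum. A quick sketch using rough percentages from this page as example inputs:

```python
# Independent speedups compound multiplicatively, not additively.
def combined_gain(*gains: float) -> float:
    """Combine fractional speedups (0.033 == 3.3%) into one fractional gain."""
    total = 1.0
    for g in gains:
        total *= 1.0 + g
    return total - 1.0

# Example: ~3.3% from a mem OC plus ~1.7% from a core OC:
print(f"{combined_gain(0.033, 0.017) * 100:.1f}%")  # -> 5.1%
```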


----------



## Arizor

Panchovix said:


> On the 4090 case it seems the VRAM overclocks gives more performance
> 
> On my best stable overclock, I get 10-11% more performance than stock, but 6-7% comes by VRAM overclock alone, the rest is core overclock. (Tested them separately, and then both at the same time)
> 
> Seems that an undervolt + vram oc is the best way to go


Yep this is my finding too.


----------



## EEE-RAY

So how do you guys test RAM stability? No game crashes, and score increments held up to +1750, but then it randomly hard-crashed out of the blue playing a YouTube video on the desktop at +1500. Obviously I should downclock, but how should I go about determining how far down to go, aside from crossing my fingers and watching for crashes that occur very rarely?


----------



## warbucks

Zero989 said:


> What are you doing for the 12+4PIN, using the stock?


Using the stock one until Corsair has their 600W cable for my PSU in stock.


----------



## Mad Pistol

EEE-RAY said:


> So how do you guys test ram stability? No game crashes and i increment performance up to 1750 but it randomly hard crashes out of the blue crashes playing a youtube on desktop at +1500. Obviously I should down clock but how should I got forward determining how far to go down - aside from crossing my fingers and watching for crashes that occur very rarely.


Port Royal. It will always sniff out an unstable overclock.

It feels like I can bench at 100-200 mhz higher memory on most benchmarks. Port Royal very consistently will rip apart an unstable overclock, and it usually does it quickly.


----------



## EEE-RAY

Performance on PR increments for me up to +1750. It crashes at +1800. I can run PR for hours at +1750. I decided to go +1500 for a safety margin, but I crashed even there. That is why I am having trouble deciding where to go from here to troubleshoot.


----------



## Mad Pistol

Mine is +1500 memory benchable, but I do get some odd crashes at that speed. I backed it down to +1200 and have been rock solid for 48 hours now.


----------



## ZealotKi11er

EEE-RAY said:


> Performance on PR increments for me up to to +1750. Crashes on +1800. I can PR for hours on +1750. I decided to go +1500 for safety margin but I crashed even there . That is why I am having trouble deciding where to go from here to troubleshoot.


Yep, different clients will stress the memory path differently. NVENC/NVDEC traffic goes directly to memory, while GFX goes through L2 before memory.


----------



## EEE-RAY

ZealotKi11er said:


> Yep, different clients will stress the memory path differently. NVEC goes directly to the memory while GFX is going to L2 before memory.


Thanks man. So based on the experience here, how far would you guys jump down from +1500 when it crashed on YouTube? I was thinking of setting it to +1375.


----------



## Arizor

EEE-RAY said:


> Thanks man. So based on experience here, how far would you guys jump down from +1500mhz when it crashed on youtube? I was thinking of setting it to +1375


It's just the process of learning your clocks. Make it +1400, and if you experience any crash in the next week, go down to 1350, and so on.
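That walk-down loop is simple enough to write out. A sketch of it, where `crashed_during_testing` is a stand-in for a week of real-world use (the +1350 threshold below is made up for the example; there's no software oracle for this):

```python
# Step-down search for a stable memory offset, as described above.
def settle_mem_offset(start: int, step: int, crashed_during_testing) -> int:
    """Walk the offset down by `step` until a full test period passes clean."""
    offset = start
    while offset > 0 and crashed_during_testing(offset):
        offset -= step
    return offset

# Example with a hypothetical card that is only stable at +1350 or below:
stable_at = settle_mem_offset(1500, 50, lambda o: o > 1350)
print(stable_at)  # -> 1350
```

In practice each "test" is days of gaming and desktop use, which is why people settle well below their benchmark-stable ceiling.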


----------



## ZealotKi11er

EEE-RAY said:


> Thanks man. So based on experience here, how far would you guys jump down from +1500mhz when it crashed on youtube? I was thinking of setting it to +1375


I am using +1375 right now with no issues, but generally speaking that is overkill too. Memory acts very differently these days. Unless there is a way to see errors, we are just guessing.


----------



## mattskiiau

Anyone have any idea why this reaches 600W? Bad? Doesn't matter? Not sure what it is, and Google isn't helping much.


----------



## EEE-RAY

ZealotKi11er said:


> I am using +1375MHz right now with no issue but generally speaking that is overkill too. Memory acts very different these days. Unless there is a way to see error we are just guessing.


I wonder why we don't have a proper memory overclock stress tester like we do with system RAM.


----------



## arieldeboca

Hi guys.
For owners of the 4090 Strix:
Could you pass me the measurement of the VGA to the 12VHPWR connector?
See the red line in the photo below.

Thanks.


----------



## dr/owned

Tore my MSI Trio apart. There is a header for the SDA and SCL pins on the 2891 controller, confirmed with multimeter. Elmor confirms EVC2 supports this now: https://elmorlabs.com/forum/topic/evc2-beta-software/?part=11#postid-724:










The VRAM pad thickness is 1.0mm if someone wants to go and repad this thing. Looks like the VRM uses 0.5mm or 0.7mm as well, but I haven't measured that yet. The VRM on the left side doesn't have active cooling...it uses the bracing frame for heatsinking. That's not good if you want to shove big power through this on air.

I can't find F2 on the PCB. F1 is for the PCIe connector (12A), F3 is at the main PWR connector (20A branch). Can't see any other fuses or silkscreen for F2.


----------



## Arizor

EEE-RAY said:


> I wonder why we don't have a proper memory overclock stress tester like we do with system RAM.


We do have a test for this, somewhere, damned if I can find it on my PC however... Someone in this thread will have it.


----------



## KingEngineRevUp

a_Criminai said:


> If anyone is using the $1599 Zotac 4090 trinity I can confirm the zotac 4090 extreme vbios works fine on it.
> 
> Also side note, apparently oc scanner is capped at +165 MHz. Nvidia should change that.


How has your zotac trinity been treating you?


----------



## Xavier233

warbucks said:


> Using the stock one until Corsair has their 600W cable for my PSU in stock.


Which Corsair PSU are u using?


----------



## yzonker

EEE-RAY said:


> I wonder why we don't have a proper memory overclock stress tester like we do with system RAM.











I've created an application for testing Video...


A tool written in vulkan compute to stress test video memory for correctness. Ideological successor to memtestCL. Releases at GitHub Open-source, prebuilt binaries available for windows and linux, supports aarch64 NVidia Jetson. Simple to use - no any parameters except optional card...




www.overclock.net


----------



## EEE-RAY



yzonker said:


> I've created an application for testing Video...
> 
> 
> A tool written in vulkan compute to stress test video memory for correctness. Ideological successor to memtestCL. Releases at GitHub Open-source, prebuilt binaries available for windows and linux, supports aarch64 NVidia Jetson. Simple to use - no any parameters except optional card...
> 
> 
> 
> 
> www.overclock.net


Wow!

Has anyone tried using it? What parameters do we need to set for it to work on a 4090?


----------



## Panchovix

EEE-RAY said:


> Wow br
> 
> wow!
> 
> Has anyone ever tried to use it? What parameters do we need to set to work on a 4090?


I just tried it for about 10 minutes with +1000 on my card; just executed the .exe via cmd, no issues.


----------



## biigshow666

EEE-RAY said:


> Is anyone else having instability with modern warfare 2 on 526.47? I started having lots of artefacts and display driver crashes just on that game after I updated. I thought it was my overclock but I've done heaps of validation and found nothing. Previously I've gone 10+ hours on the game with same overclock with no issues. Now I crash every 10 minutes.


I went through the campaign on the old drivers and had no issues. Updated to 526.47 on release and was fighting crashes all Friday/Saturday. Rolled back Sunday and no artifacts or broken shadows and absolutely no crashes all day long. 
Now I have to travel all week for work so no more gaming for a while. But that's what allows me to pay for the new toys.


----------



## narrn2761

Arizor said:


> Yep that stock performance is what most should expect.
> 
> Use MSI AB, but make sure you get the latest beta as it has voltage control for 4090.
> 
> Massively diminishing returns adding voltage, but you can. Usual cycle of increasing frequencies and testing for crashes, yes.
> 
> You should expect, maxxed power limit and voltage, to hit somewhere in the 3ghz on core, and most are getting +1500 on the memory, some of us a nice chunk more, some less.


OK. So you guys that can hit 3 GHz core, do you leave it at that setting and game with it, or is the 3 GHz just for benchmarking or bragging and then you turn it down?

And does that require more voltage for 3 GHz, yeah?

So I have read that core OC does not yield as much performance increase this gen but the mem OC does more.

Does that mean, I should focus on the mem OC first, find my ceiling, then go for core OC after, or should I still be doing core OC first?

Someone mentioned here they got 6-7% increase with mem OC, wondering how much they increased mem to? Was it 1500, or more?


----------



## Arizor

narrn2761 said:


> Ok. So you guys that can hit 3ghz core, do you leave it at that setting and game with it or is this 3ghz just for benchmarking or bragging then you turn it down?
> 
> And does that require more voltage for 3ghz yeah?
> 
> So I have read that core OC does not yield as much performance increase this gen but the mem OC does more.
> 
> Does that mean, I should focus on the mem OC first, find my ceiling, then go for core OC after, or should I still be doing core OC first?
> 
> Someone mentioned here they got 6-7% increase with mem OC, wondering how much they increased mem to? Was it 1500, or more?


3ghz core generally requires PL and voltage upped, not really worth it for gaming, only for benchmarks.

For games, my advice is undervolt. As I've said before, mine is 0.95V @ 2760mhz, +1600 on the mem. Runs nice and cool, beats stock performance by a few percent in games, very happy with it at that.

You could probably get 2-3% more performance than I'm getting if you maxxed out, but then you're using at least 100-150 watts more and a lot more heat for tiny gains.
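To put those mem offsets in rough perspective, here's some back-of-envelope bandwidth math (a sketch only: it assumes the Afterburner offset adds roughly 1:1 in MT/s, which can vary by tool, and uses the 4090's 384-bit bus and 21 Gbps stock rate):

```python
# Rough GDDR6X bandwidth math for the 4090 (384-bit bus, ~21 Gbps stock).
# Assumption: an Afterburner offset of +N maps to roughly +N MT/s effective.

BUS_BITS = 384          # 4090 memory bus width
STOCK_MTS = 21_008      # stock effective rate in MT/s (~21 Gbps)

def bandwidth_gbs(effective_mts: int) -> float:
    """Theoretical bandwidth in GB/s: (bytes per transfer) * (transfers/s)."""
    return effective_mts * (BUS_BITS / 8) / 1000

stock = bandwidth_gbs(STOCK_MTS)        # ~1008 GB/s, the spec number
oced = bandwidth_gbs(STOCK_MTS + 1500)  # with a +1500 offset
print(f"stock: {stock:.0f} GB/s, +1500: {oced:.0f} GB/s, "
      f"gain: {100 * (oced / stock - 1):.1f}%")
```

That ~7% bandwidth bump from +1500 lines up with the 6-7% gains people here report in memory-bound benches.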


----------



## J7SC

Arizor said:


> 3ghz core generally requires PL and voltage upped, not really worth it for gaming, only for benchmarks.
> 
> For games, my advice is undervolt. As I've said before, mine is 0.95V @ 2760mhz, +1600 on the mem. Runs nice and cool, beats stock performance by a few percent in games, very happy with it at that.
> 
> You could probably get 2-3% more performance than I'm getting if you maxxed out, but then you're using at least 100-150 watts more and a lot more heat for tiny gains.


I can get well past 3 GHz core on stock voltage (1.05 instead of 1.1) and stock PL, but you are right, in games none of this matters very much. 

FYI, on memory testing, I tried out that tool linked earlier (memtest_vulkan) and it works fine, though it started to draw well over 410 W with just a VRAM OC and everything else stock... since I am waiting for the 12vhpwr cables and also the w-block, I'd rather do this in earnest after all that is installed.

FYI, you can also google 'ATITool' - it is _ancient_, but the 'artifact checker' option still works fine and only draws ~1/4 to 1/3 of what memtest_vulkan uses. ATITool has been fairly reliable with my 3090, 2080 Ti, etc...


----------



## originxt

Picked up another 4090 FE for ****s and giggles - holy bad memory OC. +750 was the max before losing score in Port Royal, and you can actually see artifacting on the desktop at +1000. Average core: 3045MHz at 1.1V before crashes.


----------



## Madness11

Guys, I've got an issue like Jay2cents: sometimes the screen won't come up (4090 Liquid X). The board shows A0, but there's no image on the screen. After a reboot it works... Is it the VBIOS? I've got a Z690 Hero on the latest BIOS. Please help.


----------



## Sayenah

marc0053 said:


> Thanks for linking amazon link to the Fasgear cable @MrTOOSHORT
> Got it today and seems great so far.
> 
> I noticed the little red led light on my ASUS TUF OC 4090 by the power connector stays off now when PC is shut down which was lit with the original 12 pin connector.
> I peeled back the mesh around the 4 strands and it wrote 300v on the original adapter and no melting occurred even after about 1-2 hours of 3D Mark Port Royal.
> 
> Originally had loud coil wine but after a few sessions of Port Royal it seems to have gone away.
> 
> Hall of Fame shows that I'm from the USA for some reason and not Canada.
> NVIDIA GeForce RTX 4090 video card benchmark result - Intel Core i9-12900KF Processor,ASRock Z690 AQUA OC (3dmark.com)
> 
> Some pics:
> View attachment 2579997
> 
> View attachment 2579998
> 
> View attachment 2579996











I scored 28 098 in Port Royal

Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11

www.3dmark.com





You and I have very similar overclocks, almost exactly the same in many of my runs, yet my score lags behind; it barely broke 28k. I just don't understand this. Plus I love how cool your card runs. What are you doing differently? Is yours water cooled?

I am running on air; I lock in +200MHz on the core clock and +1700 on the mem, and that's my best result. I don't get this mystery.


----------



## LunaP

Sheyster said:


> You need a bigger monitor...
> 
> View attachment 2580075


65" here lol, and agreed its impossible for me to go back down, my other monitors are all 32" around it.


----------



## changboy

Here my score in port royal :








Result not found

www.3dmark.com





Not so high. Going from 4.8GHz to 5.0GHz on the CPU and VRAM from 3200MHz to 3400MHz, I only gained around 250 points; maybe I'm limited by my CPU and memory, but in games I didn't see a difference.
I think it's a normal score for my system.


----------



## narrn2761

Arizor said:


> 3ghz core generally requires PL and voltage upped, not really worth it for gaming, only for benchmarks.
> 
> For games, my advice is undervolt. As I've said before, mine is 0.95V @ 2760mhz, +1600 on the mem. Runs nice and cool, beats stock performance by a few percent in games, very happy with it at that.
> 
> You could probably get 2-3% more performance than I'm getting if you maxxed out, but then you're using at least 100-150 watts more and a lot more heat for tiny gains.


Ahh OK, so maybe I'll just keep my core where it is now @ 2760, since I mostly only game; I don't do benchmark crunching or much of anything else. Next I guess I'll push the mem OC as far as it can go on stock voltage, if it can.


----------



## LunaP

YMMV per game; in some games I'm seeing 10-15+ fps gains with a good OC. I still haven't messed with undervolting as I've never attempted it before. Highest I've gotten so far is 20+ in FFXIV, especially with GShade up; it really helps there since shaders eat up VRAM quickly.


----------



## changboy

LunaP said:


> YMMV per game, in some games I'm setting 10-15+ gains on fps at good OC's still haven't messed w/ undervolting yet as I've never attempted before. Highest I've gotten so far is 20+ in ffxiv especially w/ gshade up it really helps in that regards since shaders eat up vram quickly.


What do you score in Port Royal with your 10980XE and 4090?


----------



## Nico67

mattskiiau said:


> Anyone have any idea why this reaches 600w? Bad? Doesn't matter? Not sure what it is and Google not helping much.
> 
> View attachment 2580115


That is a massive difference, did you flash to a different bios?










Just for comparison, GPU Power should be pretty close to GPU Rail Powers, with VRM efficiency likely being the difference. It could just be reading wrong, or it could be spikes; hard to say for sure.


----------



## KedarWolf

Got my 7950x and DDR5 dialled in.

This is a legit no artefacts run. 









I scored 29 200 in Port Royal

AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11

www.3dmark.com


----------



## EEE-RAY

How does everyone's VRAM behave at the limits of OC? Do you see artifacts? Mine goes from appearing stable to a hard crash, usually without any artifacts first, all within 5-10MHz.

My core is a bit more graceful: it goes from tons of artifacts, to mild artifacts, to very tiny artifacts in 25MHz increments in Heaven. For some reason Heaven seems to produce core artifacts at lower clocks than even PR.


----------



## shellzzzz

EEE-RAY said:


> I am crashing on windows at idle (just browsing). Hard lock requiring restart. system info states a display driver crash. Is my card dying?


 No. It's a driver issue

https://www.nvidia.com/en-us/geforc...90-driver-crashing-constantly-without-any-lo/

MASSIVE RTX 4090 Problems. driver or hardware?


----------



## EEE-RAY

Wow shellzzz that is EXACTLY my crash.


shellzzzz said:


> No. It's a driver issue
> 
> https://www.nvidia.com/en-us/geforc...90-driver-crashing-constantly-without-any-lo/
> 
> MASSIVE RTX 4090 Problems. driver or hardware?



idle/low usage state and suddenly screen black

The error is something like:
The description for Event ID 0 from source nvlddmkm cannot be found. (...)
The following information was included with the event:
\Device\Video3 Error occurred on GPUID: 100 

Far out, I've been hunting to the ends of the earth trying to track down this crash.
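For anyone else chasing this: if you export the System event log to text (Event Viewer's "Save All Events As..."), a quick filter pulls out just the driver-reset entries. A minimal sketch; the sample lines below just mirror the error above, in practice you'd feed it your own exported log:

```python
# Filter exported Windows System log text for NVIDIA display-driver
# (nvlddmkm) events. Replace `sample` with lines read from your own export.

def nvlddmkm_lines(lines):
    """Keep only lines mentioning the NVIDIA kernel-mode driver or a GPUID."""
    return [ln.strip() for ln in lines if "nvlddmkm" in ln or "GPUID" in ln]

sample = [
    "The description for Event ID 0 from source nvlddmkm cannot be found.",
    "Some unrelated service message.",
    "\\Device\\Video3 Error occurred on GPUID: 100",
]
for ln in nvlddmkm_lines(sample):
    print(ln)
```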


----------



## LunaP

jcde7ago said:


> Here's another table made by kirkle8 on Reddit:
> View attachment 2579842


Any upside to higher GPU current? Seeing as most cards all seem to be doing well, curious what the additional surplus helps with (if applicable).



changboy said:


> What you score in port royal with your 10980xe and 4090 ?


Unless Port Royal is purely GPU-bound, probably crap, since I'm seeing people with 13th gen floor mine by more than double in some tests. It makes it hard NOT to want to upgrade now vs. waiting to see if Fishhawk is indeed HEDT (Jan, Q1).


----------



## ShadowYuna

Is anyone using a WB on their 4090? If so, please post pics and temps under load.
I would like to go WB, but since mine only hits 65°C at max load, I'm not sure whether I should do it or not.


----------



## msky73

ShadowYuna said:


> Is there anyone using WB on their 4090? If so please post the pic and temp when load.
> I would like to go WB but since it only hit 65 when max load , I am not sure where I should do it or not.


Inno3D X3 + Alphacool WB. The PCB (reference design) is small and so is the block. The block is exactly the same as what the Inno3D Frostbite version will use. Clock runs 2820MHz at the factory curve (was 2740MHz with the air cooler).

Temps reach 50°C when load is maxed; normally around 45°C, below 40°C during lighter loads, 30°C idle. Coolant delta is above 10°C during high load. 360+360+120 slim rads cooling the 5950X and 4090 in series. Fans set for quietness rather than efficiency. IMO water cooling is not a game changer for the 4090 series, but I prefer it this way.
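That 10°C+ coolant delta is about what the physics predicts at low flow: delta-T = P / (mass flow × specific heat). A sketch with assumed numbers (the ~750 W combined heat load and the flow rates are illustrative, not measured from this loop):

```python
# Coolant delta-T across a loop: dT = P / (m_dot * c_water).
# Assumed: ~750 W combined CPU+GPU heat; flow rates are examples.

C_WATER = 4.186  # J/(g*K), specific heat of water

def coolant_delta_t(watts: float, flow_l_per_h: float) -> float:
    grams_per_s = flow_l_per_h * 1000 / 3600  # 1 L of water ~ 1000 g
    return watts / (grams_per_s * C_WATER)

print(f"{coolant_delta_t(750, 60):.1f} C")   # quiet, low-flow loop
print(f"{coolant_delta_t(750, 180):.1f} C")  # more typical ~3 L/min loop
```

So a double-digit delta mostly tells you the pump is turned well down relative to the heat load, not that the blocks are struggling.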


----------



## rahkmae

Any tests?

Are the Bykski GPU blocks good? I see that, compared to others, not all the components that the EKWB block cools are covered, for example?


----------



## msky73

rahkmae said:


> Any test?


Port Royal 25930, Port Royal OC *27133*
Time Spy Extreme GPU 19606, Time Spy Extreme GPU OC *20422*

Not pushed to the edge; I just wanted to score it at factory and stable OC settings.


----------



## mattskiiau

Nico67 said:


> That is a massive difference, did you flash to a different bios?
> 
> View attachment 2580144
> 
> 
> just for comparison, GPU Power should be pretty close to GPU Rail Powers, likely vrm efficiency being the difference. It could just be reading wrong, or it could be spikes, hard to say for sure.


Do you have the drop down for GPU power rails which shows *"GPU Core (NVVDD) Out Power"*?
I have two of them for my card, same name, different results.

NOTE: Trio X is capped at 480w, so not sure what 600w is doing here, default BIOS.

Anyone else have the same?

_Original image for reference:_


----------



## mattskiiau

misclick


----------



## rahkmae

msky73 said:


> Port Royal 25930 *Port Royal OC **27133*
> Time Spy Extreme GPU 19606 *Time Spy Extreme GPU OC 20422*
> 
> Not pushed the the edge, I just wanted to score it at factory and stable OC settings.


Why not? Water only for silence?


----------



## GRABibus

GIGABYTE GAMING OC @ 20°c ambient.
Stock air cooler


















I scored 28 637 in Port Royal

AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10

www.3dmark.com





Max power draw => 534W


----------



## Benni231990

Just a quick OC, not completely fine-tuned, on the stock air cooler

MSI Suprim X

Clock: 3015Mhz

Idle:33°C
Load: 57°C










Max Power Draw 490-500 Watt

https://www.3dmark.com/3dm/82274653?


----------



## GRABibus

Arizor said:


> 3ghz core generally requires PL and voltage upped, not really worth it for gaming, only for benchmarks.
> 
> For games, my advice is undervolt. As I've said before, mine is 0.95V @ 2760mhz, +1600 on the mem. Runs nice and cool, beats stock performance by a few percent in games, very happy with it at that.
> 
> You could probably get 2-3% more performance than I'm getting if you maxxed out, but then you're using at least 100-150 watts more and a lot more heat for tiny gains.


How do you raise the voltage?

Voltage is greyed out in MSI AB on my Gigabyte Gaming OC.

Max I have seen is 1.05V.


----------



## Arizor

GRABibus said:


> How do you rise voltage ?
> 
> Voltage is "Greyed" in MSI AB on my Gigabyte Gaming OC.
> 
> Max I have seen is 1.05V.


get the latest afterburner beta mate. Good to see you back on the forums!


----------



## msky73

rahkmae said:


> Why not? Water only for silence?


Basically yes. And I am waiting for cablemod 12VHPWR cable before stress testing - better safe than sorry


----------



## GRABibus

Arizor said:


> get the latest afterburner beta mate. Good to see you back on the forums!


Thanks !
with this MSI AB beta version, I can now reach 1.1V and 45W more in Port Royal

















I scored 28 852 in Port Royal

AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10

www.3dmark.com





Max power draw = 576W.

Does anyone reach 600W max power draw in PR with a 600W BIOS?


----------



## pt0x-

AvengedRobix said:


> where you take the bequiet cable??? i can't find in stock


I'm Dutch, so I don't know if it will help; got it from one of our bigger online stores. They still have it in stock, but I don't know if they ship internationally.
https://www.megekko.nl/product/5453/621772/PSU-voedingskabels/be-quiet-12VHPWR-PCI-E-ADAPTER-CABLE


----------






## GRABibus

Who is reaching a max power draw of 600W in PR with a 600W BIOS? (Like the one on the Gigabyte Gaming OC.)

My max is 576W.
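For reference on why 600W is the magic ceiling: quick per-pin math for the 12VHPWR connector. A sketch only; it assumes perfectly even current sharing across the six 12V pins (which a worn or badly seated connector won't give you) and the commonly cited ~9.5 A per-pin rating:

```python
# 12VHPWR per-pin current at a given board power, assuming even sharing
# across the connector's six 12V pins.

PINS = 6
PIN_RATING_A = 9.5  # commonly cited per-pin rating for 12VHPWR terminals

def amps_per_pin(watts: float, volts: float = 12.0) -> float:
    return watts / volts / PINS

for w in (450, 576, 600):
    a = amps_per_pin(w)
    print(f"{w} W -> {a:.2f} A/pin ({100 * a / PIN_RATING_A:.0f}% of rating)")
```

Even at a full 600 W you're under the per-pin rating on paper; the melting stories come from uneven sharing and bad contact, not the headline wattage.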


----------



## J7SC

GRABibus said:


> Thanks !
> with this MSI AB beta version, I can now reach 1.1V and 45W more in Port Royal
> View attachment 2580163
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 852 in Port Royal
> 
> 
> AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Max power draw = 576W.
> 
> Does someone reach 600W max Powerdraw in PR with a 600W Bios ?


...unlikely on air; on water entirely possible, depends on max hotspot temps  ...also, what power cable are you using ?


----------



## GRABibus

BeZol said:


> I am not following the thread any more, so if it has already been, then sorry in advance.
> 
> Inno3D X3 OC RTX 4090
> 3x8pin adapter
> TDP 450W
> 
> It's basically one of the cheapest models you can get.
> 
> With 4x8pin adapter nothing changes.
> 
> BUT
> if you put a Gigabyte Gaming OC BIOS (450W +33% = 600W TDP limit) on the GPU,
> coupled with the 4x8pin adapter I managed to unlock the graphics card.
> 
> Interestingly I had voltage limit at 1.1V at my Port Royal run (28.928 score) with peak TDP of 550W.
> 
> View attachment 2579265





J7SC said:


> ...unlikely on air; on water entirely possible, depends on max hotspot temps  ...also, what power cable are you using ?


I currently use the one provided with the card. I'm waiting for the CableMod and the Corsair ones.

What max power draw do you get with your Gaming OC in PR?


----------



## J7SC

GRABibus said:


> I use the one provided with the card currently. I wait for the cablemod one's and the Corsair one's.
> 
> How max power draw do you have with your Gaming OC in PR ?


-- haven't run PR in a week due to 'the connector' but prior max ~ 575 W


----------



## GRABibus

J7SC said:


> -- haven't run PR in a week due to 'the connector' but prior max ~ 575 W


OK, same as me then


----------



## J7SC

GRABibus said:


> ok, like me so


...but I did it first  ...also, watch that connector - and your hotspot temps !


----------



## GRABibus

J7SC said:


> ...but I did it first  ...also, watch that connector - and your hotspot temps !


Hot spot 68°C in PR (Look at my screenshot)


----------



## Madness11

Guys, can anyone help? Please.
I've got an issue like Jay2cents: sometimes the screen won't come up (4090 Liquid X). The board shows A0, but there's no image on the screen. After a reboot it works... Is it the VBIOS? I've got a Z690 Hero on the latest BIOS. Please help.


----------



## dante`afk

rahkmae said:


> Are the Bykski GPU blocks good? I see that compared to others, not all the components cooled that are on the EKW block, for example?


EKWB is not the holy grail; a $150 Bykski block cools as well as a $400 EKWB block.


----------



## ZealotKi11er

dante`afk said:


> Ekwb is not the holy grail, a 150$ bykski block is cooling as good as a 400$ ekwb block


I only compared Bykski vs HEATKILLER on a 6900 XT and it was by far worse. I tried mounting the Bykski 3 times, but the hotspot was at least 15°C hotter and even the edge temp was 10-15°C higher.


----------



## LunaP

rahkmae said:


> Are the Bykski GPU blocks good? I see that compared to others, not all the components cooled that are on the EKW block, for example?


Are u referring to the blocks from the 3000 series? or the current ones? I haven't seen any reviews yet if ur able to link.


----------



## pat182

EEE-RAY said:


> I am crashing on windows at idle (just browsing). Hard lock requiring restart. system info states a display driver crash. Is my card dying?


You need to put the GPU in performance mode to stop it from happening.


----------



## rahkmae

dante`afk said:


> Ekwb is not the holy grail, a 150$ bykski block is cooling as good as a 400$ ekwb block


The EKWB costs 269, not the 400+ you said. Of course it isn't the holy grail; it's just that you can see in the manual which components get cooled...


----------



## rahkmae

LunaP said:


> Are u referring to the blocks from the 3000 series? or the current ones? I haven't seen any reviews yet if ur able to link.


See the drawings in the EKWB 4090 manual and compare Bykski's drawings...


----------



## LunaP

rahkmae said:


> see the drawings in the EKW 4090 manual and compare Bykskis drawings...


Ok so no reviews yet, got it.


----------



## rahkmae

First is Bykski's and on the right is EKWB; see the differences... Asus card


----------



## rahkmae

Adding Alphacool and Phanteks too


----------



## msky73

rahkmae said:


> View attachment 2580181
> 
> First is Bykskis and right is EKW and see the differences... Asus card


Inductors, i.e. coils (passive parts), don't need cooling; DC-DC MOSFETs (active parts) do. Alphacool, for example, claims that pressing inductors against the block with thermal pads can increase vibration resonance and whining.


----------



## Blameless

Chokes don't need to be cooled by the waterblock unless you're running near the VRM's power limit, which is essentially impossible on water.

They also tend to have less coil whine if they are warmer.


----------



## bmagnien

J7SC said:


> I can get well past 3 GHz core on stock voltage (1.05 instead of 1.1) and stock PL, but you are right, in games none of this matters very much.
> 
> FYI, on memory testing, I tried out that tool linked earlier (memtest_vulkan) and it works fine, though it started to draw well over 410 W with just a VRAM OC and everything else stock...since I am waiting for the 12vhpwr cables and also w-block, I rather do this in earnest after all that is installed.
> 
> FYI, you can also google 'ATITool' - it is _ancient_, but the 'artifact checker' option still works fine and only draws ~ 1/4 to 1/3 of what memtest_vulkan uses. The ATITool has been fairly reliable with my 3090, 2080 Ti etc...
> 
> View attachment 2580125


wow i haven't seen that hairy cube in AGES. Probably around the time I was playing the leaked HL2 source code


----------



## nvidiaphysx9

Hello, can I get some advice from you guys: is the cable bend too tight?









I have only 3cm between the card and the case glass, so I'm wondering if it's OK or if I'd better go for a vertical mount.


----------



## LunaP

rahkmae said:


> View attachment 2580183
> 
> alpha and phanteks also add


So far so good, looking forward to reviews once they actually start making/shipping them.


----------



## X79guy

Have any issues been noted with all the AIB cards using 4x POSCAPs in place of MLCCs? The FE uses a full set of MLCCs on the rear.


----------



## GRABibus

nvidiaphysx9 said:


> Hello, can I have some advices from you guys: is the cable bending too tight?
> View attachment 2580188
> 
> 
> I have only 3cm between card and case glass, so thinking is it ok or better go for vertical mount


Currently I have removed the glass.
No other choice....


----------



## Panchovix

A new personal best with my sample in Speed Way, and as some people said, having fans on auto instead of 100% gave me a little more headroom for VRAM OC, which is actually very interesting lol


Spoiler: 3DMark Speed Way Bench












I scored 10 714 in Speed Way

AMD Ryzen 7 5800X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10

www.3dmark.com















Though, now I'm wondering 3 things:

1. For the Ryzen users out here (Ryzen 5000/7000), are you using Windows 11 or 10 for benchmarks? If it's 11, are you using a stripped W11 OS or just disabling the services for the benches and such?
2. For people that have upgraded CPUs after getting the 4090, have you gotten better *graphics* scores after upgrading your CPU? For my CPU/GPU combo I feel I'm doing "good", but for a 4090 in general it seems pretty lackluster.
3. How is it possible that some people have higher scores despite lower core/mem clocks? Driver optimization? Example:


----------



## motivman

GRABibus said:


> Thanks !
> with this MSI AB beta version, I can now reach 1.1V and 45W more in Port Royal
> View attachment 2580163
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 852 in Port Royal
> 
> 
> AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Max power draw = 576W.
> 
> Does someone reach 600W max Powerdraw in PR with a 600W Bios ?


Man, are all the good overclocking cards overseas? I have tried two cards now, and both do not overclock half as well as the ones on these forums... SMH


----------



## Panchovix

motivman said:


> Man, are all the good overclocking cards overseas? I have tried two cards now,and both do not overclock half as good as the ones on these forums.. SMH


And yesterday someone posted on reddit his SpeedWay bench (11054 points), he had +2000VRAM OC (his model is Gigabyte Gaming OC)


__
https://www.reddit.com/r/nvidia/comments/yjsbuv

Man, I wish I could do +1500 LOL


----------



## motivman

Panchovix said:


> And yesterday someone posted on reddit his SpeedWay bench (11054 points), he had +2000VRAM OC (his model is Gigabyte Gaming OC)
> 
> 
> __
> https://www.reddit.com/r/nvidia/comments/yjsbuv
> 
> Man, I wish I could do +1500 LOL


watch me get a Gigabyte Gaming OC, and it turns out to be a DUD also, lol


----------



## GRABibus

I'm waiting for the CableMod one (shipped today by CableMod) and the Corsair one.
No more benching until I receive them


----------



## Tideman

GRABibus said:


> I wait for CableMod one (Sent today from Cablemod) and the Corsair one.
> No more bench until I receive them


I'm doing the same. I actually took my card out completely. Back to my 3090 Ti until my cable arrives.

I ordered my custom Moddiy cable on Saturday and it shipped yesterday. Surprisingly fast. Even though I went for express shipping it's not due until next Monday though.. Was hoping for Friday.


----------



## bmagnien

*Silverstone native 12VHPWR cables now available!* Looks almost identical to the Corsair methodology:









Amazon.com: SilverStone Technology PP14-EPS Dual EPS 8 pin (PSU) to 12+4 pin (GPU) 12VHPWR PCIe Gen5 Cable for Silverstone PSUs, SST-PP14-EPS
www.amazon.com

Silverstone SST-PP14-EPS EPS 2 x EPS 8Pin to 12+4Pin PCIe Gen5 Cable
aerocooler.com


----------



## GRABibus

Tideman said:


> I'm doing the same. I actually took my card out completely. Back to my 3090 Ti until my cable arrives.
> 
> I ordered my custom Moddiy cable on Saturday and it shipped yesterday. Surprisingly fast. Even though I went for express shipping it's not due until next Monday though.. Was hoping for Friday.


For gaming, I cap my Gaming OC at 40% PL and get the same performance as my 3090.
I will game like this until the cables arrive


----------



## GRABibus

They look very good :


----------



## dr/owned

MSI Trio:

Shunted for 2x current, fuse bypassed, header soldered for voltage control. Lets blow this thing up on air.
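For anyone wondering what "shunted for 2x current" means numerically: the VRM controller infers current from the voltage drop across tiny shunt resistors, so stacking an equal-value resistor on each shunt halves the sensed resistance, the controller reads half the real current, and the effective power limit roughly doubles. A sketch; the 5 mΩ value is an assumed round number, not a measurement from this card:

```python
# Shunt-mod arithmetic: controller measures V = I * R across the shunt and
# divides by the R it *thinks* is there. Parallel an equal resistor and the
# real R halves, so the reported current is half the actual.

def parallel(r1: float, r2: float) -> float:
    return r1 * r2 / (r1 + r2)

R_STOCK = 0.005                         # assumed 5 mOhm shunt
R_MODDED = parallel(R_STOCK, R_STOCK)   # 2.5 mOhm after stacking

actual_amps = 50.0                      # e.g. ~600 W at 12 V
sensed_drop = actual_amps * R_MODDED    # what the controller measures
reported_amps = sensed_drop / R_STOCK   # it still assumes 5 mOhm
print(f"actual {actual_amps:.0f} A, reported {reported_amps:.0f} A "
      f"-> {actual_amps / reported_amps:.0f}x headroom")
```

Which is also why it's a blow-this-thing-up mod: every safety limit downstream of that reading is now off by the same factor.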


----------



## GAN77

dr/owned said:


> Shunted for 2x current, fuse bypassed, header soldered for voltage control. Lets blow this thing up on air.


Give the result and then bang)


----------



## cheddardonkey

dr/owned said:


> Shunted for 2x current, fuse bypassed, header soldered for voltage control. Lets blow this thing up on air.
> View attachment 2580226


Let it rip and let us know!


----------



## bezerKa

Anyone with a suprim x oc and not able to go above 100% on the power slider on afterburner? All 4 connectors are plugged in, triple checked. 522.25 driver. Latest afterburner 4.6.5 beta 2. Reinstalled drivers etc.


----------



## motivman

dr/owned said:


> MSI Trio:
> 
> Shunted for 2x current, fuse bypassed, header soldered for voltage control. Lets blow this thing up on air.
> View attachment 2580226


but why????


----------



## J7SC

GRABibus said:


> Hot spot 68°C in PR (Look at my screenshot)


Mine is in the low 80s at same clocks / peak wattage - I suspect a less than perfect factory mount / TIM but will get to it once the waterblock arrives. 

@bmagnien ...have you heard anything additional from Formulamod re. Giga-OC block ?


----------



## LunaP

Panchovix said:


> And yesterday someone posted on reddit his SpeedWay bench (11054 points), he had +2000VRAM OC (his model is Gigabyte Gaming OC)
> 
> 
> __
> https://www.reddit.com/r/nvidia/comments/yjsbuv
> 
> Man, I wish I could do +1500 LOL


Highest I've attempted is +250 core and 1700 on Mem on my Gaming OC, waiting for block/cable to test higher.



GRABibus said:


> I wait for CableMod one (Sent today from Cablemod) and the Corsair one.
> No more bench until I receive them


How long ago did u order? I ordered mine last week and have yet to get a confirmation of shipment; it still shows pending.


----------



## GRABibus

LunaP said:


> How long ago did u order, I ordered mine last week and have yet to get a confirmation of shipment, shows sitll pending.


Ordered 10 days ago.

I am also in the queue for the 90-degree angle adapter


----------



## Panchovix

LunaP said:


> Highest I've attempted is +250 core and 1700 on Mem on my Gaming OC, waiting for block/cable to test higher.


It may fare well then; you may be one of the lucky ones that do +2000 or more on the VRAM (probably the most important overclock on the 4090, since core OC doesn't make much difference).

Also, your good VRAM overclock may confirm that the Gigabyte Gaming OC has some binning on the VRAM?


----------



## Azazil1190

Panchovix said:


> It may fares well then, you may be one of the lucky (?) that do +2000 or more on the VRAM (probably the most important overclock on the 4090, since core oc doesn't make much difference)
> 
> Also, your good overclock on VRAM may confirm that the Gigabyte Gaming OC maybe has some binning on the VRAM?


Seems like the Gigas have better-binned chips and RAM than the other vendors


----------



## mirkendargen

J7SC said:


> Mine is in the low 80s at same clocks / peak wattage - I suspect a less than perfect factory mount / TIM but will get to it once the waterblock arrives.
> 
> @bmagnien ...have you heard anything additional from Formulamod re. Giga-OC block ?


Mine ordered from Aliexpress is shipped and in LA now, supposed to get here Friday.


----------



## bmagnien

mirkendargen said:


> Mine ordered from Aliexpress is shipped and in LA now, supposed to get here Friday.


Wow, can you share the AliExpress store name you ordered from? And shipping method? I dropped like $65 for DHL Express with FormulaMod but they haven't even shipped yet.

@J7SC you saw my earlier response from them, right? I emailed them again today and basically said if you can't give me an estimated ship date, cancel my order, because you shouldn't have been taking orders without these in-hand. No response yet.


----------



## bmagnien

For anyone that's interested, I've compiled a downloadable collection of next-gen engine demos, available here: Demos - Google Drive

Included are:

UE5 City Demo (Matrix Awakens) - 19.1gb
UE5 Broadleaf Forest - 1.7gb
UE5 Conifer Forest (this one has awesome snow effects) - 1.9gb
Unity Enemies demo - 1.1gb
Just download, extract, and run the executable. These are all real-time with various levels of interactivity (make sure to try the flashlight in the forest), and many quality settings to play with.

Enjoy!


----------



## Tideman

Azazil1190 said:


> Seems like that the gigas has better bin chips and rams than the others vendors


Yeah a lot of gaming oc samples seem to do well. 

Mine does +300 core (3090Mhz). I decided to wait for my new cable before testing the memory.


----------



## GRABibus

Tideman said:


> Yeah a lot of gaming oc samples seem to do well.
> 
> Mine does +300 core (3090Mhz). I decided to wait for my new cable before testing the memory.


+300MHz stable in PR ? 😜


----------



## AvengedRobix

pt0x- said:


> I'm dutch so don't know if it will help, got it from one of our bigger online stores. They still have it in stock, but don't know if they ship international
> https://www.megekko.nl/product/5453/621772/PSU-voedingskabels/be-quiet-12VHPWR-PCI-E-ADAPTER-CABLE


Tnx but they don't ship to Italy.. damn


----------



## GRABibus

LunaP said:


> Highest I've attempted is +250 core and 1700 on Mem on my Gaming OC, waiting for block/cable to test higher.
> 
> 
> 
> How long ago did u order, I ordered mine last week and have yet to get a confirmation of shipment, shows sitll pending.


Estimated Date Arrival at my home in France is 8th of Nov according to FedEx.


----------



## Tideman

GRABibus said:


> +300MHz stable in PR ? 😜


Only tested it with TimeSpy Extreme (stress test). That one was more reliable for me than PR in exposing instability with my 3090 Ti.

Going to test PR when my cable arrives.


----------



## Panchovix

Tideman said:


> Only tested it with TimeSpy Extreme (stress test). That one was more reliable for me than PR in exposing instability with my 3090 Ti.
> 
> Going to test PR when my cable arrives.


Just to note, that was the case on my 3080 as well, but on my 4090, PR insta-kills any unstable overclock for me; on TSE I can run higher clocks.


----------



## newls1

If you had to choose between these 2 available cards... Gigabyte Gaming OC or Asus TUF OC... which one would you get? Are they both 600W BIOS stock?


----------



## Nizzen

newls1 said:


> if you had to choose between these 2 available cards... Gigabyte Gaming OC or Asus TUF OC... which one to get? are they both 600w bios stock?


I have tested 4090 TUF, and the card was VERY good. Cooling is insane! No RGB is nice too


----------



## Panchovix

newls1 said:


> if you had to choose between these 2 available cards... Gigabyte Gaming OC or Asus TUF OC... which one to get? are they both 600w bios stock?


Both are 600W, the TUF supposedly has better VRMs, but based on posts here on the forum, the Gigabyte Gaming OC overclocks way higher. (Better binning)

So by that (pure overclocking), I would go for the Gigabyte Gaming OC (I say that having a TUF card)


----------



## 8472

It looks like the Suprim X non-liquid hasn't restocked since launch day. Smh.


----------



## Tideman

Panchovix said:


> Just to note, that was the case as well on my 3080, but on my 4090, PR insta kills any unstable overclock for me  on TSE I can do it with higher clocks


Oh no, well guess I'm waiting till next Monday to find out 

You're tempting me to put the card back in with the adapter just to run PR haha.


----------



## yt93900

Any news on Aorus Waterforce 600W VBIOS?


----------



## mirkendargen

bmagnien said:


> Wow, can you share the AliExpress store name that you ordered from? And shipping method? Dropped like $65 for DHL Express with FormulaMod but they havent even shipped yet.


https://www.aliexpress.us/item/3256...2uF1iT6&gatewayAdapt=glo2usa&_randl_shipto=US DHL shipping


----------



## Jacinto1023

EDIT: Figured it out.


----------



## Spiriva

GeForce Hotfix Driver Version 526.61


[Call of Duty: Modern Warfare II] Flashing corruption can be seen randomly while playing the game. [3829010]
VTube Studio may crash to black screen [3838158]
GPU stuck in P0 state after exiting certain games. [3846389]

GeForce Hotfix Driver Version 526.61 | NVIDIA


----------



## Gking62

Hey all, I've been diligently reading through this thread for the past few days and have gleaned a wealth of info; kudos to everyone who's contributed. I have a ROG Strix 4090 delivering today, a custom CableMod 16-pin 4x8 PCI-E cable (ordered 10/19, shipped 11/1) on the way, and an EK full ABP block pre-ordered with an estimated delivery of 11/18. Is anyone here also running a Strix on a Z690 Maximus Extreme, and how did your experience go (or how is it going)? Until the EK block shows up, once my CM cable is delivered I'll be running the card outside my case via an exterior mount rack and a LinkUp Extreme PCI-E 4 riser. The rest of my full specs are in my sig; any other words of wisdom would be very welcome, thanks so much.


----------



## changboy

Just ordered the waterblock from AliExpress for my 4090 Gigabyte Gaming OC


----------



## J7SC

bmagnien said:


> Wow, can you share the AliExpress store name that you ordered from? And shipping method? Dropped like $65 for DHL Express with FormulaMod but they havent even shipped yet.
> 
> @J7SC you saw my earlier response from them right? I emailed them again today and basically said if you can't give me an estimated ship date, cancel my order because you shouldn't have been taking orders without these in-hand. no response yet


...same here, I had emailed FormulaMod and got some non-answer, but no shipping date - also went for the DHL Express option. Let me know if they answer you today... I'm thinking of canceling as well (my order is 1 or 2 behind yours in their system).


----------



## ArcticZero

Nice to see the custom blocks starting to come out already. As much as I want to order my block now and have it ready when my card arrives, I'm still waiting to see whether the Strix, TUF, or Suprim X (in that order) will be available first, and I'll grab the first one I can.


----------



## Arizor

Does anyone know the size of the thermal pads for the TUF? Thinking of repasting to reduce hotspot temps.


----------



## bmagnien

mirkendargen said:


> https://www.aliexpress.us/item/3256...2uF1iT6&gatewayAdapt=glo2usa&_randl_shipto=US DHL shipping


Lol *** that is the FormulaMod store. No idea how you ordering it from there somehow went faster than ordering directly on their website. You sure it wasn’t this one?

US $112.58 15％ Off | Bykski Water Block for Gigabyte GeForce RTX 4090 Gaming OC / MASTER GPU Card / Copper Cooling Radiator RGB AURA/ N-GV4090AORUS-X


https://a.aliexpress.com/_msSTCOg


----------



## newls1

Panchovix said:


> Both are 600W, the TUF supposedly has better VRMs, but based on posts here on the forum, the Gigabyte Gaming OC overclocks way higher. (Better binning)
> 
> So by that (pure overclocking), I would go for the Gigabyte Gaming OC (I say that having a TUF card)


looks like the Tuf OC is 520watt bios... Is it possible to flash its BIOS to a 600w BIOS, and if so, which one would be recommended?


----------



## bezerKa

Received my Suprim X today; looks like I've received a bad adapter cable. 100% max power limit in Afterburner, can't go higher. It seems the 4th connector isn't receiving power from the PSU. I swapped cables around with known working ones, and it will not boot with a display unless I only use the first 3 connectors. Just means I'm capped at the 100% power limit (450W).

Guess I need to order the corsair 600w 12VHPWR cable that terminates into 2x8pin connectors.


----------



## changboy

I paid $244.78 CAD for my Bykski block for the Gaming OC ($174 + $70 for the fastest shipping).

If I had bought an EK block it would have cost me over $400 for sure.


----------



## J7SC

bmagnien said:


> Lol *** that is the FormulaMod store. No idea how you ordering it from there somehow went faster than ordering directly on their website. You sure it wasn’t this one?
> 
> US $112.58 15％ Off | Bykski Water Block for Gigabyte GeForce RTX 4090 Gaming OC / MASTER GPU Card / Copper Cooling Radiator RGB AURA/ N-GV4090AORUS-X
> 
> 
> https://a.aliexpress.com/_msSTCOg


Lols indeed - I like it when they say '790+' available at AliExpress (incl., apparently, the FormulaMod store?).

The whole ecosystem around top-end cards (i.e. cable makers, block suppliers) promises a lot, takes your money immediately plus extra for fast shipping, but then - big guessing game on when, or whether, what you ordered might be delivered. I should really invest in a 3D printer...


----------



## Arizor

newls1 said:


> looks like the Tuf OC is 520watt bios... Is it possible to flash its BIOS to a 600w BIOS, and if so, which one would be recommended?


Nope TUF BIOS is 600W, goes to 133%, baseline 450W. Got it right here
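The slider math above is easy to sanity-check. A quick sketch in Python (the 450W baseline and 133% cap are from this post; nothing else is assumed):

```python
# Max board power = baseline power * power-limit slider percentage.
def max_power(base_w, slider_pct):
    return base_w * slider_pct / 100

# TUF OC: 450W baseline with the slider maxed at 133%:
print(max_power(450, 133))  # -> 598.5, i.e. the ~600W ceiling
```

Same formula explains why a 520W card with a 480W baseline tops out at 108%.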


----------



## newls1

@Arizor Ok awesome... I was watching the "Hardware Unboxed" review of this card and he said the OC slider only goes 8% over 100... Maybe I misunderstood then. So just keep the BIOS that's on it and be good to go?


----------



## Arizor

newls1 said:


> @Arizor Ok awesome... Was watching the "Hardware Unboxed" review of this card and he said the OC slider only goes 8% over 100... Maybe I miss understood then. So just keep the bios thats on it then and be good to go?


Yep, attached here just in case you do need to flash for any reason.

Edit - Ah ok, not allowing the 'rom' extension, so just delete the 'pdf'.


----------



## narrn2761

Hey guys, I am using the psu adaptor cable from the TUF OC.

The 4x PCIe cables that are connected to the adaptor cable are fully visible in mid air and look atrocious! lol

Is there any alternative or other cable adaptor that I can use to hide away the 4x connected PCIe cables? Like, is there an adaptor cable that extends to the other side of the case so I can at least hide the 4x PCIe cables?

My PSU is an EVGA SuperNova P2 1000 btw.

Or do I have no choice but to purchase a PCIe 5.0 PSU?


----------



## newls1

Arizor said:


> Yep, attached here just in case you do need to flash for any reason.
> 
> Edit - Ah ok, not allowing the 'rom' extension, so just delete the 'pdf'.


thank you sir..... thats the stock bios right? Do you recommend flashing to any other bios, or no reason to?


----------



## Gking62

narrn2761 said:


> Hey guys, I am using the psu adaptor cable from the TUF OC.
> 
> The 4x PCIe cables that are connected to the adaptor cable is fully visible in mid air and looks atrocious! lol
> 
> Is there any alternative or other cable adaptors that I can use to hide away the 4x pcie cables connected. Like is there an adaptor cable that extends to the other side of the case so I can atleast hide the 4x pcie cables?
> 
> My PSU is an EVGA SuperNova P2 1000 btw.
> 
> Or do I have no choice but to purchase a PCIe 5.0 PSU?


no new psu needed, visit CableMod below:






Configurator – CableMod Global Store







store.cablemod.com





ModDIY is also a decent alternative

I also have an EVGA psu, 1600 P2 and ordered a 650mm cable.


----------



## Arizor

narrn2761 said:


> Hey guys, I am using the psu adaptor cable from the TUF OC.
> 
> The 4x PCIe cables that are connected to the adaptor cable is fully visible in mid air and looks atrocious! lol
> 
> Is there any alternative or other cable adaptors that I can use to hide away the 4x pcie cables connected. Like is there an adaptor cable that extends to the other side of the case so I can atleast hide the 4x pcie cables?
> 
> My PSU is an EVGA SuperNova P2 1000 btw.
> 
> Or do I have no choice but to purchase a PCIe 5.0 PSU?








CableMod E-Series Pro ModFlex Sleeved 12VHPWR PCI-e Cable for EVGA G7 / G6 / G5 / G3 / G2 / P2 / T2 – CableMod Global Store







store.cablemod.com


----------



## Arizor

newls1 said:


> thank you sir..... thats the stock bios right? Do you recommend flashing to any other bios, or no reason to?


No point flashing to others, I've tried loads at this point, all give the same performance.

On this one, I lock 0.95V @ 2760mhz, +1600mem. Runs nice and cool, faster than stock.


----------



## newls1

Arizor said:


> No point flashing to others, I've tried loads at this point, all give the same performance.
> 
> On this one, I lock 0.95V @ 2760mhz, +1600mem. Runs nice and cool, faster than stock.


damn the mem will OC that much... 3090Ti i had artifacted starting @ +840


----------



## Arizor

newls1 said:


> damn the mem will OC that much... 3090Ti i had artifacted starting @ +840


Yeah, most folks' mem will overclock this high; in fact I can push mine to 1700 comfortably, 1800 even, but I start to sweat


----------



## Panchovix

newls1 said:


> looks like the Tuf OC is 520watt bios... Is it possible to flash its BIOS to a 600w BIOS, and if so, which one would be recommended?


It goes up to 600W, Hardware Unboxed fixed that in a comment









Also, luck of the draw I guess, my TUF can't do +1200Mhz on the VRAM without crashing


----------



## newls1

Panchovix said:


> It goes up to 600W, Hardware Unboxed fixed that in a comment
> View attachment 2580333
> 
> 
> Also, luck of the draw I guess, my TUF can't do +1200Mhz on the VRAM without crashing


thank you sir!


----------



## J7SC

Arizor said:


> No point flashing to others, I've tried loads at this point, all give the same performance.
> 
> On this one, I lock 0.95V @ 2760mhz, +1600mem. Runs nice and cool, faster than stock.


...just out of interest, as I'm considering undervolting until cable and block updates: what does HWiNFO say re. 'GPU Power' and 'GPU Rail Power' at that voltage? At stock voltage? Apparently, a big positive discrepancy between the latter and the former suggests the rough amount of transient spiking, though I'm not sure about that and others might want to chime in here.


----------



## Gking62

newls1 said:


> damn the mem will OC that much... 3090Ti i had artifacted starting @ +840


my prior EVGA 3080 Ti FTW3 Ultra mem would do 1250, no issues however on water though.


----------



## dr/owned

motivman said:


> but why????


----------



## changboy

I also ordered a CableMod cable; it cost me around $60 CAD, but since it's coming from the USA I think I may have to pay some fee on delivery, I'm not sure.


----------



## Arizor

J7SC said:


> ...just out of interest, as I'm considering undervolting until cable and block updates: what does HWiNFO say re. 'GPU Power' and 'GPU Rail Power' at that voltage? At stock voltage? Apparently, a big positive discrepancy between the latter and the former suggests the rough amount of transient spiking, though I'm not sure about that and others might want to chime in here.


Here you go mate.


----------



## J7SC

Arizor said:


> Here you go mate.
> 
> View attachment 2580350


Thanks. Similar to mine at lower clocks / voltages, but at higher clocks / voltages, GPU Power and GPU Rail Power start to merge.


----------



## Zero989

Am I doing it right? DUAL PSU (A850GF + A1000G PCIE5) native connector + 0 bend. I can walk away and be safe right?


----------



## kx11

I have that MSI power supply, i need to install it lol


----------



## changboy

When I game with my Gaming OC, the clock is at 3015MHz (+270) and memory at +1400MHz, and I don't have any problems in any of my games.


----------



## Panchovix

For the guys using Ryzen, are you benching on Windows 10 or 11? Wondering if my main OS might bench better than the bench OS (W10) that I'm using.

PS: Damn, tested my VRAM in the Vulkan test and it crashed at +1100, but at +1000 it's fine. Totally lost the VRAM silicon lottery


----------



## dr/owned

Furmark with the shunt mod didn't explode... still hits the PWR limit. So similar to the 3090, where there's still power monitoring outside the big shunts. Now I gotta find a BIOS to get up to a theoretical 600W:
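For context on why a shunt mod raises the ceiling at all: the controller infers current from the voltage drop across a tiny shunt resistor and assumes the stock resistance, so stacking an equal resistor on top halves the effective resistance and halves the reported power. A hedged sketch (Python; the 5 mΩ figure is a hypothetical stock value for illustration, not confirmed for this card):

```python
R_STOCK = 0.005            # ohms; hypothetical stock shunt value
R_STACKED = R_STOCK / 2    # stacking an equal shunt in parallel halves it

def reported_power(actual_w, r_eff, r_assumed=R_STOCK):
    # The controller measures V = I * r_eff but converts to power
    # assuming the shunt is still r_assumed, so it under-reports.
    return actual_w * (r_eff / r_assumed)

# A real 900W draw shows up as only 450W against the power limit:
print(reported_power(900, R_STACKED))  # -> 450.0
```

Which also matches the observation above: any monitoring *outside* the stacked shunts still sees the true draw, so the PWR limit can still trip.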


----------



## J7SC

...may be I'm taking all this too seriously...


----------



## dr/owned

J7SC said:


> ...may be I'm taking all this too seriously...


No adapter, CableMod Corsair doodad so hopefully no patty melt.

Working to get EVC2 to show up as a USB device.


----------



## narrn2761

Arizor said:


> CableMod E-Series Pro ModFlex Sleeved 12VHPWR PCI-e Cable for EVGA G7 / G6 / G5 / G3 / G2 / P2 / T2 – CableMod Global Store
> 
> 
> 
> 
> 
> 
> 
> store.cablemod.com


Thanks legends!


----------



## Panchovix

Did some tests in Control at 4K render with default settings: VRAM OC only, core OC only, and both. (Core OC stable, VRAM OC not so much)

Stock (2760Mhz): 66 FPS
Core OC Only (+285 core clock, 3045Mhz): 69 FPS (4.54% improvement)
VRAM OC Only "safe" (+1000 VRAM clock): 70 FPS (6% improvement)
VRAM OC Only (+1150 VRAM clock): 71 FPS (7.5% improvement)
Both: 73 FPS (10.6% improvement)
Bonus, UV 0.975V 2730Mhz + 1000Mhz VRAM: 68 FPS (3% improvement, but consumes 100W less or so)

So basically in this game, VRAM OC is the most important (man, the people that can do +2000Mhz, imagine how much more % they're getting)
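A quick check of the quoted percentages (FPS numbers copied from above; just relative-improvement math):

```python
# Relative FPS improvement over the 66 FPS stock baseline from this post.
base = 66
for label, fps in [("core OC", 69), ("VRAM +1000", 70),
                   ("VRAM +1150", 71), ("both", 73), ("UV+OC", 68)]:
    print(f"{label}: {(fps / base - 1) * 100:.1f}%")
# core OC 4.5%, VRAM +1000 6.1%, VRAM +1150 7.6%, both 10.6%, UV+OC 3.0%
```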

Below just the pics in a spoiler


Spoiler: Control Pics



Stock








Core clock OC Only:








VRAM "safe" OC only:








VRAM OC Only:








Both:








UV + OC:


----------



## Arizor

Yep, the mem overclock is king; it was somewhat true of the 3-series, but it certainly is now.


----------



## ZealotKi11er

Seasonic was able to get me a cable for my 1300w unit.


----------



## motivman

dr/owned said:


> Furmark with the Shunt didn't explode...still hits PWR limit. So similar to the 3090 where there's still power monitoring outside the big shunts. Now I gotta find a BIOS to get up to 600W theoretical.:
> 
> View attachment 2580368


man, you do not need to shunt to get to 600W, just flash the suprim x liquid 600W bios. I got up to 577W on my trio with that bios.


----------



## LuckyImperial

J7SC said:


> ...just out of interest, as I'm considering undervolting until cable and block updates: what does HWiNFO say re. 'GPU Power' and 'GPU Rail Power' at that voltage? At stock voltage? Apparently, a big positive discrepancy between the latter and the former suggests the rough amount of transient spiking, though I'm not sure about that and others might want to chime in here.


I'm running a 70% power slider and seeing rail max powers of 560W on my liquid x, which has a 480W default power limit. It's higher than I want to see for sure.


----------



## J7SC

ZealotKi11er said:


> Seasonic was able to get me a cable for my 1300w unit.


Have you actually received it yet ? I was approved by Seasonic, but still waiting....


----------



## dr/owned

motivman said:


> man, you do not need to shunt to get to 600W, just flash the suprim x liquid 600W bios. I got up to 577W on my trio with that bios.


The goal is volt mod and > 600W.

Anyways, looks like shunting might be a waste of time. This is the Suprim Liquid X BIOS that also screws up the max fan speed; it's 3300rpm on the stock BIOS:










(I can't get my EVC2 to be recognized by the software in Win11, so I need help from elmor) (edit: I'm a moron who forgot the legacy driver install procedure.)


----------



## narrn2761

I'm a bit confused about the aftermarket PSU cables for the TUF OC 4090.

Currently I am using the 4x 8pin to 16pin adaptor.

Can I replace it with this modDIY cable here:









ATX 3.0 PCIe 5.0 600W Dual 8 Pin to 12VHPWR 16 Pin Power Cable


Buy ATX 3.0 PCIe 5.0 600W Dual 8 Pin to 12VHPWR 16 Pin Power Cable for $24.99 with Free Shipping Worldwide (In Stock)




www.moddiy.com






And will that cable still allow up to 600W from my EVGA P2, even though it is a 2x 8-pin cable and not a 4x 8-pin adaptor?


----------



## dr/owned

Alrighty, here we go: 960W needs a block because the thermals are out the wazoo. I guess the volt mod can kinda get around the PWR limit:


----------



## LunaP

changboy said:


> Just order the waterblock from aliexpess for my 4090 gigabyte gaming oc



Do u have a link? I'm gonna cancel my FormulaMod one; they haven't responded in over a week now.



J7SC said:


> ...same here, I had emailed Formulamod and got some non-answer, but no shopping date - also went for the DHL Express option. Let me know if they answer you today...I'm thinking of canceling as well (my order is 1 or 2 behind yours in their system).


Same 1 week and counting.




bmagnien said:


> Lol *** that is the FormulaMod store. No idea how you ordering it from there somehow went faster than ordering directly on their website. You sure it wasn’t this one?
> 
> US $112.58 15％ Off | Bykski Water Block for Gigabyte GeForce RTX 4090 Gaming OC / MASTER GPU Card / Copper Cooling Radiator RGB AURA/ N-GV4090AORUS-X
> 
> 
> https://a.aliexpress.com/_msSTCOg


Oh ill check this one thanks.


----------



## long2905

narrn2761 said:


> Im a bit confused on the aftermarket PSU cables for the TUF OC 4090.
> 
> Currently I am using the 4x 8pin to 16pin adaptor.
> 
> Can I replace it with this modDIY cable here:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ATX 3.0 PCIe 5.0 600W Dual 8 Pin to 12VHPWR 16 Pin Power Cable
> 
> 
> Buy ATX 3.0 PCIe 5.0 600W Dual 8 Pin to 12VHPWR 16 Pin Power Cable for $24.99 with Free Shipping Worldwide (In Stock)
> 
> 
> 
> 
> www.moddiy.com
> 
> 
> 
> 
> 
> 
> And will that cable still allow upto 600W from my EVGA P2, even though this cable is a 2x 8pin and not 4x 8pin adaptor?


it will. the cable plugs directly into the psu, instead of into the psu pcie cables like the nvidia adapter does.


----------



## TheNaitsyrk

J7SC said:


> ...may be I'm taking all this too seriously...
> View attachment 2580375


That's a pretty nice overclock though.

Mine can do 3160, but on air; maybe a WB will push it to 3200+.


----------



## ShadowYuna

Finally got a shipment notice from FormulaMod that my waterblock for the Gigabyte 4090 Gaming OC is on the way.
It is waiting for pickup by the carrier, so hopefully it comes by next Monday


----------



## LuckyImperial

My ModDIY cable just got through customs. I'm kind of kicking myself for not using a faster shipping method. FedEx was like $50, but I'm jonesing to get this adapter cable out of my rig.

We're all waiting for something haha.


----------



## mattskiiau

Ok this is really starting to concern me. What the hell is this?


----------



## ArcticZero

I get the same thing with my 3090. Not sure what it is but my power draw measurement from my UPS certainly isn't reflecting that much draw and is more in line with GPU Power. GPU Core (NVVDD) Output Power has gone up as high as 600+w for me as well at times. This is just while playing Forza.


----------



## mattskiiau

ArcticZero said:


> I get the same thing with my 3090. Not sure what it is but my power draw measurement from my UPS certainly isn't reflecting that much draw and is more in line with GPU Power. GPU Core (NVVDD) Output Power has gone up as high as 600+w for me as well at times. This is just while playing Forza.
> 
> View attachment 2580471


Interesting, appreciate the comparison.


----------



## yzonker

There's something off with the power readings between some of our cards. I noticed last night while running Port Royal that my max power was only 514W, but others have shown their cards pulling 550W+ in PR.

My card never power throttles, though. It just holds the 1100mV point for the entire run with a peak just over 500W. +225 core and +1800 mem, Strix BIOS. (I see the same power draw on the TUF BIOS as well)


----------



## bezerKa

Anyone here with a Suprim X (air) able to confirm what the max % is on the power slider? I'm stuck at 100%, and I'm 99% sure it's a faulty 4th connector on the supplied power adapter. My card also shipped with the latest BIOS, which is newer than all the ones listed on TechPowerUp.

I very much doubt it's hard-locked, as I flashed the older listed Suprim BIOS and my max power slider was showing 93%


----------



## bmagnien

ShadowYuna said:


> Finally got shipment notice from Formula Mod that my waterblock for Gigabytre 4090 Gaming OC.
> It is waiting for pick up by carrier so hopefully it comes by next Monday


Yup, I was going back and forth with them all night via email and AliExpress chat. They finally got their shipment dispatched from the Bykski factory and should've fulfilled all orders. Mine's shipped with an ETA of the 9th.

@J7SC did you get your notification?


----------



## ZealotKi11er

Still on the fence about whether I should get the block for my card. I know I will not get any extra OC; the only saving grace is space in the case and a bit quieter operation.


----------



## cheddardonkey

..


----------



## mirkendargen

J7SC said:


> Lols indeed - I like it when they say '790+' available at Aliexpress (incl. apparently the Formulamod store ?).
> 
> The whole ecoculture around top-end cards (ie. cable makers, block suppliers) promises a lot, takes your money immediately plus extra for fast shipping, but then - big guessing game on when what might be delivered, or not. I should really invest in a 3D Printer...


I don't think this Aliexpress store is actually Formulamod, that kind of thing is rampant on Aliexpress. You'll see "Bykski Store Official", "Official Bykski Store", "Bykski Official Store", etc. This is the same thing, just someone stealing the Formulamod name.

The good news is I think the block should get here today. The bad news is I'm having a hell of a time purging my loop with the EPDM tubing I redid it with instead of vinyl this time around. I think the pressure loss from friction in it is too high for long runs, and my pumps can't overcome it to purge. I have some vinyl tubing on the way to see if I can get it flowing, sigh.


----------



## yzonker

FE's are up on BB right now.


----------



## Christopher2178

Niceeee!!! I have been trying since the 12th. Just got an FE from BB; it arrives next Tuesday.


----------



## motivman

Got another FE from Best Buy today. Hopefully this one overclocks better than my current FE, lol. This is the best I can do with my current FE. Let's see if the third time's the charm, haha.


----------



## Panchovix

motivman said:


>


Hey, honestly 28k isn't too bad; I wish I could do 28000 lol

Good luck with your new FE tho! Let's hope it's a silicon winner


----------



## dr/owned

Found a pretty nice sheet with 4090 block compatibility. Looks like Asus is winning this round:









RTX 4090 Block Compatibility






docs.google.com


----------



## cletus-cassidy

yzonker said:


> FE's are up on BB right now.


I got one. Also finally got my Bykski block installed on my Tuf OC. Will play with it tonight after work if anyone wants any info on the block's performance.


----------



## motivman

cletus-cassidy said:


> I got one. Also finally got my Bykski block installed on my Tuf OC. Will play with it tonight after work if anyone wants any info on the block's performance.


why you get an FE when you already got a TUF? are you keeping both or selling the better one?


----------



## motivman

Panchovix said:


> Hey, honestly 28k it isn't too bad, I wish I could do 28000 lol
> 
> Good luck with your new FE tho! Let's hope it's a silicon winner


yeah, it's a good sample really (for gaming); I just want a card that can put me in the PR HOF... this is overclock.net, right? haha


----------



## dk_mic

GitHub - GpuZelenograd/memtest_vulkan: Vulkan compute tool for testing video memory stability


github.com





Really handy command line tool to test stability of your memory overclock. Unfortunately I can't really go over +1000 MHz.
Wondering if and how temperature has an impact. Also wondering if all the people reporting 1500+ really pass this.

Here it is failing at 1025 MHz










Waiting for the EK block for my MSI X Trio. In the meantime I have an urge to repaste, deshroud and put some proper fans on it. I think 3 Phanteks T30 would fit 
Had great success doing something similar to my 2080 Ti before I watercooled it.
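Since runs like these give you a known-good and a known-bad offset (+1000 fine, +1100 failing), you can narrow the boundary with a simple bisection instead of stepping blindly. A hedged sketch (Python; `is_stable` is a hypothetical wrapper that would run memtest_vulkan at a given offset and report pass/fail):

```python
# Bisect the highest stable memory offset between a known-good `lo`
# and a known-bad `hi`, down to `step` MHz granularity.
def max_stable_offset(lo, hi, is_stable, step=25):
    while hi - lo > step:
        mid = (lo + hi) // 2
        if is_stable(mid):
            lo = mid   # mid passed: the boundary is higher
        else:
            hi = mid   # mid failed: the boundary is lower
    return lo

# Illustration only, with a fake stability boundary at +1075:
print(max_stable_offset(1000, 1200, lambda o: o <= 1075))  # -> 1075
```

Each probe is one memtest run, so finding the boundary inside a 200 MHz window takes only about three runs.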


----------



## GAN77

cletus-cassidy said:


> Will play with it tonight after work if anyone wants any info on the block's performance.


Chip-to-water delta and power consumption


----------



## cletus-cassidy

motivman said:


> why you get an FE when you already got a TUF? are you keeping both or selling the better one?


One for work, one for play? I'll prob sell one.


----------



## cletus-cassidy

GAN77 said:


> Delta chip water and power consumption


I didn't play with the card on air at all, so I'll only have the water data. But we can compare to some of the folks who have Tuf on air.


----------



## Nd4spdvn

bezerKa said:


> Anyone here with a suprim x (air) able to confirm what the max % is on the power slider?


Mine goes to 108% max. I've had the card since launch, so it came with the bugged BIOS with 93% power at default. I have since updated the BIOS with the newer one provided by MSI, which resolves the issue but sadly keeps the power at 520W max (108% of the 480W default).


----------



## motivman

cletus-cassidy said:


> One for work, one for play? I'll prob sell one.


shoot if my second FE overclocks worse than your TUF OC, I might grab that from you... depending on which you keep.


----------



## Panchovix

motivman said:


> yeah, its a good sample really (for gaming), I just want a card that can put me in the PR HOF... this is overclock.net right? haha


Honestly, yep, that's actually a really good goal; to reach the HoF you'll have to try and get past some not-so-good samples haha, but maybe this is the one!



dk_mic said:


> Really handy command line tool to test stability of your memory overclock. Unfortunately I can't really go over +1000 MHz.
> Wondering if and how temperature has an impact. Also wondering if all the people reporting 1500+ really pass this.


I can do +1000, +1050 and +1075 fine; if I do +1100 I don't get errors, but my PC crashes lol.
For gaming, up to +1150 seems fine; +1200 crashes my PC very quickly.


----------



## cletus-cassidy

motivman said:


> shoot if my second FE overclocks worse than your TUF OC, I might grab that from you... depending on which you keep.


Sounds good. I got it up and running very late last night, so I didn't get to do much, but I could not get it to pull over 550W no matter what I did. Scoring ~28.3k in PR at 3075 and +1500 mem. Anyone else seeing that?


----------



## motivman

cletus-cassidy said:


> Sounds good. I got it up and running very late last night, so I didn't to do much, but I could not get it to pull over 550W no matter what I did. Scoring 28.3k ish in PR at 3075 and +1500 mem. Anyone else seeing that?


try timespy extreme


----------



## bezerKa

Nd4spdvn said:


> Mine goes to 108% max. Have the card since launch so it came with the bugged bios with 93% power at default. I have since updated the bios with the newer one provided by MSI which resolves the issue but sadly keeps the power at 520W max (108% from 480W default).


Are you able to tell me which BIOS version you're currently on? My card shipped with 95.02.18.80.68, which is newer than all the ones currently listed on TechPowerUp. My slider is at 100% and I can't push it further. I'm thinking my power adapter is faulty, as per my previous post here.


----------



## Laithan

bezerKa said:


> Are you able to tell me which bios version you're currently on? My card shipped with 95.02.18.80.68 which is newer than all the ones currently listed on tech power up. My slider is at 100% and I can't max it further. I'm thinking my power adapter is faulty as per my previous post on here.


What happens when you disconnect each connector one at a time (while off)? Anything change for each test?


----------



## doubledoubt

motivman said:


> Got another FE also today from Best BUY. Hopefully this overclocks better than my current FE, lol. best I can do with my current FE. Lets see if third times the charm, haha.
> 
> View attachment 2580591


How much did you have on the core and mem for this? Also what was your max temp?


----------



## bezerKa

Laithan said:


> What happens when you disconnect each connector one at a time (while off)? Anything change for each test?


So I believe my issue is with the 4th connector. When I unplug any of the first 3 while keeping the 4th plugged in, it doesn't post. I get display error on mobo. It's as if it's trying to boot with only 2 connectors plugged in.

It boots fine and runs fine with connectors 1 to 3 plugged in as you know 3 is enough. Which is why I think my power slider is locked at 100% no more to slide.

I've swapped cables around with known working ones, I've even disconnected the 4th psu cable from the adapter and used the piggy back connector off the 3rd cable to plug into the 4th, still no.


----------



## Laithan

bezerKa said:


> So I believe my issue is with the 4th connector. When I unplug any of the first 3 while keeping the 4th plugged in, it doesn't post. I get display error on mobo. It's as if it's trying to boot with only 2 connectors plugged in.
> 
> It boots fine and runs fine with connectors 1 to 3 plugged in as you know 3 is enough. Which is why I think my power slider is locked at 100% no more to slide.
> 
> I've swapped cables around with known working ones, I've even disconnected the 4th psu cable from the adapter and used the piggy back connector off the 3rd cable to plug into the 4th, still no.


Sus cable for sure.. I would leave it disconnected until a proper PSU cable comes in. That defect may have been your saving grace... it could have been the next Reddit post. 🔥🔥🔥 Perhaps you can send it to Gamers Nexus, as they are collecting bad cables and analyzing them.


----------



## LunaP

Went to cancel my block and got an email that FormulaMod had shipped from China lol.

You guys are seriously making me want to upgrade my CPU with all these scores I'm seeing, more than double what my CPU does now lol; just thinking about the fps increase I'd get in MMOs etc. in player/item-heavy areas. Fishhawk is still 5 months away, but I'm trying my best to hold out for CES to see IF it'll be HEDT / bring back AVX-512 or not.

Has anyone seen any downside on the AVX-512 side with their 13900KS as far as benching/gaming go? I know this gen was missing it for some odd reason.


----------



## Gking62

is the vbios up to date on the Strix 4090?


----------



## bezerKa

Laithan said:


> Sus cable for sure.. I would leave it disconnected until a proper PSU cable comes in. That defect may have been your saving grace... could have been the next Reddit post. 🔥🔥🔥. Perhaps you can send it to Gamer's Nexus as they are collecting bad cables and analyzing them.


Haha yea true! 

Sorry, I meant it's the actual 12+4-pin adapter provided with the card, and it's the 4th connector on that (or at the 16-pin end) which I believe is faulty, not my PSU cable. 

I've preordered the Corsair 600W cable hoping that'll fix it.


----------



## Nd4spdvn

bezerKa said:


> Are you able to tell me which bios version you're currently on? My card shipped with 95.02.18.80.68 which is newer than all the ones currently listed on tech power up.


Mine is 95.02.18.80.69, Gaming BIOS, never bothered to update the Silent/Quiet one. All the dets in the pic:


----------



## motivman

doubledoubt said:


> How much did you have on the core and mem for this? Also what was your max temp?


Core max was exactly 3000MHz; anything more and it crashes. Memory was at +1200 in Afterburner. Max temp was around 72C.


----------



## bezerKa

Nd4spdvn said:


> Mine is 95.02.18.80.69, Gaming BIOS, never bothered to update the Silent/Quiet one. All the dets in the pic:
> View attachment 2580620


Sorry, yea, I quoted the silent bios by mistake.

Ahh ok, thanks, same as me then, so mine is definitely not locked in any way by the bios; that pretty much confirms my 12+4 adapter is not allowing power from the 4th connector I tested.

It's a shame I can't pick up a spare adapter from anywhere to verify this, and I have to wait 2+ weeks for this 600W Corsair cable.

So you're locked at max 520W (108% on the power slider?)


----------



## bezerKa

Nd4spdvn said:


> Mine is 95.02.18.80.69, Gaming BIOS, never bothered to update the Silent/Quiet one. All the dets in the pic:
> View attachment 2580620


Weird, silent is +16%. But you can see under the General tab it's showing max 100% and +0%.


----------



## Nd4spdvn

bezerKa said:


> So you're locked at max 520w (108% on the power slider?)


Yup, indeed I am.


----------



## Nd4spdvn

bezerKa said:


> Weird silent is +16%.


This part is not weird because the base (default) power is 450W on the Silent BIOS and with 16% it goes to 520W which is the max both Silent and Gaming bios provide. The Gaming bios starts from 480W as base (default) power and it needs just 8% to reach 520W.


bezerKa said:


> But you can see under general tab it's showing max 100% and +0%


This is the weird part indeed, as you are locked at 100% maximum per your pic, which means you cannot access the extra 16% available; that points toward a faulty adapter, PCIe power cable, or PCIe power port on your PSU.
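The base-power math above can be sketched as a quick check (a toy Python sketch using the wattages quoted in this thread; the exact rounding the driver uses is an assumption):

```python
# Toy check of the power-slider math quoted above. Base wattages are the
# values reported in this thread; the driver's exact rounding is an assumption.
def slider_pct_for(target_w: float, base_w: float) -> int:
    """Extra slider percentage (above 100%) needed to reach target_w."""
    return round((target_w / base_w - 1.0) * 100)

SILENT_BASE_W = 450   # Silent BIOS default power limit
GAMING_BASE_W = 480   # Gaming BIOS default power limit
MAX_W = 520           # cap both BIOSes share

print(slider_pct_for(MAX_W, SILENT_BASE_W))  # -> 16
print(slider_pct_for(MAX_W, GAMING_BASE_W))  # -> 8
```

So a slider stuck at 100% on either BIOS means the card never leaves its base power, regardless of which BIOS is active.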


----------



## bezerKa

Nd4spdvn said:


> This part is not weird because the base (default) power is 450W on the Silent BIOS and with 16% it goes to 520W which is the max both Silent and Gaming bios provide. The Gaming bios starts from 480W as base (default) power and it needs just 8% to reach 520W.


Of course makes perfect sense ty.



Nd4spdvn said:


> This is the weird part indeed as you are locked at 100% maximum as per your pic which means you cannot access the extra 16% available which points indeed toward a faulty adapter, PCIE power cable or PCIE power port on your PSU.


Thanks for the info and confirmation on the power limits. Hopefully there'll be an update to fix the 'maxing out' issue.


----------



## dante`afk

Has anyone here got their EKWB block and changed the stock thermal pads they ship? EK's pads are only 3.5 W/mK.

I'm thinking of changing them, but even if I take the same mm size, if the pads have a different hardness the block might not sit right.


----------



## motivman

dante`afk said:


> Anyone got here their EKWB block and changed the stock thermal pads they ship? EKs pads are only 3.5 W/mK
> 
> I’m thinking of changing them but even if I take the same mm size, if the pads have a different hardness the block might not sit right


I NEVER used EK thermal pads or the included thermal paste. I've had great luck using Kingpin KPE for my paste and Thermalright pads. I tried Gelid pads, but they do not compress like Thermalright pads, and my GPU did not make good contact; once I switched to Thermalright pads, temps were great. This was on my 3090 Strix card before I sold it.


----------



## J7SC



TheNaitsyrk said:


> That's a pretty nice overclock though.
> 
> My one can do 3160 but on air, maybe WB will change it to 3200+.


Per the original post on this, those clocks were still on air, and also only under a light 3D load...looking forward to what it can do once on water, and also to getting rid of the high hotspot temp.



bmagnien said:


> Yup I was going back and forth with them all night via email and AliExpress text. They finally got their shipment dispatched from the bykski factory, and should’ve fulfilled all orders. Mines shipped with an ETA of the 9th.
> 
> @J7SC did you get your notification?


I presume you are referring to Formulamod re. ETA of the 9th? ..and no, I haven't heard yet (time zone difference > they are closed right now), but I will follow up later this evening as I was told that I am in #3 spot in their order line-up.


----------



## Gking62

dante`afk said:


> Anyone got here their EKWB block and changed the stock thermal pads they ship? EKs pads are only 3.5 W/mK
> 
> I’m thinking of changing them but even if I take the same mm size, if the pads have a different hardness the block might not sit right


I used Thermalright 12.8 W/mK on mem and Thermal Grizzly Kryonaut Extreme (14.2 W/mK) on the core and had phenomenal temps on my prior 3080 Ti/Vector block, but recently found these:

_Thermalright Thermal Pad 14.8 W/mK Non Conductive Heat Dissipation Silicone Pad for PC Laptop Heatsink CPU/GPU/LED Graphics Card Motherboard Silicone Grease Multi-Size Pad (120x120x1.0mm) https://a.co/d/iuQv7zZ_

my EK block is still processing; ordered on Oct. 19th, approx. 11/18 delivery


----------



## WayWayUp

Looks like we aren't getting a 4090 Ti for a while.
The AMD reveal was very disappointing from a performance perspective....pretty much worst-case scenario across the board based on the initial rumors.
Pricing is phenomenal, but this will not lead Nvidia to launch an even more expensive card.

I think Nvidia was blindsided by fake AMD rumors. They thought AMD would compete on performance, but they are competing on price.

That explains the 4080 12GB "unlaunching" and the rumors of the canceled 4090 Ti. They figured out what AMD actually has and adjusted accordingly.


----------



## mirkendargen

Gigabyte block is here. Initial observation is the cold plate is very small compared to my Strix 3090 block, because the power components are a lot denser.


----------



## KingEngineRevUp

msky73 said:


> Inno3d X3 + Alphacool WB. The PCB (reference design) is small and so is the block. The block is exactly the same as what the Inno3d Frostbite version will be. Clock runs 2820MHz at factory curve (was 2740MHz with air cooler).
> 
> Temps reaching 50C when load is maxed. Normally around 45, below 40 during lighter loads, 30 idle. Coolant delta above 10C during high load. 360+360+120 slim rads cooling 5950X and 4090 in line. Fans set to quietness rather than efficiency. IMO water cooling is not a game changer for 4090 series, but I prefer this way.


I just scored a 4090 FE and I'm very interested in hearing more of your thoughts on putting a waterblock on it. It sounds like it's not worth it?


----------



## dr/owned

dante`afk said:


> Anyone got here their EKWB block and changed the stock thermal pads they ship? EKs pads are only 3.5 W/mK
> 
> I’m thinking of changing them but even if I take the same mm size, if the pads have a different hardness the block might not sit right


I always use my own pads. I've been using Aairhut brand off Amazon for several repads and they seem near-as-makes-no-difference the same as Thermalright Odyssey pads at half the cost. Get a pair of calipers and it's pretty easy to know the pad thickness that you need by measuring the standoff height and subtracting component height and block-plateau heights. Components like the VRAM are 1mm tall, the VRMs are 0.75mm tall...everything else doesn't matter much to put a pad on it. A lot of AIBs go nuts putting pads on inductors and capacitors.

Or just caliper the existing pads lengthwise (so you don't compress them with the calipers) and they're usually some increment of 0.5mm. AIB stock pads can be weird though where they'll use putty-like 2.2mm pads or some sh* that they want to compress down to a 1.5mm void.
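The measure-and-subtract method above can be written out as a tiny calculator. This is just an illustrative sketch; the function name and every height below are hypothetical examples, so measure your own card with calipers rather than trusting these numbers:

```python
import math

# Sketch of the pad-thickness estimate described above: standoff height minus
# component height minus block-plateau height, rounded up to the 0.5mm
# increments pads are commonly sold in. All heights here are made-up examples.
def pad_thickness_mm(standoff_mm: float, component_mm: float, plateau_mm: float) -> float:
    """Gap the pad must fill, rounded up to the next 0.5mm pad size."""
    gap = standoff_mm - component_mm - plateau_mm
    return math.ceil(gap / 0.5) * 0.5

# e.g. a 3.0mm standoff over a 1.0mm-tall VRAM chip with a 0.5mm cold-plate plateau
print(pad_thickness_mm(3.0, 1.0, 0.5))  # -> 1.5
```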


----------



## msky73

KingEngineRevUp said:


> I just scored a 4090 FE and I'm very interested in hearing more of your thoughts on putting a waterblock on it. It sounds like it's not worth it?


Objectively, 4090 coolers are very efficient and the gap between air and LC is smaller. But it was definitely worth it for me: higher clocks without OC, being able to run fans at a low 800 RPM in many games, easy access to the 12VHPWR socket ;-). I was 100% set on LC and did not even run my card on air.


----------



## mirkendargen

dr/owned said:


> I always use my own pads. I've been using Aairhut brand off Amazon for several repads and they seem near-as-makes-no-difference the same as Thermalright Odyssey pads at half the cost. Get a pair of calipers and it's pretty easy to know the pad thickness that you need by measuring the standoff height and subtracting component height and block-plateau heights. Components like the VRAM are 1mm tall, the VRMs are 0.75mm tall...everything else doesn't matter much to put a pad on it. A lot of AIBs go nuts putting pads on inductors and capacitors.
> 
> Or just caliper the existing pads lengthwise (so you don't compress them with the calipers) and they're usually some increment of 0.5mm. AIB stock pads can be weird though where they'll use putty-like 2.2mm pads or some sh* that they want to compress down to a 1.5mm void.


I'm trying out the new Arctic pads that are supposed to be super compressible this time around; we'll see how they do. The ones Bykski included seem to have changed: they were grey before and this time they're blue. The grey ones were horrible for leaching oil everywhere the one time I used them; maybe the blue ones are better, IDK cause I'm not using them. I'm not really worried about pad performance this time around without any rear VRAM to worry about, I just want something that won't make a mess and won't needlessly raise the block off the GPU die.


----------



## Mad Pistol

WayWayUp said:


> looks like we arnt getting a 4090ti for a while
> The AMD reveal was very disappointing from a performance perspective....pretty much worst case scenario across the board based on the initial rumors
> pricing is phenomenal but this will not lead nvida to launch an even more expensive card
> 
> I think Nvidia was blindsided by fake AMD rumors. They thought AMD would compete on performance but they are competing on price
> 
> That explains the 4080 12gb "unlaunching" and rumors of the canceled 4090ti. They figured out what AMD actually has and adjusted accordingly


It will be interesting to see if the 4080 16GB price gets an adjustment. I'd be very surprised if the 7900 XTX isn't at least 10-20% faster than the 4080 16GB, and if so, the 4080 16GB will not sell at $1200.


----------



## PLATOON TEKK

Been stalking this thread, still traveling. Def good info, am hyped to get back and get my hands dirty. Will most likely be Strix or Suprim. Also there is a 1000W bios (confirmed), 95.02.18.80.46, floating around apparently. 

I'll post here if I convince someone to send it to me in the meantime. Will be interesting to see if cards with just one 600W connector (only a theoretical max) can perform properly with it. Although with all the burning going on, I don't know how much I'd recommend it for now.


----------



## WayWayUp

Mad Pistol said:


> It will be interesting to see if the 4080 16GB price gets an adjustment. I'd be very surprised if the 7900 XTX isn't at least 10-20% faster than the 4080 16GB, and if so, the 4080 16GB will not sell at $1200.


I think both AMD and Nvidia knew that the 4080 wouldn't be price competitive.

Nvidia has a stock problem, and a cheap 4080 would send the remaining 3xxx stock to an early grave.
So AMD clearly cut some things and operated around being cheaper on perf/watt, since they figured Nvidia had to go the performance route and charge a premium.
They probably thought Nvidia would go a lot further with the 4080, but Nvidia played it smartly.

Both the AMD cards and the 4080 will sell out for the first few months regardless. 

This buys Nvidia time to sell more 3xxx stock, and early next year they can cut the 4080 to $899 (probably where they wanted the price to be from the get-go) and slide in a 4080 Ti @ $1200.


----------



## bmagnien

mirkendargen said:


> I'm trying out the new Arctic pads that are supposed to be super compressible this time around, we'll see how they do. The ones Bykski included seem to have changed, they were grey before and this time they're blue. The grey ones were horrible for leeching oil everywhere the one time I used them, maybe the blue ones are better, IDK cause I'm not using them. I'm not really worried about pad performance this time around without any rear VRAM to worry about, I just want something that won't make a mess and won't needlessly raise the block off the GPU die.


What thickness are the included pads? The specs on the website listed them as 1.8mm? I only have .5 and 1mm of those new Arctic pads. Let me know what you end up doing - sounds like we have all the same materials.


----------



## mirkendargen

bmagnien said:


> What thickness are the included pads? The specs on the website listed them as 1.8mm? I only have .5 and 1mm of those new Arctic pads. Let me know what you end up doing - sounds like we have all the same materials.


I don't have calipers, but I have the Arctic pads in 1.5mm. Comparing them to the included Bykski ones, I'd say the Bykski ones are 1.8-2.0mm. Thicker, but not drastically so.


----------



## J7SC

mirkendargen said:


> I'm trying out the new Arctic pads that are supposed to be super compressible this time around, we'll see how they do. The ones Bykski included seem to have changed, they were grey before and this time they're blue. The grey ones were horrible for leeching oil everywhere the one time I used them, maybe the blue ones are better, IDK cause I'm not using them. I'm not really worried about pad performance this time around without any rear VRAM to worry about, I just want something that won't make a mess and won't needlessly raise the block off the GPU die.


I've used thermal putty (tg-10) on the 3090 Strix and another high-end card last time around - with excellent / steady results even 18 months later. I am leaning towards thermal putty again for the 4090.

@bmagnien - just to confirm, you were in contact with Formulamod and they suggested ETA of Nov. 9th ?


----------



## bmagnien

mirkendargen said:


> I don't have calipers, but I have the Arctic pads in 1.5mm. Comparing them to the included Bykski ones, I'd say the Bykski ones are 1.8-2.0mm. Thicker, but not drastically so.


Cool. Let me know what you end up using and how the contact is. If you get good contact with the arctic 1.5mm I might order some before the block arrives next week


----------



## yzonker

The Bykski pads are ~1.8mm. I measured them the other day with my calipers. I suspect 2mm Gelid Extreme's would work ok since they are very soft. I'm planning to use them on the VRM's and probably the TG-10 thermal putty @J7SC just mentioned for the VRAM. I also have it on both my 3080ti and 3090. It's worked well for many months also for me. My 3090 is getting close to a year.

I'm mainly leaning toward the thermal putty because it takes no contact force from the core and is always the correct thickness. 

Like I keep mentioning, you really don't need the absolute best pads for the VRAM though. It doesn't respond to lower temps anyway like the 1GB chips. Cheaper pads, like 3.5w/mk, would be fine as long as they're soft.


----------



## ArcticZero

J7SC said:


> I've used thermal putty (tg-10) on the 3090 Strix and another high-end card last time around - with excellent / steady results even 18 months later. I am leaning towards thermal putty again for the 4090.
> 
> @bmagnien - just to confirm, you were in contact with Formulamod and they suggested ETA of Nov. 9th ?


I'm actually about to order a batch while they still exist. Might I ask how much you ended up using for one application on your 3090? I'm getting 1-2 50g pots and want to know how much is generally enough. At least we don't have rear memory chips to worry about on the 4090.


----------



## LunaP

mirkendargen said:


> Gigabyte block is here. Initial observation is the cold plate is very small compared to my Strix 3090 block, because the power components are a lot denser.
> 
> View attachment 2580632


Is this the Bykski block? OC Gaming? 


yzonker said:


> The Bykski pads are ~1.8mm. I measured them the other day with my calipers. I suspect 2mm Gelid Extreme's would work ok since they are very soft. I'm planning to use them on the VRM's and probably the TG-10 thermal putty @J7SC just mentioned for the VRAM. I also have it on both my 3080ti and 3090. It's worked well for many months also for me. My 3090 is getting close to a year.
> 
> I'm mainly leaning toward the thermal putty because it takes no contact force from the core and is always the correct thickness.
> 
> Like I keep mentioning, you really don't need the absolute best pads for the VRAM though. It doesn't respond to lower temps anyway like the 1GB chips. Cheaper pads, like 3.5w/mk, would be fine as long as they're soft.


Please do share results on whether the 2.0mm pads fit; I'm curious, since 1.8mm pads are non-existent except for Primochill, which apparently are just the pads Bykski uses. Mine arrives next week.


----------



## KingEngineRevUp

msky73 said:


> Objectively 4090 coolers are very efficient and the room between air and LC is smaller. But it was worth for me definitely. Higher clocks without OC, being able to run fans at low 800 RPM in many games, easy access to 12VHPWR socket ;-). I was 100% positive into LC and did not even run my card on air.


I do like the idea that it won't screw up my setup; also it'll take less space, and the Alphacool comes with a new PCIe bracket, which is great. My current vertical bracket can't support 3 slots. 

Your Alphacool came with a new PCIe plate, right?


----------



## bmagnien

Ok what the hell is this stuff:


https://www.digikey.com/en/products/detail/chip-quik-inc/TC4-20G/15195126



79 W/mK??

also, this is the stuff to get I guess?

TG-PP10-50 Thermal Silicone Putty 50 gram Container, t-Global Technology: www.digikey.com




Only gonna exist until they sell out of current stock

@Blameless looks like you have experience with this stuff right? How many grams for a full blocks worth of mem and vrms?


----------



## J7SC

ArcticZero said:


> I'm actually about to order a batch while they still exist. Might I ask how much do you end up using for one application on your 3090? I'm getting 1-2 50g pots and want to know how much is generally enough. At least we don't have rear memory chips to worry about on the 4090.


...I used about 3/4 of one of the 50g containers - but that included not only the backside VRAM on the Strix but also the backside of the die itself to connect to an extra heatsink...I found the best method is to use isopropanol to clean your hands with, then roll the thermal putty into little balls. Thermal putty is very forgiving (unless you use way, way, way too much) re. conforming into the available space w/o lifting the block. The only minor drawback is that the (non-conductive) oils will leach out as they are supposed to after a few heat cycles.


----------



## Blameless

bmagnien said:


> Ok what the hell is this stuff:
> 
> 
> https://www.digikey.com/en/products/detail/chip-quik-inc/TC4-20G/15195126
> 
> 
> 
> 79w/mk ??


Liquid metal, very similar to other liquid metal TIMs.



bmagnien said:


> also, this is the stuff to get I guess?
> 
> TG-PP10-50 Thermal Silicone Putty 50 gram Container, t-Global Technology: www.digikey.com
> 
> Only gonna exist until they sell out of current stock
> 
> @Blameless looks like you have experience with this stuff right? How many grams for a full blocks worth of mem and vrms?


Depends on how stingy you are, but 30g should do it.


----------



## yzonker

Here's a pic of the Bykski pad and 2mm Gelid Extreme. Very close to the same thickness.

Edit: forum is protecting us from thermal pad nudies.  




----------



## J7SC

PLATOON TEKK said:


> Been stalking this thread, still traveling. Def good info, am hype to get back and get my hands dirty. Will most likely be Strix or suprim. Also there is a 1000w bios (confirmed) 95.02.18.80.46 floating around apparently.
> 
> I’ll post here if I convince someone to send it to me in the meantime. Will be interesting to see if cards with just one 600w (only theoretical max) can perform proper with it. Although with all the burning going on don’t know how much I recommend it for now.


1kw - probably Asus XOC or Galax XOC (with KPE not currently in the picture)... ?


----------



## PLATOON TEKK

J7SC said:


> 1kw - probably Asus XOC or Galax XOC (with KPE not currently in the picture)... ?


Yup, Galax. Apparently there's also a full-blown XOC/LN2 bios (I'm assuming 1200W), but that's not confirmed (or finalized) to my knowledge.


----------



## motivman

zotac 4090 in stock right now at zotacstore.com


----------



## Panchovix

Tried to RMA my ASUS TUF 4090 for coil whine and nope, thanks Chile for bottom-of-the-barrel warranties lol




motivman said:


> zotac 4090 in stock right now at zotacstore.com


It seems Zotac is giving away cables?

https://www.reddit.com/r/nvidia/comments/ylicbh


----------



## dante`afk

Gking62 said:


> I used Thermalright 12.8 w/mk on mem, Kryonaut Extreme Grizzly (14.2 w/mk) on the core and had phenomenal temps on my prior 3080 Ti/Vector block but recently found these:
> 
> _Thermalright Thermal Pad 14.8 W/mK Non Conductive Heat Dissipation Silicone Pad for PC Laptop Heatsink CPU/GPU/LED Graphics Card Motherboard Silicone Grease Multi-Size Pad (120x120x1.0mm) https://a.co/d/iuQv7zZ_
> 
> my EK block is still processing, ordered on Oct. 19th, approx 11/18 delivery


it appears gelid extreme are the ones to order for EK replacement pads


https://www.reddit.com/r/watercooling/comments/rlkavy


----------



## Gking62

dante`afk said:


> it appears gelid extreme are the ones to order for EK replacement pads
> 
> 
> https://www.reddit.com/r/watercooling/comments/rlkavy


yeah they're good as well, both about equal


----------



## Gking62

EK has manuals up now for the Strix/TUF blocks:

EK-Quantum Vector² Strix/TUF RTX 4090 D-RGB - Nickel + Plexi (water block for the ROG Strix and ASUS TUF 4090): www.ekwb.com

EK-Quantum Vector² Strix/TUF RTX 4090 D-RGB ABP Set - Nickel + Plexi (water block and active backplate set): www.ekwb.com


----------



## a_Criminai

KingEngineRevUp said:


> How has your zotac trinity been treating you?


Pretty good. It definitely isn't the best 4090 but my chip clocks good and the default fan curve is quiet and decently effective. I have the least amount of coil whine I've ever heard on a GPU but I might have just gotten lucky.


----------



## J7SC

mirkendargen said:


> I don't have calipers, but I have the Arctic pads in 1.5mm. Comparing them to the included Bykski ones, I'd say the Bykski ones are 1.8-2.0mm. Thicker, but not drastically so.


Helpful info on the pad thickness...just learned that my Bykski block for the Gigabyte G-OC is about to ship. For VRAM, I'll definitely use TG 10 but I might use thermal pads for the VRM - I still have a variety of sizes (0.5mm to 2.0 mm) of Thermalright Odyssey from earlier projects.


----------



## mirkendargen

Got the block on; 1.5mm Arctic TP-3 pads definitely compressed fine, and no Dremeling was needed for interference issues like I had to do on my V1 Strix 3090 block. Gigabyte stock pads look to be 1mm all around and feel just like Thermalright Odyssey. They went a bit bananas at the factory and put pads on literally everything, so beware that it took an uncomfortable amount of force to get the heatsink off initially; I went back and looked three times to make sure I hadn't missed a screw somewhere. I'm still working on getting my damn loop purged so I can't give performance info yet (I'm trying Honeywell PTM7950 on the core and am super curious how that turns out), but here are some pics of the process and end result:


----------



## Panchovix

WayWayUp said:


> looks like we arnt getting a 4090ti for a while
> The AMD reveal was very disappointing from a performance perspective....pretty much worst case scenario across the board based on the initial rumors
> pricing is phenomenal but this will not lead nvida to launch an even more expensive card
> 
> I think Nvidia was blindsided by fake AMD rumors. They thought AMD would compete on performance but they are competing on price
> 
> That explains the 4080 12gb "unlaunching" and rumors of the canceled 4090ti. They figured out what AMD actually has and adjusted accordingly


I have to say, we need to wait. I mean, I already got a 4090, but maybe the 7900 XTX will be a surprise in rasterization.

Though the RT improvement is not as much as I expected, this is a win for all.

The RTX 4080 16GB literally makes no sense now, and if the 7900 XTX matches the 4090 in rasterization while being 600USD less, it will surely hurt NVIDIA.
I mean, I can't really say AMD is that far behind the 4090, if the claims of the 7900 XTX being 1.5-1.7x faster than the 6950 XT are real.

Maybe the worst case for NVIDIA would be to reduce the price of the 4090 and add a 4090 Ti at the same price point (I mean, we're already sure the 4090 Ti will exist).


----------



## Arizor

It's an interesting situation. AMD's price point certainly challenges the 16GB 4080, but with such weak raytracing (surely a common feature a year from now), it's hard to say where the XTX will land - if you're going to spend 1k on a graphics card, why not an extra 200 for superior RT and DLSS?

Considering the rumours are that NVIDIA have been stockpiling Ti chips for a while now, I wouldn't be surprised by the Ti appearing in the middle of next year.


----------



## Mad Pistol

Arizor said:


> It's an interesting situation. AMD's price point certainly challenges the 16GB 4080, but with such weak raytracing (surely a common feature a year from now), it's hard to say where the XTX will land - if you're going to spend 1k on a graphics card, why not an extra 200 for superior RT and DLSS?
> 
> Considering the rumours are that NVIDIA have been stockpiling Ti chips for a while now, I wouldn't be surprised by the Ti appearing in the middle of next year.


Ti will certainly appear, but Nvidia probably wants to clear out their lower tier stock first. The last thing they need to do is knee-cap themselves even further.


----------



## Arizor

Mad Pistol said:


> Ti will certainly appear, but Nvidia probably wants to clear out their lower tier stock first. The last thing they need to do is knee-cap themselves even further.


Yep, as I say, I think around June 2023 they'll be in a position to release the Ti, having cleared out their 3-series backlog over the preceding 8-9 months, and probably in a good position to counter AMD's release of FSR3 grabbing headlines (AMD say 2023; if it were January they would've said so, so I imagine we'll see FSR3 appearing closer to the middle of the year).


----------



## PhuCCo

Hey guys, I saw some comments about EK blocks for the 4090 and wanted to chime in. My FE block is a nightmare. 

I read the manual online before the block arrived, and it states that you use 1mm pads on the memory/VRM and 1.5mm pads on the chokes. I knew the stock EK pads are trash, so I ordered Gelid Extremes to replace them with. I received the block yesterday and put the Gelids on straight away. I noticed I wasn't getting much contact between the core and the block; the pads needed a lot of compressing. I checked the manual 100 times and verified that I used the "correct" size pads. I went and tested the card anyway because I was tired, and was greeted with a 30C+ delta between GPU core and core hotspot, and a 30C+ delta between core and water temps at around 400W GPU power draw.

I thought maybe the pads were too stiff and not allowing good contact, even though they feel very similar to the stock pads. I took the Gelids off and started measuring the stock pads. I found three pads were actually 0.8mm thick instead of 1mm. I started looking online and found one guy on Reddit who also has the block, and his block included a piece of paper telling him to use 0.8mm and 1mm on the front of the card and to ignore the online manual. I never received this piece of paper with my block, so I had no way of knowing. They never bothered to update the website either lol. Also, the pads are not labelled, so you'll need a micrometer nearby to differentiate the 0.8mm and 1mm pads and not get them confused.

I put the "correct" 0.8mm and 1mm pads on the front and.... 20C+ delta between GPU core and GPU hotspot, and 20C+ delta between core and water temp at 400W.

Can anyone else with the EK 4090 FE block throw their cards together and see if you get similar results? The performance of my block is absolutely pathetic. I have 0.5mm pads on the way and will update if that improves the GPU contact. Otherwise I should just put the stock cooler back on lol


----------



## KingEngineRevUp

Well ****... the Alphacool water block only works for "reference" cards? That means it won't work on the FE. There's only Corsair and EKWB, it seems... The Corsair block comes out cheaper. Not sure which one to get.


----------



## J7SC

Arizor said:


> Yep, as I say, I think around June 2023 they'll be in a position to release the Ti, having cleared out their 3-series backlog over the preceding 8-9 months, and probably in a good position to counter AMD's release of FSR3 grabbing headlines (AMD say 2023, if it was January they would've said, I imagine we'll see FSR3 appearing close towards the middle of the year).


Thinking ahead, are we? 

For gaming, I doubt the 4090 Ti will make much difference compared to a 'regular' 4090. It is probably more important for benching, and maybe some productivity apps.

I am just looking forward to enjoying the 4090 to the fullest (just got the tracking # for the block). While the delta from GPU to hotspot is fine, the overall temps are a bit too high (I suspect a potential seating/TIM issue), and once it's w-cooled I'll address that. Oddly enough, the VRAM temps are really low.

I now have an extra high-end GPU to find a work-place home for, as it got displaced by the 4090. Since I run multiple machines, it's going to be a real game of dominoes re. various CPU-GPU reconfigurations...I *might* add a new mobo, but am not even sure what to get. Socket AM5 will have more longevity, but the 7950X doesn't really pull me away from my current AM4 setup. LGA1700 looks interesting, but getting into that now, when the socket will be replaced reasonably soon, doesn't make much sense from a cost-benefit POV. Now, if AMD releases a 7950X3D, I'll take another look...but really, contentment can be a beautiful thing...


----------



## Arizor

J7SC said:


> Thinking ahead, are we ?
> 
> For gaming, I doubt that the 4090 Ti will make much difference compared to a 'regular' 4090. It is probably more important for benching, and may be some productivity apps.
> 
> I am just looking forward to enjoy the 4090 to the fullest (just got the tracking # for the block). While the delta from GPU to hotpsot is fine, the overall temps are a bit too high (I suspect a potential seating / TIM issue) and once w-cooled, I'll address that. Oddly enough, the VRAM temps are really low.
> 
> I now have an extra high-end GPU I have to find a work-place home for as it got displaced by the 4090. Since I run multiple machines, it's going to be a real game of dominos re. various CPU-GPU reconfigurations...I '''might''' add a new mobo, but am not even sure what to get. Socket AM5 will have more longevity, but the 7950X doesn't really pull me away from my current AM4 setup. LG1700 looks interesting, but to get into that game now as the socket will be replaced reasonably soon doesn't make too much sense from a cost-benefit POV. Now, if AMD releases a 7950X3D, I'll take another look...but really, contentment can be a beautiful thing...


Haha, can't help myself. It will definitely help with some of my game dev and video editing work, but still, we'll always find excuses, won't we?

I'm in the same boat; only really interested when AMD offers a substantial performance upgrade. The X3D of this new gen will probably be it, so I reckon around this time next year I might be tempted.


----------



## J7SC

Arizor said:


> Haha can't help myself. Definitely will help with some of my game dev and video editing work, but still, we'll always find excuses won't we .
> 
> I'm in the same boat, only really interested when AMD offers a substantial performance upgrade, the X3D of this new gen will probably be it, so reckon around this time next year I might be tempted.


Some EU-based sales data seems to suggest that Intel's 13th gen launch went better than AMD's AM5, but concurrent sales of AM4 (especially 5800X3D) still lead the market by a chunk...


----------



## zkareemz

Arizor said:


> Does anyone know the size of the thermal pads for the TUF? Thinking of repasting to reduce hotspot temps.


Please share the findings.


----------



## DokoBG

Just flashed my Gaming Trio with the Suprim X BIOS. Works perfectly!!! Got about 1K extra points in Port Royal from this new BIOS. Hella happy right now.


----------



## vigorito

I wonder how the 4090 will perform over an HDMI 2.1 monitor (full 48 Gbps bandwidth). Did anyone try to test that?


----------



## mirkendargen

vigorito said:


> i wonder how 4090 will perform over 2.1 monitor (full bandwith 48gbs),did anyone try to test that


It works, no different than a 3090. Is there some specific thing you're trying to do?


----------



## vigorito

But what about G-Sync, lag, and tearing? I know we're supposed to use DP with an NV card. Does gaming in this combo work as well as it does over DP?


----------



## Damaged__

PhuCCo said:


> Hey guys, I saw some comments about EK blocks for the 4090 and wanted to chime in. My FE block is a nightmare.
> 
> I read the manual online before the block arrived and it states that you use 1mm pads on the memory/vrm, and 1.5mm pads on the chokes. I knew the stock EK pads are trash so I ordered Gelid Extremes to replace them with. I received the block yesterday and put the Gelids on straight away. I noticed I wasn't getting much contact between the core and the block; lots of compressing. I checked the manual 100 times and verified that I used the "correct" size pads. I went and tested the card anyways because I was tired and was greeted with a 30C+ delta between gpu core and core hotspot. And a 30C+ delta between core and water temps at around 400W gpu power draw.
> 
> I thought maybe the pads were too stiff and not allowing good contact, even though they feel very similar to the stock pads. I took the Gelids off and started measuring the stock pads. I found three pads were actually 0.8mm thick instead of 1mm. I started looking online and found one guy on reddit that also has the block, and his block included this piece of paper telling him to use 0.8mm and 1mm on the front of the card, and to ignore the online manual. I never received this piece of paper with my block and so I had no way of knowing. They never bothered to update the website either lol. Also, the pads are not labelled. So you will need a micrometer nearby to differentiate the 0.8mm and 1mm pads to not get them confused.
> 
> I put the "correct" 0.8mm and 1mm pads on the front and.... 20C+ delta between gpu core and gpu hotspot, and 20C+ delta between core and water temp at 400W.
> 
> Can anyone else with the EK 4090FE block throw their cards together and see if you get similar results? The performance of my block is absolutely pathetic. I have 0.5mm pads on the way and I will update if that improves the gpu contact. Otherwise I should just put the stock cooler back on lol


Here are a few minutes of FurMark with my 4090 FE block, and as you can see the core-hotspot delta is consistent at around 10C even with a near-600W load. I am unfortunately using the stock pads, as I fear there aren't really a whole lot of alternatives right now, and I'd be more keen to experiment once we have more data on the matter. The memory does get a bit hot when loading super memory-intensive games while also pulling a reasonable amount of power to the core (Quake RTX, for example), but there's not much that can be done about that at the moment.

My block did include the piece of paper basically telling you to ignore the online manual, and I had a hell of a time determining which pads I should use, but all in all the results seem decent.


----------



## sneida

Anyone else facing the problem where the whole computer freezes when watching YouTube videos (Chrome)? Playing games for hours works perfectly fine...

Z690/12700K, TUF 4090, 526.61, 22H2 (22621.755)


----------



## vigorito

Disable HW acceleration in Chrome and go back to 522.25; the new driver is a mess.


----------



## KingEngineRevUp

PhuCCo said:


> I knew the stock EK pads are trash


My EKWB pads on my 3080 Ti worked great. I would go back and give them a try rather than assuming they're trash.

Pad upgrades on water blocks seem unnecessary.

Btw, I didn't know EKWB blocks were shipping already for the 4090s. It just says pre-order for me. How did you get one so early?


----------



## KingEngineRevUp

Damaged__ said:


> Here is a few minutes of furmark with my 4090 FE block and as you can see the Core-Hotspot delta is consistent at around 10c even with near 600w load. I am unfortunately using the stock pads as I fear there aren't really a whole lot of alternatives right now and I'd be more keen to experiment once we have more data on the matter. The memory does get a bit hot when loading super memory intensive games while also pulling a reasonable amount of power to the core (Quake RTX for example) but there's not much that can be done about that at the moment.
> 
> My block did include the piece of paper basically telling you to ignore the online manual and I had a hell of a time determining which pads I should use, but all in all the results seem decent.
> View attachment 2580759


54C memory is hot?


----------



## msky73

KingEngineRevUp said:


> Your AC came with a new PCI-E plate right?


----------



## KingEngineRevUp

Damaged__ said:


> Here is a few minutes of furmark with my 4090 FE block and as you can see the Core-Hotspot delta is consistent at around 10c even with near 600w load. I am unfortunately using the stock pads as I fear there aren't really a whole lot of alternatives right now and I'd be more keen to experiment once we have more data on the matter. The memory does get a bit hot when loading super memory intensive games while also pulling a reasonable amount of power to the core (Quake RTX for example) but there's not much that can be done about that at the moment.
> 
> My block did include the piece of paper basically telling you to ignore the online manual and I had a hell of a time determining which pads I should use, but all in all the results seem decent.
> View attachment 2580759


54C memory is hot?



msky73 said:


> View attachment 2580763
> View attachment 2580763


I realized that block won't work on the FE. I'll probably have to go with EKWB again.


----------



## 8472

sneida said:


> anyone else facing the problem that whole computer freezes when watching youtube videos (chrome)? playing games for hours works perfectly fine...
> 
> z690/12700k, tuf 4090, 526.61, 22h2 (22621.755)


Yes, I had that happen with the studio driver. I'm going to roll back to the first 4090 driver.


----------



## 8472

__ https://twitter.com/i/web/status/1588462665676083200


----------



## long2905

8472 said:


> __ https://twitter.com/i/web/status/1588462665676083200


That's very concerning.


----------



## msky73

KingEngineRevUp said:


> 54C memory is hot?
> 
> 
> 
> I realized that block won't work on the FE. I'll probably have to go with EKWB again.


AC will have FE version soon.


https://www.alphacool.com/download/compatibility%20list%20Nvidia.pdf


----------



## mattskiiau

8472 said:


> __ https://twitter.com/i/web/status/1588462665676083200


 Oh gosh.. I thought my anxiety would relax when I replaced the adaptor..
😢


----------



## Arizor

mattskiiau said:


> Oh gosh.. I thought my anxiety would relax when I replaced the adaptor..
> 😢


At this point I'm pondering whether it is as JonnyGuru (PSU veteran and Corsair techie) posted on Reddit after weeks of testing: it is in fact user error, not pushing the cable in fully.

There's less margin for error in such a small connector. Certainly some blame needs to be laid at the designer's feet here; better to ensure some kind of clear feedback that the cable is plugged in than to leave it ambiguous.

Lots of reports of no "click" sound from those who've had melting. I can attest none of mine "clicked", which is why I've made trebly sure the connector is absolutely all the way in before booting up, but I'm sure a lot of people aren't as careful, nor really should they have to be.


----------



## MrTOOSHORT

Cablemod cable shipped. ETA Nov. 10th from Fedex tracking. Ordered Oct. 21st.


----------



## Gking62

MrTOOSHORT said:


> Cablemod cable shipped. ETA Nov. 10th from Fedex tracking. Ordered Oct. 21st.


same, ordered 10/19, shipped 11/1, delivery for 11/7


----------



## yzonker

Arizor said:


> At this point I'm pondering whether it is as JohnnyGuru (PSU veteran and Corsair techie) posted on Reddit, after weeks of testing - it is in fact user error, not pushing in the cable fully.
> 
> Less margin of error allowed in such a small connector. Certainly there needs to be some blame laid at the designer's feet here, better to ensure some kind of clear feedback that the cable is plugged in, rather than leave it ambiguous.
> 
> Lots of reports of no "click" sound for those who've had melting. I can attest none of mine "clicked", which is why I've made trebly sure the connector is absolutely all the way in before booting up, but I'm sure a lot of people aren't as careful, nor really should they have to be.


Like I mentioned previously, it doesn't have enough margin to absorb defects, degradation from plugging/unplugging, etc...

So if everything is in good shape it works fine, but probably all it takes is poor contact on one +12v pin to potentially overheat.
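To put rough numbers on that, here's a back-of-envelope sketch (my own arithmetic, not from any spec), assuming the load splits evenly across whichever +12V pins still make good contact; real sharing follows contact resistance, so a bad connector is worse than this:

```python
# Per-pin current on a 12VHPWR connector as +12V pins lose contact.
# Assumption: load splits evenly across the remaining pins; in reality
# the pin with the lowest contact resistance takes more than its share.

def amps_per_pin(watts: float, volts: float = 12.0, good_pins: int = 6) -> float:
    """Current through each of `good_pins` conductors carrying `watts` at `volts`."""
    return watts / volts / good_pins

for pins in (6, 5, 4):
    print(f"{pins} good pins -> {amps_per_pin(600, good_pins=pins):.2f} A/pin")
```

At 600W, losing even one of the six +12V pins pushes the rest from ~8.3A to 10A each, which is why a single bad crimp or half-seated connector can cook the whole housing.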


----------



## PLATOON TEKK

ASUS xoc 1000w is out there too. 95.02.18.80.83


----------



## bsch3r

PLATOON TEKK said:


> ASUS xoc 1000w is out there too. 95.02.18.80.83


Where can it be downloaded?


----------



## PLATOON TEKK

bsch3r said:


> Where can it be downloaded?


I’m waiting on emails for both the Galax and this ASUS 1kW. Let’s hope my links come through.

The ASUS is being sent out to people today and tomorrow, so the chances it will leak are higher than for the Galax right now (to my knowledge).


----------



## Blameless

long2905 said:


> thats very concerning


One example isn't a trend.

I mean, I'm sure there has been more than one, but there definitely hasn't been as many as with the dubiously designed adapters.



yzonker said:


> all it takes is poor contact on one +12v pin to potentially overheat.


Or on one ground.


----------



## ALSTER868

Damaged__ said:


> Here is a few minutes of furmark with my 4090 FE block and as you can see the Core-Hotspot delta is consistent at around 10c even with near 600w load. I am unfortunately using the stock pads as I fear there aren't really a whole lot of alternatives right now and I'd be more keen to experiment once we have more data on the matter. The memory does get a bit hot when loading super memory intensive games while also pulling a reasonable amount of power to the core (Quake RTX for example) but there's not much that can be done about that at the moment.
> 
> My block did include the piece of paper basically telling you to ignore the online manual and I had a hell of a time determining which pads I should use, but all in all the results seem decent.
> View attachment 2580759


And what's your T delta coolant-chip like?


----------



## Laithan




----------



## WayWayUp

Using HDMI 2.1 instead of DisplayPort 1.4, what's the maximum 4K refresh rate without compression?

4K 170fps? Since it's 48 Gbit/s.

I know DP40 can do 142Hz, and HDMI 2.1 still has 20% more bandwidth.
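Rough arithmetic for this (my own approximation, not from the HDMI spec: the 16b/18b FRL coding efficiency and the flat ~7% blanking overhead are assumptions, so treat the outputs as ballpark only):

```python
# Rough max uncompressed refresh rate for a given link bandwidth.
# Assumptions: 16b/18b FRL coding (~88.9% efficient), RGB (3 channels),
# CVT-RB-style blanking approximated as a flat 7% overhead, no DSC.

def max_refresh_hz(link_gbps: float, h: int = 3840, v: int = 2160,
                   bits_per_channel: int = 10,
                   coding_efficiency: float = 16 / 18,
                   blanking_overhead: float = 1.07) -> float:
    usable_bps = link_gbps * 1e9 * coding_efficiency   # payload bits per second
    bits_per_frame = h * v * bits_per_channel * 3 * blanking_overhead
    return usable_bps / bits_per_frame

print(f"4K 10-bit RGB: ~{max_refresh_hz(48):.0f} Hz")
print(f"4K  8-bit RGB: ~{max_refresh_hz(48, bits_per_channel=8):.0f} Hz")
```

By that estimate, ~170 fps at 4K is plausible at 8 bpc but not at 10-bit RGB without DSC; the exact limit depends on the actual blanking timings the display uses.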


----------



## yzonker

Blameless said:


> One example isn't a trend.
> 
> I mean, I'm sure there has been more than one, but there definitely hasn't been as many as with the dubiously designed adapters.
> 
> 
> 
> Or on one ground.


Well actually, according to Buildzoid, the +12V side is much more likely to melt because some of the ground current travels through the slot. The pic shows this latest one melted on the +12V side.


----------



## Blameless

yzonker said:


> Well actually according to Buildzoid, the +12v side is much more likely due some of the ground current traveling through the slot. The pic shows this latest one is melted on the +12v side.


How much is grounded through the slot? I'd expect them to be balanced.


----------



## slopokdave

Damaged__ said:


> Here is a few minutes of furmark with my 4090 FE block and as you can see the Core-Hotspot delta is consistent at around 10c even with near 600w load. I am unfortunately using the stock pads as I fear there aren't really a whole lot of alternatives right now and I'd be more keen to experiment once we have more data on the matter. The memory does get a bit hot when loading super memory intensive games while also pulling a reasonable amount of power to the core (Quake RTX for example) but there's not much that can be done about that at the moment.
> 
> My block did include the piece of paper basically telling you to ignore the online manual and I had a hell of a time determining which pads I should use, but all in all the results seem decent.
> View attachment 2580759


What is your radiator setup?


----------



## yzonker

Blameless said:


> How much is grounded through the slot? I'd expect them to be balanced.


Dunno, have to ask BZ.


----------



## Blameless

yzonker said:


> Dunno, have to ask BZ.


I did find the video where he mentions this: 




Would need to measure specifics, but if there are no diodes or anything else isolating the slot to keep it from grounding any arbitrary amount of current, the reasoning is sound.


----------



## BeZol

BeZol said:


> I am not following the thread any more, so if it has already been, then sorry in advance.
> 
> Inno3D X3 OC RTX 4090
> 3x8pin adapter
> TDP 450W
> 
> It's basically one of the cheapest models you can get.
> 
> With 4x8pin adapter nothing changes.
> 
> BUT
> if you put a Gigabyte Gaming OC BIOS (450W +33% = 600W TDP limit) on the GPU,
> coupled with the 4x8pin adapter I managed to unlock the graphics card.
> 
> Interestingly I had voltage limit at 1.1V at my Port Royal run (28.928 score) with peak TDP of 550W.
> 
> View attachment 2579265


A correction: I just did a quick test with the factory 3x8pin adapter + Gigabyte Gaming OC BIOS on my Inno3D RTX 4090 X3 OC, and it can still pull 550W, just like with the 4x8pin adapter.
Video here:


----------



## dante`afk

PhuCCo said:


> Hey guys, I saw some comments about EK blocks for the 4090 and wanted to chime in. My FE block is a nightmare.
> 
> I read the manual online before the block arrived and it states that you use 1mm pads on the memory/vrm, and 1.5mm pads on the chokes. I knew the stock EK pads are trash so I ordered Gelid Extremes to replace them with. I received the block yesterday and put the Gelids on straight away. I noticed I wasn't getting much contact between the core and the block; lots of compressing. I checked the manual 100 times and verified that I used the "correct" size pads. I went and tested the card anyways because I was tired and was greeted with a 30C+ delta between gpu core and core hotspot. And a 30C+ delta between core and water temps at around 400W gpu power draw.
> 
> I thought maybe the pads were too stiff and not allowing good contact, even though they feel very similar to the stock pads. I took the Gelids off and started measuring the stock pads. I found three pads were actually 0.8mm thick instead of 1mm. I started looking online and found one guy on reddit that also has the block, and his block included this piece of paper telling him to use 0.8mm and 1mm on the front of the card, and to ignore the online manual. I never received this piece of paper with my block and so I had no way of knowing. They never bothered to update the website either lol. Also, the pads are not labelled. So you will need a micrometer nearby to differentiate the 0.8mm and 1mm pads to not get them confused.
> 
> I put the "correct" 0.8mm and 1mm pads on the front and.... 20C+ delta between gpu core and gpu hotspot, and 20C+ delta between core and water temp at 400W.
> 
> Can anyone else with the EK 4090FE block throw their cards together and see if you get similar results? The performance of my block is absolutely pathetic. I have 0.5mm pads on the way and I will update if that improves the gpu contact. Otherwise I should just put the stock cooler back on lol



what the EK page shows:











the paper they include with shipping:


----------



## mirkendargen

vigorito said:


> But what about gsync,lag,tearing i know we must use DP with NV card ,is gaming in this combo works well like over DP?


Gsync has worked over HDMI for a long time now, I do 4k/120hz/10bit/RGB with Gsync over HDMI for my primary display.


----------



## Gking62

dante`afk said:


> what the EK page shows:
> View attachment 2580815
> 
> 
> 
> 
> the paper they give you with on shipping:
> View attachment 2580816


Yeah, that's strange, and I opened a ticket with them on this. Hell, anyone could quite possibly pinch a 1.0mm pad with a caliper, or even just by mounting and tightening the block, which is why I'd still use the 1.0mm thickness; 0.2mm is only about 0.008in.

I used the Thermalright Odyssey 12.8 W/mK stuff on my 3080 Ti FTW3 Ultra EK block and had zero fitment issues; temps were fantastic.


----------



## PhuCCo

dante`afk said:


> what the EK page shows:
> View attachment 2580815
> 
> 
> 
> 
> the paper they give you with on shipping:
> View attachment 2580816


The guy on Reddit sent me the same thing last night. I looked through all of the packaging of my block and I definitely did not receive this instruction sheet. When I opened the box, there was only a piece of foam covering the block on top and then the little orange boxes with the tools and pads. 

Do you have this block? If so, have you tested it?

With the stock pads, I'm now getting an 11C delta between core and hotspot at around 450W load which is pretty good, but my core to water delta is still around 20C. I'm still going to try thinner pads on the memory since they already F'd up the sizing to begin with.
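One way to compare mounts at different power draws is to normalize the core-to-water delta into degrees C per watt. A quick sketch using the deltas reported above (caveat: reported board power includes VRAM/VRM heat that doesn't all flow through the core cold plate, so these are upper bounds):

```python
# Normalize core-to-water delta into C/W so results at different
# power draws can be compared directly.

def core_water_resistance(delta_c: float, watts: float) -> float:
    """Core-to-water thermal resistance in C per watt at a given power draw."""
    return delta_c / watts

print(f"first mount:  {core_water_resistance(30, 400):.4f} C/W")   # 30C @ 400W
print(f"second mount: {core_water_resistance(20, 450):.4f} C/W")   # 20C @ 450W
```

On those numbers, the pad swap cut the resistance from 0.075 to ~0.044 C/W, a real improvement, so a further drop with better core contact wouldn't be surprising.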


----------



## dante`afk

PhuCCo said:


> The guy on Reddit sent me the same thing last night. I looked through all of the packaging of my block and I definitely did not receive this instruction sheet. When I opened the box, there was only a piece of foam covering the block on top and then the little orange boxes with the tools and pads.
> 
> Do you have this block? If so, have you tested it?
> 
> With the stock pads, I'm now getting an 11C delta between core and hotspot at around 450W load which is pretty good, but my core to water delta is still around 20C. I'm still going to try thinner pads on the memory since they already F'd up the sizing to begin with.


I'll get my block later today.

I have also ordered Gelid Extreme pads; I will use 1.0mm instead of 0.8mm, which should be fine.

But I have the following suspicion: even though the EK pads are rated at only 3.6 W/mK, they are probably made and sized specifically for the block to give the best contact, so good that even higher-rated 12 W/mK pads like Gelid or Thermalright end up with the same effect.


----------



## cletus-cassidy

motivman said:


> try timespy extreme


TSE got me to 562W max. Does this look similar to others with Tuf (other than temps as I'm under water)?

TSE: 18.2k 
PR: 28.3k

GPU-Z: https://gpuz.techpowerup.com/22/11/04/tw7.png


----------



## Gking62

dante`afk said:


> I'll get my block later today.
> 
> I have also ordered gelid extreme pads that I will use 1.0mm instead of 0.8mm, should be fine.
> 
> 
> but I have the following suspicion: even though the EK pads are rated at only 3.6 wkm, the are probably specifically made and adjusted tot he block to give the best contact, so good that even higher rated 12 wkm pads like gelid or thermalright have the same effect.


EK explanation I got moments ago:

"
The first couple of FE 4090 blocks need to use special/thinner thermal pads (0.8mm ones) to fit the card properly. When the first blocks were done, we got informed that the contact was not 100% perfect and that we need to make the cold plate a bit thinner to be able to use the 1mm therm pads. After the cold plate adjustment, we were able to add the 1mm thermal pads to the packages. We needed to add the 0.8mm thermal pads to the first blocks and add the corrected printed manual so the customers would get 100% alignment.

The Strix/TUF version did not have to go through that process so you are safe."


----------



## J7SC

Arizor said:


> At this point I'm pondering whether it is as JohnnyGuru (PSU veteran and Corsair techie) posted on Reddit, after weeks of testing - it is in fact user error, not pushing in the cable fully.
> 
> Less margin of error allowed in such a small connector. Certainly there needs to be some blame laid at the designer's feet here, better to ensure some kind of clear feedback that the cable is plugged in, rather than leave it ambiguous.
> 
> Lots of reports of no "click" sound for those who've had melting. I can attest none of mine "clicked", which is why I've made trebly sure the connector is absolutely all the way in before booting up, but I'm sure a lot of people aren't as careful, nor really should they have to be.


Yes, 'seating' the connector correctly is half the battle in the first place! Also, continued 'checking' with unplugging / reinserting / wiggling is likely to make it worse, given that the exposed metal sleeves for the pins on the card side of the 4-into-1 dongle have a design that seems to allow for more loosening when the connector is bent. I already posted the pic in the spoiler before, but I actually managed to melt a bit of a regular 8-pin connector when I had folded the cables over and out of the way for 'cosmetic' reasons.


Spoiler














Anyway, both the waterblock and the 12VHPWR single-strand cable from Cablemod are now in transit and scheduled to arrive here on the 10th...taking the Giga-G-OC out to install the waterblock will be the first time I unplug the 4-into-1 dongle connector from its original mounting.

@PLATOON TEKK - nice, ahem, dichotomy re. new 1,000 W XOC and existing 600 W connector melt posts here today


----------



## rahkmae

strix quick test, just put in


----------



## Gking62

rahkmae said:


> View attachment 2580827
> 
> 
> strix quick test, just put in


Not bad; most PR scores are around 27 to 28K, I believe. PC specs? I'll be doing this with my Strix when I get my CM cable on Monday.


----------



## PhuCCo

dante`afk said:


> I'll get my block later today.
> 
> I have also ordered gelid extreme pads that I will use 1.0mm instead of 0.8mm, should be fine.
> 
> 
> but I have the following suspicion: even though the EK pads are rated at only 3.6 wkm, the are probably specifically made and adjusted tot he block to give the best contact, so good that even higher rated 12 wkm pads like gelid or thermalright have the same effect.


Another thing to look for when you get your block. 
They absolutely cranked down the backplate to the block and warped my backplate lol. Look for that.
It's relatively straight once mounted, but some of the pads on the backside don't even contact anything on the PCB. The 2mm pad on the backplate that sits behind the GPU core has yet to make contact with the PCB, and the memory pads on the backplate barely touch the PCB.

Overall I'm extremely unhappy with my block. I wonder if I can get a replacement or refund. Too many red flags. 

I look forward to some temps when you get your block mounted. 

If my memory temps skyrocket and core/hotspot temps stay the same with 0.5mm pads, then that confirms that my block simply sucks at transferring heat from the core. At that point I'll know I'm bottomed out.


----------



## dante`afk

Gking62 said:


> EK explanation I got moments ago:
> 
> "
> The first couple of FE 4090 blocks need to use special/thinner thermal pads (0.8mm ones) to fit the card properly. When the first blocks were done, we got informed that the contact was not 100% perfect and that we need to make the cold plate a bit thinner to be able to use the 1mm therm pads. After the cold plate adjustment, we were able to add the 1mm thermal pads to the packages. We needed to add the 0.8mm thermal pads to the first blocks and add the corrected printed manual so the customers would get 100% alignment.
> 
> The Strix/TUF version did not have to go through that process so you are safe."


So how do you know which block you got?


----------



## mirkendargen

Gking62 said:


> EK explanation I got moments ago:
> 
> "
> The first couple of FE 4090 blocks need to use special/thinner thermal pads (0.8mm ones) to fit the card properly. When the first blocks were done, we got informed that the contact was not 100% perfect and that we need to make the cold plate a bit thinner to be able to use the 1mm therm pads. After the cold plate adjustment, we were able to add the 1mm thermal pads to the packages. We needed to add the 0.8mm thermal pads to the first blocks and add the corrected printed manual so the customers would get 100% alignment.
> 
> The Strix/TUF version did not have to go through that process so you are safe."


Wow that's some BS. "We made a defective product, and instead of throwing it in the trash, we're just going to sell them to people anyway with a workaround".


----------



## PhuCCo

Gking62 said:


> EK explanation I got moments ago:
> 
> "
> The first couple of FE 4090 blocks need to use special/thinner thermal pads (0.8mm ones) to fit the card properly. When the first blocks were done, we got informed that the contact was not 100% perfect and that we need to make the cold plate a bit thinner to be able to use the 1mm therm pads. After the cold plate adjustment, we were able to add the 1mm thermal pads to the packages. We needed to add the 0.8mm thermal pads to the first blocks and add the corrected printed manual so the customers would get 100% alignment.
> 
> The Strix/TUF version did not have to go through that process so you are safe."


See, this type of stuff is absolutely unacceptable to me. I pay hundreds of dollars, and for their machining oversight they try to band-aid it by making me use unconventional pad sizes AND don't even include instructions in my box to tell me.


----------



## J7SC

Laithan said:


>


...nice vid - but I'm somewhat mesmerized by the look of that 12VHPWR connector - paranoia time, folks !


----------



## GAN77

dante`afk said:


> so how do you know which block you got


Assemble, see the temperature, disassemble)


----------



## Panchovix

Was doing some morning benchmarks on TSE and damn, I was surprised when I saw the best score for my GPU-CPU combo.










It did 23k on graphics score! Wow, have you guys done 23k on TSE graphics as well? If yes, which clocks were you using?


----------



## Gking62

mirkendargen said:


> Wow that's some BS. "We made a defective product, and instead of throwing it in the trash, we're just going to sell them to people anyway with a workaround".


yeah it's pathetic to me, I may cancel my order...



PhuCCo said:


> See this type of stuff is absolutely unacceptable to me. I pay hundred of dollars for their machining oversight and they try to bandaid it by making me use unconventional pad sizes AND not include any instructions in my box to tell me.


Honestly, this is making me reconsider my EK order, although my 3080 Ti FTW3 Ultra Vector experience was fine. Not sure about my Strix, which I have waaaay too much invested in (eBay nuclear option). I'm strongly looking at Optimus, where I'm on their notify list for the Absolute 4090 block, and given that the Strix OEM cooling is superior, hell, I can wait this out. What do you all think?


----------



## rahkmae

Gking62 said:


> not bad, most PR scores are around 27 to 28K i believe pc specs? I'll be doing this with my Strix when I get my CM cable on Mon.












add mem +1000 not time today more test...


----------



## AKBrian

mirkendargen said:


> Wow that's some BS. "We made a defective product, and instead of throwing it in the trash, we're just going to sell them to people anyway with a workaround".


Worse still, these blocks are going to quickly end up on the secondary used market, and new owners will have no practical way to know if they're using a defective version.


----------



## dr/owned

PhuCCo said:


> See this type of stuff is absolutely unacceptable to me. I pay hundred of dollars for their machining oversight and they try to bandaid it by making me use unconventional pad sizes AND not include any instructions in my box to tell me.


Stuff like this is why I don't run EK. This has been their operating style for 10 years now and their price keeps increasing. My attitude is: I might as well buy the $115 Bykski version and have the same experience.


----------



## PhuCCo

Gking62 said:


> yeah it's pathetic to me, I may cancel my order... honestly, this is making me reconsider my EK order although my 3080 Ti FTW3 Ultra Vector experience was fine, not sure about my Strix which I have waaaay too much invested (eBay nuclear option), I'm strongly looking at Optimus where I'm on their notify list for the Absolute 4090 block and the fact that the Strix OEM cooling is superior, hell I can wait this out, what do you all think?


I have a bunch of EK products. D5 pump top, radiators, CPU block, fittings, etc. All flawless. EK seems to have a giant smear campaign against their GPU blocks on forums and Reddit, and my experience really makes me understand it.

I put in a ticket for an RMA. My block is obviously not made right. I really hope they correct this and either let me get a refund or a replacement. If they deny it for whatever reason and make me eat almost $300 for a block made incorrectly, with band-aid fixes and no instructions for the said band-aid fixes... I will be at a loss for words. That would be it for me with buying EK products. Oh, and a backplate shipped to me cranked down so hard that it will forever be shaped like a banana.

I would wait to see what others who have already received their blocks have to say about their experience. I could just have a lemon. I don't want to blindly say avoid the block.


----------



## Gking62

PhuCCo said:


> I have a bunch of EK products. D5 pump top, radiators, cpu block, fittings, etc. All flawless. EK seems to have a giant smear campaign for their gpu blocks on forums and reddit and my experience really makes me understand it. I put in a ticket for an RMA. My block is obviously not made right. I really hope they correct this and either let me get a refund or a replacement. If they deny it for whatever reason and make me eat almost $300 for a block made incorrectly with bandaid fixes with no instructions for the said bandaid fixes.. I will be at a loss for words. That would be it for me with buying EK products. Oh, and a backplate shipped to me cranked down so hard that it will forever be shaped like a banana. I would wait to see others who have already received their blocks and see what they have to say with their experience. I could just have a lemon. I don't want to blindly say avoid the block.


I'm in the process, as I write this, of cancelling my order for the EK Strix block; going to wait it out for the Optimus Absolute block...

Order now cancelled.









Signature 4090 Strix/TUF GPU Waterblock


IMPORTANT SHIPPING INFORMATION: Orders 11/15/22: Ship November and through December 2022 Orders 11/16/22: Ship January 2023 The highest performance GPU waterblock ever created, exclusively for the NVIDIA 4090 GPU. Compatible with: ASUS Strix 4090 OC Edition ASUS Strix 4090 ASUS TUF 4090 OC...




optimuspc.com


----------



## GAN77

EK wants to be the Mercedes of pricing, but the quality is worse than Bykski's.


----------



## Tideman

Missed delivery of my Moddiy cable this morning. Been rescheduled for Monday. There was someone at home too.. so frustrating.

Credit to Moddiy though for getting it over to me within 5 days of placing my order.


----------



## yzonker

Panchovix said:


> Was doing some morning benchmarks, on TSE and damn, I was surprised when I saw the best score for my GPU-CPU Combo
> 
> View attachment 2580835
> 
> 
> He did 23k on graphics score! Wow, have you guys do 23k on TSE graphics score as well? If yes, which clocks were you using?


That's from causing artifacting with an unstable VRAM OC. Notice that the clocks are the same as the rest of us are getting on air. This is the run I did. 









I scored 18 531 in Time Spy Extreme


Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





There's something weird otherwise too though. A guy over in the Speedway benchmark thread did a 112xx. Swears it wasn't artifacted but it's above even KedarWolf's score. I don't see where the extra pts are coming from. His PR score was about the same as mine. 









3DMark Speed Way


Like I said though, I had run this in my 12900k with DDR5 7000CL32. Same score. 7000 tuned and 4133 are extremely close in perf. If it's not ram, bandwidth and or timings then it's cpu core. If it's not cpu core then it's OS/drivers, if not those then the benchmark needs to be deleted and...




www.overclock.net





Sorry, got off on a tangent, but it's got me perplexed.


----------



## yzonker

Oh and we've proven again that EK sucks....


----------



## Panchovix

yzonker said:


> That's from causing artifacting with an unstable VRAM OC. Notice that the clocks are the same as the rest of us are getting on air. This is the run I did.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 18 531 in Time Spy Extreme
> 
> 
> Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> There's something weird otherwise too though. A guy over in the Speedway benchmark thread did a 112xx. Swears it wasn't artifacted but it's above even KedarWolf's score. I don't see where the extra pts are coming from. His PR score was about the same as mine.


Something's weird, right? I'm not sure how he can do 23K on TSE with those clocks; I feel they would be in the 21-22k range (at most).

On the SpeedWay bench, yeah, I have been seeing it, though he is using hyper mega fast RAM, really tight timings, and a CPU overclock. I'm not sure how much that affects SpeedWay, though.


----------



## changboy

I tried to hit 27k on Port Royal but I failed hehe, I'm not far off. I don't know how some score over 28k lol:








Result not found
 






www.3dmark.com


----------



## mirkendargen

I finally fired up my system with the Bykski block. My max temps in PR with 1.1v 3.1GHz clocks/+1400mem are 46C on the core, 56C on the hotspot, and 38C (lol) on the memory. Block is good, Arctic TP3 pads are good, Honeywell PTM7950 is shockingly good. I think I'm going to turn the fans off on my rad for a bit to make sure it melts and flows, because its melting point is 45C. I haven't done testing to see how high I can overclock now with the better temps; we'll see.


----------



## KedarWolf

changboy said:


> I tried to hit 27k on Port Royal but I failed hehe, I'm not far off. I don't know how some score over 28k lol:
> 
> 
> 
> 
> 
> 
> 
> 
> Result not found
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com











I scored 29 200 in Port Royal


AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





My legit not bugged no artefacts score. 29200.


----------



## mirkendargen

KedarWolf said:


> I scored 29 200 in Port Royal
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> My legit not bugged no artefacts score. 29200.


That seems super questionable at those clocks... It's 1k points above scores with the same clocks. If you were somehow getting 3.2Ghz on a die straight from Jensen's oven...then it would seem more plausible.


----------



## changboy

KedarWolf said:


> I scored 29 200 in Port Royal
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> My legit not bugged no artefacts score. 29200.


Lol! CPU at nearly 5.9GHz and RAM at 6400MHz have something to do with this for sure hehe.
Do you use A/C to drop your temperature? My Gigabyte Gaming OC won't OC past +1400MHz on memory, and the max I hit on the core is 3030-3045MHz.


----------



## PhuCCo

mirkendargen said:


> I finally fired up my system with the Bykski block. My max temps in PR with 1.1v 3.1Ghz clocks/+1400mem are 46C on the core, 56C on the hotspot, and 38C (lol) on the memory. Block is good, Arctic TP3 pads are good, Honeywell PTM7950 is shockingly good, I think I'm going to turn the fans off on my rad for a bit to make sure it melts and flows because its melting point is 45C. I haven't done testing to see how high I can overclock now with the better temps, we'll see.


Those temps sound great. Where did you get the Honeywell pad? Since the FE has it by default, I had to clean it off the die when installing the waterblock, and the cleanup process was so easy. The edges just chipped off and the residue wiped right off. It was like clay lol


----------



## mirkendargen

PhuCCo said:


> Those temps sound great. Where did you get the honeywell pad? Since the FE has it by default, I got to clean it off of the die when installing the waterblock and the clean up process was so easy. The edges just chipped off and the residue wiped right off. It was like clay lol


Dude it was wild to put on. It's a 0.2mm sheet so trying to peel the backing off is a nightmare...but it adheres to silicon so well the excess stuck to the backing and it tore a perfect cover of the die off.

I got it from Aliexpress, I think Moddiy sells it too.


----------



## PhuCCo

Gking62 said:


> I'm in the process as I write this on cancelling my order for the EK Strix block, going to wait it out the Optimus Absolute block...
> 
> order now cancelled
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Signature 4090 Strix/TUF GPU Waterblock
> 
> 
> IMPORTANT SHIPPING INFORMATION: Orders 11/15/22: Ship November and through December 2022 Orders 11/16/22: Ship January 2023 The highest performance GPU waterblock ever created, exclusively for the NVIDIA 4090 GPU. Compatible with: ASUS Strix 4090 OC Edition ASUS Strix 4090 ASUS TUF 4090 OC...
> 
> 
> 
> 
> optimuspc.com


Good luck with the Optimus blocks; I hope they have a better supply this time around. The few people that I know with one for Ampere swear by them.
I just checked Optimus pricing and did not expect to see EK charging nearly what Optimus does...


----------



## dr/owned

Screenshot of the XOC bios version (from the derbauer vid):










Sadly I don't think the Strix bios played nice on my Trio. One of the DP outputs stopped working and I need all 3.


----------



## LunaP

mirkendargen said:


> I finally fired up my system with the Bykski block. My max temps in PR with 1.1v 3.1Ghz clocks/+1400mem are 46C on the core, 56C on the hotspot, and 38C (lol) on the memory. Block is good, Arctic TP3 pads are good, Honeywell PTM7950 is shockingly good, I think I'm going to turn the fans off on my rad for a bit to make sure it melts and flows because its melting point is 45C. I haven't done testing to see how high I can overclock now with the better temps, we'll see.


Nice, I'm def excited for my block then (arrives next Wednesday).

Do you happen to have links for the pads u used? I need to order more fittings for my loop so figured I'd get those added as well.


----------



## Gking62

PhuCCo said:


> Goodluck with the Optimus blocks, I hope they have a better supply this time around. The few people that I know with one for Ampere swear by them.
> I just checked Optimus pricing and did not expect to see EK nearly charging what Optimus does..


I decided to _uncancel _it for now, I'll take delivery and inspect it for defects and go from there, I'll eat the shipping but I don't care about that. I will however purchase the Optimus just in case.


----------



## LunaP

Yeah, I initially was gonna get the FE for the Optimus block, but the talk of a possible December ship date pushed me to the OC instead, which I feel is still a great card. They mentioned possibly looking at other cards too, so if they make one for Gigabyte I'll def rebuy. I absolutely love their CPU blocks.


----------



## cletus-cassidy

Panchovix said:


> Something's is weird right? I'm not sure how he can do 23K on TSE with those clocks, I feel they would be in the 21-22k range (at most)
> 
> On SpeedWay bench, yeah I have been seeing it, though he is using hyper mega fast RAM, really tight timings and a CPU overclock, though I'm not sure how much that affects Speedway


Agree. I have the same parts (12900KS vs. 12900K) and everything clocked higher and I'm at 21,555.


----------



## mirkendargen

LunaP said:


> Nice , I'm def excited for my block then (arrives next wednesday )
> 
> Do you happen to have links for the pads u used? I need to order more fittings for my loop so figured I'd get those added as well.











Amazon.com: ARCTIC TP-3: Premium Performance Thermal Pad, 200 x 100 x 1.5 mm (2 Pieces) - High Performance, Particularly Soft, Ideal Gap Filler, Bridging Gaps, Safe handling : Everything Else


Buy ARCTIC TP-3: Premium Performance Thermal Pad, 200 x 100 x 1.5 mm (2 Pieces) - High Performance, Particularly Soft, Ideal Gap Filler, Bridging Gaps, Safe handling: Everything Else - Amazon.com ✓ FREE DELIVERY possible on eligible purchases



www.amazon.com






https://www.aliexpress.us/item/3256804559146076.html?spm=a2g0o.order_list.0.0.56551802jRhmgX&gatewayAdapt=glo2usa&_randl_shipto=US


----------



## cletus-cassidy

mirkendargen said:


> Amazon.com: ARCTIC TP-3: Premium Performance Thermal Pad, 200 x 100 x 1.5 mm (2 Pieces) - High Performance, Particularly Soft, Ideal Gap Filler, Bridging Gaps, Safe handling : Everything Else
> 
> 
> Buy ARCTIC TP-3: Premium Performance Thermal Pad, 200 x 100 x 1.5 mm (2 Pieces) - High Performance, Particularly Soft, Ideal Gap Filler, Bridging Gaps, Safe handling: Everything Else - Amazon.com ✓ FREE DELIVERY possible on eligible purchases
> 
> 
> 
> www.amazon.com
> 
> 
> 
> 
> 
> 
> https://www.aliexpress.us/item/3256804559146076.html?spm=a2g0o.order_list.0.0.56551802jRhmgX&gatewayAdapt=glo2usa&_randl_shipto=US


Alternative here: Honeywell 7950 phase change thermal conductive sheet silicone grease notebook mobile phone computer phase change silicone grease cpu silicone grease sheet

This was where you had to buy before AliExpress carried it. I bought the 80x80mm.


----------



## LunaP

mirkendargen said:


> Amazon.com: ARCTIC TP-3: Premium Performance Thermal Pad, 200 x 100 x 1.5 mm (2 Pieces) - High Performance, Particularly Soft, Ideal Gap Filler, Bridging Gaps, Safe handling : Everything Else
> 
> 
> Buy ARCTIC TP-3: Premium Performance Thermal Pad, 200 x 100 x 1.5 mm (2 Pieces) - High Performance, Particularly Soft, Ideal Gap Filler, Bridging Gaps, Safe handling: Everything Else - Amazon.com ✓ FREE DELIVERY possible on eligible purchases
> 
> 
> 
> www.amazon.com
> 
> 
> 
> 
> 
> 
> https://www.aliexpress.us/item/3256804559146076.html?spm=a2g0o.order_list.0.0.56551802jRhmgX&gatewayAdapt=glo2usa&_randl_shipto=US





cletus-cassidy said:


> Alternative here: Honeywell 7950 phase change thermal conductive sheet silicone grease notebook mobile phone computer phase change silicone grease cpu silicone grease sheet
> 
> This was where you had to buy before AliExpress carried it. I bought the 80x80mm.



Where did u apply the 0.2mm pads? Surprised Fujipoly doesn't make any of these since they're higher rated. Are Arctic/Honeywell good for the long term?


----------



## mirkendargen

LunaP said:


> Where did u apply the 0.2mm pads? Surprised Fujipoly doesn't make any of these since they're higher rated. Are Arctic/Honeywell good for the long term?


Honeywell PTM7950 isn't a "pad" per se; it's a phase-change material that melts at 45C. Think of it like LM that's not conductive, solid at room temp, and comes in a sheet. I used it on the GPU die only.

Honeywell's datasheet says no drying and no performance loss after tons of heat cycles. Not sure about Arctic, they're more generic thermal pads like you're used to, just on the squishier side so they form to components well.


----------



## GRABibus

Galax Hall Of Fame tested and already first place in HOF Port Royal :









I scored 31 096 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## GRABibus

changboy said:


> I tied hit 27k on port royal but i fail hehe, iam not far. I dont know how some score over 28k lol :
> 
> 
> 
> 
> 
> 
> 
> 
> Result not found
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com


I compared scores on the HOF PR board and there are some strange things...

People are getting scores in the 29k+ range with much lower average boost core frequencies than people in the 28k range, with comparable systems and temperatures...


----------



## Sheyster

dr/owned said:


> Sadly I don't think the Strix bios played nice on my Trio. One of the DP outputs stopped working and I need all 3.


I'm not surprised to hear this given that the Strix has 2 x HDMI 2.1 / 2 x DP outputs.


----------



## yzonker

mirkendargen said:


> I finally fired up my system with the Bykski block. My max temps in PR with 1.1v 3.1Ghz clocks/+1400mem are 46C on the core, 56C on the hotspot, and 38C (lol) on the memory. Block is good, Arctic TP3 pads are good, Honeywell PTM7950 is shockingly good, I think I'm going to turn the fans off on my rad for a bit to make sure it melts and flows because it's melting point is 45C. I haven't done testing to see how high I can overclock now with the better temps, we'll see.


What's your actual delta though?


----------



## GQNerd

GRABibus said:


> I compared scores in HOF PR board and there are some strange things....
> 
> People getting range score 29k+ with much lower average boost core frequencies than people getting 28k+ range, with comparable systems and temperatures....


That's due to Reported Clocks vs. Effective Clocks..

I have the Suprim Liquid X, and a friend of mine has the same model. He can pass Port Royal with reported boost clocks of 3105mhz, but mine can only get to 3060mhz on a good day (cold night, lol).. Yet, mine beats his card easily. 

His HWinfo shows that while his core clock is reporting 3105mhz (in Afterburner, 3dmark, GPUZ, etc.), his effective clock is actually ~3007mhz.. Wild variance right?

When my card is at '3060mhz' the effective clock is ~3042mhz.. much closer. 

(same cards, same vbioses)

I think that's why some are scoring higher with 'lower clocks'... it's that, or the VRAM bug boost. lol
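For anyone unclear on the distinction: the reported clock is the frequency the driver requests, while HWiNFO's effective clock is a time average of what the chip actually ran at, including brief gated/downclocked slices. A toy sketch in Python (the sample numbers are entirely made up and this is not HWiNFO's actual algorithm, just the idea):

```python
# Toy illustration: "reported" is the requested frequency; "effective" is a
# time average over short samples, so any gated/downclocked intervals drag it down.

def effective_clock(samples_mhz):
    """Time-weighted average clock over equal-length sampling intervals."""
    return sum(samples_mhz) / len(samples_mhz)

reported = 3105                        # MHz, what Afterburner/GPU-Z display
samples = [3105] * 90 + [2100] * 10    # hypothetical: 10% of time spent downclocked
print(f"reported:  {reported} MHz")
print(f"effective: {effective_clock(samples):.0f} MHz")   # ~3005 MHz
```

Which is why a card "at 3105" can lose to a card "at 3060" whose effective clock barely dips.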


----------



## J7SC

GRABibus said:


> Galax Hall Of Fame tested and already first place in HOF Port Royal :
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 31 096 in Port Royal
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


...amazing what sub-zero and a 1,000 W bios (along w/ voltage control tools) can do


----------



## GRABibus

Miguelios said:


> That's due to Reported Clocks vs. Effective Clocks..
> 
> I have the Suprim Liquid X, and a friend of mine has the same model. He can pass Port Royal with reported boost clocks of 3105mhz, but mine can only get to 3060mhz on a good day (cold night, lol).. Yet, mine beats his card easily.
> 
> His HWinfo shows that while his core clock is reporting 3105mhz (in Afterburner, 3dmark, GPUZ, etc.), his effective clock is actually ~3007mhz.. Wild variance right?
> 
> When my card is at '3060mhz' the effective clock is ~3042mhz.. much closer.
> 
> (same cards, same vbioses)
> 
> I think that's why some are scoring higher with 'lower clocks'... it's that, or the VRAM bug boost. lol


And why such big differences in effective clocks...?


----------



## GQNerd

GRABibus said:


> And why such big differences in effective clocks...?


 ¯\_(ツ)_/¯ 
Your guess is as good as mine..

Chip quality?, PSU power delivery quality? 

All I know is that it's worth tuning your VF curve in Afterburner and trying to get the reported clocks and effective clocks as close to matching as possible... For example, I flatten my curve at 1090mV since the clocks are closer than if I flatten at 1075 or even 1100 (oddly)...
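For anyone who hasn't done it: "flattening at 1090mV" just means clamping every curve point above that voltage to the 1090mV clock, so the card never requests a higher voltage. A rough Python sketch of the operation (the curve points below are hypothetical, not from a real card):

```python
# Sketch of "flattening" an Afterburner-style V/F curve at a chosen voltage:
# points above flatten_mv are clamped to the clock at that voltage.

def flatten_curve(curve, flatten_mv):
    """curve: dict of voltage_mv -> clock_mhz. Returns the flattened curve."""
    clamp_clock = max(clk for mv, clk in curve.items() if mv <= flatten_mv)
    return {mv: (clk if mv <= flatten_mv else clamp_clock)
            for mv, clk in curve.items()}

stock = {1050: 2940, 1075: 2985, 1090: 3015, 1100: 3045}  # made-up points
flat = flatten_curve(stock, 1090)
print(flat)   # the 1100 mV point now also requests the 1090 mV clock
```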


----------



## yzonker

mirkendargen said:


> That seems super questionable at those clocks... It's 1k points above scores with the same clocks. If you were somehow getting 3.2Ghz on a die straight from Jensen's oven...then it would seem more plausible.


That's the same thing I saw in the Speedway score. If @KedarWolf says it's legit, then I'd say it very likely is. There's something else giving a boost in score for some people, but I haven't figured out what it is. No doubt you can't get there on the clocks alone, but it must be some system config difference. It's not the CPU/mem either. I have all that stuff and have run both my [email protected] with DDR5 7000CL32 and my new [email protected] and DDR4 4133CL15 (long story why I'm on DDR4 now). Neither of them seems able to hit those scores. My 4090 clocks are a little lower, but not nearly enough.


----------



## yzonker

Woo-hoo! No, my rig isn't super tidy like some.


----------



## bsch3r

Miguelios said:


> ¯\_(ツ)_/¯
> Your guess is as good as mine..
> 
> Chip quality?, PSU power delivery quality?
> 
> All I know is that it's worth tuning your VF curve in Afterburner and try to get the reported clocks and effective clocks as close to matching as possible... For example, I flatten my curve at 1090mv since the clocks are closer than if I were to flatten @ 1075 or even 1100 (oddly)..


How can you see the difference between reported and effective clocks?


----------



## GAN77

bsch3r said:


> How can you see the difference between reported and effective clocks?


HwInfo, ThermSpy v3.0.1


----------



## KedarWolf

mirkendargen said:


> That seems super questionable at those clocks... It's 1k points above scores with the same clocks. If you were somehow getting 3.2Ghz on a die straight from Jensen's oven...then it would seem more plausible.


On my 5950X I got like 28800, also a valid run, zero artifacts, so it's not unreasonable that I got 29200 on my 7950X. I'm on my phone heading home from work so I can't link my 5950X run, but I had absolutely no artifacts on either run and am sure they're both valid. Plus my DDR5 is at 6400MHz with decent timings, so that helps, and I have a golden 7950X that runs its CO curve totally y-cruncher and CoreCycler stable at -30 on all cores except cores 3 and 11 at -29.

I've actually not shared bugged runs that were higher, as I'd never do that.


----------



## GQNerd

yzonker said:


> That's the same thing I saw in the Speedway score. If @KedarWolf says it's legit, then I'd say it very likely is. There's something else that's giving a boost in score for some people, but I haven't figured out what it is. No doubt you can't get to there on the clocks alone, but it must be some system config difference. It's not the CPU/mem either. I have all that stuff and have run both my [email protected] with DDR5 7000CL32 and my new [email protected] and DDR4 4133CL15 (long story why I'm on DDR4 now). Neither of them seem to be able to hit those scores. My 4090 clocks are a little lower, but not nearly enough.


As I was telling Grabibus, check your effective clocks.. (Run hwinfo in the background during a benchmark, then go back and compare your reported clock to your effective clock during the run..) Your performance is closer to whichever is lower... 

I passed 29k in PR with my reported clock at 3045, and my effective clock at 3037-3042.. 


Some ppl are getting the VRAM bug, where it artifacts like hell but still completes the run. _3DMark needs to fix this._


Side Note,
Most are already doing the following to squeeze some extra perf:
_tweaking Windows till it's barely an 'Operating System', special drivers (nvcleanstall), NV control panel settings, forcing rebar on with NVinspector, running older versions of 3dmark.. _


----------



## Panchovix

Miguelios said:


> running older versions of 3dmark...


Wait, tell me about this *please* lol, older versions give higher scores?


----------



## KedarWolf

I scored 28 983 in Port Royal (my 5950X run). I think it's effective clocks that matter the most; my Strix does really well.


----------



## J7SC

...by extension, ambient temps and cooling setup to get effective clock to stay high.


----------



## yzonker

Ok, block delta with the 4090 TUF/Bykski combo. About [email protected] running the PR stress test. Not terrible, but not amazing either.




Spoiler: Giant screenshot.


----------



## GQNerd

Panchovix said:


> Wait, tell me about this *please *lol, older versions give higher scores?


Add a few results to the "Compare" function on the 3dmark website, then scroll down to "Result" and then check the UI Version:

Some are on
2.24.7509 s64
2.25.8043 64

(the "s" is for Steam... RIP to whoever runs these versions, lmao)

Minimal gains.. but trust me, some are OCD. lol


----------



## motivman

yzonker said:


> Woo-hoo! No my rig isn't super tidy like some.
> 
> View attachment 2580888


Man, do the Enthoo 719 some justice... here is my setup: 3 internal radiators, 4 D5 pumps, and an external MORA... lol. All with quick disconnects for easy peasy hardware swaps... don't steal my loop design now


----------



## GQNerd

yzonker said:


> Ok, block delta with the 4090 TUF/Bykski combo. About [email protected] running the PR stress test. Not terrible, but not amazing either.
> 
> 
> Spoiler: Giant screenshot.
> 
> 
> 
> 
> View attachment 2580893


Look at that Effective clock delta...


----------



## mirkendargen

yzonker said:


> What's your actual delta though?


18C @ 550W. Plenty good enough.
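Since deltas get quoted at different power draws, it helps to normalize to degrees per watt when comparing blocks. A quick back-of-the-envelope in Python using the numbers above (assuming, as is roughly true, that the core-to-coolant delta scales linearly with power):

```python
# Core-to-coolant thermal resistance as a figure of merit for a block.
def thermal_resistance(delta_c, power_w):
    """Degrees C of core-over-coolant delta per watt dissipated."""
    return delta_c / power_w

r = thermal_resistance(18, 550)            # the 18C @ 550W figure above
print(f"{r * 1000:.1f} mC/W")              # ~32.7 mC/W
print(f"delta at 450 W: {r * 450:.1f} C")  # ~14.7 C at a 450 W gaming load
```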


----------



## yzonker

motivman said:


> man, do the enthoo 719 some justice... here is my setup. 3 internal radiators, 4 d5 pumps, and an External MORA... lol. All with Quick disconnects for easy peezy hardware swaps ... don't steal my loop design now
> 
> View attachment 2580894


I have 2 CL480s as external and a 2nd D5. Internal is minimal to keep flow rate up and heat load down when I connect it to my chiller (without the externals).


----------



## motivman

yzonker said:


> I have 2 CL480s as external and a 2nd D5. Internal is minimal to keep flow rate up and heat load down when I connect it to my chiller (without the externals).


My setup needs four D5s to keep the flow rate at around 300 L/h with all of them running at 100%, especially with all the quick disconnects in the loop. For daily use I run three of the D5s at 800rpm and one at full speed to get 130 L/h. I like my setup because I can switch between a watercooled and an air-cooled GPU in less than 10 minutes, and I can swap my CPU in 10 minutes or less too. Just sucks that most of the hardware I've gotten lately are not silicon winners.
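On whether ~130 L/h is "enough": the coolant's temperature rise through the loop depends only on heat load and mass flow, ΔT = P / (ṁ·c_p), so even modest flow barely warms the water. A quick sanity check in Python (standard water properties; the wattages are just illustrative):

```python
# Coolant temperature rise across the loop: delta_T = P / (m_dot * c_p).
# Water: c_p ~ 4186 J/(kg*K), ~1 kg per litre.

def coolant_rise_c(power_w, flow_lph):
    m_dot = flow_lph / 3600.0          # kg/s, treating 1 L of water as 1 kg
    return power_w / (m_dot * 4186.0)

print(f"{coolant_rise_c(550, 130):.1f} C")   # ~3.6 C at 550 W and 130 L/h
print(f"{coolant_rise_c(550, 300):.1f} C")   # ~1.6 C at 300 L/h
```

So tripling the flow only shaves a couple of degrees off the water-in vs water-out spread; the bigger wins from flow are in block convection, not bulk coolant temperature.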


----------



## mirkendargen

With a block on I could finally feel the 12VHPWR connector under sustained load without the air cooler blowing hot air everywhere. It's definitely warm, but not too hot to touch; I'd estimate 40-50C. Yeah, nylon is a poor conductor of heat, so that means the pins are hotter... but nylon also doesn't melt till like 270C, so it's fine. In retrospect I probably should have slapped a thermal pad between the back of the power connector area and the backplate; I might still do that, because I need to drain and rinse my loop another time anyway.


----------



## motivman

Any consensus on the best waterblock for the FE?


----------



## GAN77

*Preview: kryographics STRIX/TUF 4090*



Preview: kryographics STRIX/TUF 4090 / 14.11.: UPDATE: Version mit seitlichen Anschlüssen - Wasserkühlung - Aqua Computer Forum


----------



## LuckyImperial

motivman said:


> my setup needs four D5's to keep flow rate at around 300 lph with all d5's running at 100%, especially with all the quick disconnects in the loop. for daily, I run 3 of the d5's at 800rpm, and one at full speed to get 130lph. I like my setup because I switch between watercooled to air cooled gpu in less than 10 minutes, and I can swap my CPU in 10 minutes or less too, Just sucks that most of the hardware I get are not silicon winners lately


Are quick disconnects really that choked? A _single _D5 can do 1.5GPM at a 4psi drop (very high). To be dropping that much pressure across disconnects at 1.5gpm...they would have to be like...1/8" ID.
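One way to reason about it: approximate the pump curve as a linear head falloff and the loop as a lumped k·Q² restriction; the operating point is where the two cross. A toy solve in Python (all coefficients are made up purely for illustration, not measured values for a D5 or any real QD):

```python
import math

# Pump curve approximated as H(Q) = h_max - slope*Q (head in psi, Q in GPM).
# Loop restriction approximated as dP(Q) = k * Q^2.
# Operating point: h_max - slope*Q = k*Q^2  ->  k*Q^2 + slope*Q - h_max = 0.

def operating_flow(h_max_psi, slope, k):
    """Positive root of k*Q^2 + slope*Q - h_max = 0."""
    return (-slope + math.sqrt(slope**2 + 4 * k * h_max_psi)) / (2 * k)

q_open_loop = operating_flow(5.0, 1.0, 1.5)    # mildly restrictive loop
q_many_qds = operating_flow(5.0, 1.0, 6.0)     # restriction stacked up by many QDs
print(f"{q_open_loop:.2f} GPM vs {q_many_qds:.2f} GPM")
```

The point being that restrictions in series add their k values, so a handful of restrictive disconnects can plausibly knock the flow down by a large fraction even though each one looks minor on its own.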


----------



## yzonker

Has anyone else noticed a significant drop in power draw with the waterblock? 

If you look at that screenshot I posted a little bit ago, I'm only pulling 450w in PR at 1100mv. That's down from 500w+ with the air cooler. Control only pulls about 450w at 1100mv also in 4k max settings.


----------



## mirkendargen

LuckyImperial said:


> Are quick disconnects really that choked? A _single _D5 can do 1.5GPM at a 4psi drop (very high). To be dropping that much pressure across disconnects at 1.5gpm...they would have to be like...1/8" ID.


Try to blow air through a quick disconnect; a lot of them are shockingly restrictive.


----------



## cletus-cassidy

yzonker said:


> Has anyone else noticed a significant drop in power draw with the waterblock?
> 
> If you look at that screenshot I posted a little bit ago, I'm only pulling 450w in PR at 1100mv. That's down from 500w+ with the air cooler. Control only pulls about 450w at 1100mv also in 4k max settings.


Yes. See my posts above. Can’t get close to 600 watts. 4090 Tuf under Bykski WB.


----------



## 8472

Interesting results. I could be wrong, but it looks like the 13900K did better against the 5800X3D than Intel's own marketing said it would. I'm guessing they (Intel) used a 3090?


__ https://twitter.com/i/web/status/1588634955910873088


----------



## KingEngineRevUp

Laithan said:


>


This shows OC is less and less worth it. Going from 3000 to 3045 is just a 1.5% increase in clock, and probably only a ~1% increase in performance, while running at 750W.


----------



## kx11

8472 said:


> Interesting results. I could be wrong, but it looks like the 13900k did better against the 5800X3D than Intel's own marketing said it would. I'm guessing they (intel) used a 3090?
> 
> 
> __ https://twitter.com/i/web/status/1588634955910873088


5800x3d made Ryzen 7000 obsolete


----------



## zhrooms

Added the HOF OC Lab


----------



## Gking62

mirkendargen said:


> Try and blow air through a quick disconnect, a lot of them are pretty shockingly restrictive.


Funny, I ordered a Koolance QD3 combo set which arrived a little while ago. I was checking them out, fitting them together, and wondered this very same thing; looking through them you can't see any light whatsoever.


----------



## yzonker

Gking62 said:


> funny, I ordered a combo QD3 Koolance set which arrived a little while ago and was checking them out, fitting them together and wondered this very same thing, looking thru can't see any light whatsoever.


The QD4's are a lot better, but obviously larger. 


















Koolance QD Series - ExtremeRigs.net


Koolance’s latest Quick Disconnect series is called the QD series. It retains much of the same trusted internal design but uses a snazzy new push to release mechanism rather than the older style twist mechanism. The QDCs come in three different sizes – the QD2 (not reviewed) is the smallest and...




www.xtremerigs.net


----------



## PhuCCo

I recently tested the QD3, QD4, NS4, NS6, Alphacool twistlock, Alphacool QD4 knock offs, etc. with a flow meter at different flow rates. The QD4 was such a low resistance that I couldn't measure a drop. NS6 was nearly as good, QD3 wasn't far behind contrary to popular belief. Everything else was a bit worse. The NS4 was the worst.


----------



## Benni231990

Does anybody have a manual for flashing a 4090 BIOS? Or is it 1:1 the same as the 3090 flash?

@zhrooms 

I remember the first page of the 3090 thread had a step-by-step tutorial for the flash; maybe we can add that here too?


----------



## Gking62

PhuCCo said:


> I recently tested the QD3, QD4, NS4, NS6, Alphacool twistlock, Alphacool QD4 knock offs, etc. with a flow meter at different flow rates. The QD4 was such a low resistance that I couldn't measure a drop. NS6 was nearly as good, QD3 wasn't far behind contrary to popular belief. Everything else was a bit worse. The NS4 was the worst.


Really appreciate this. I like the QD3 since I use 3/8" tubing (10x13mm) for draining.


----------



## N19htmare666

EEE-RAY said:


> I wonder why we don't have a proper memory overclock stress tester liek we do with system ram.





Panchovix said:


> A new personal best with my sample on SpeedWay, and as some people said, having auto fans instead of 100%, let me a little more headroom for VRAM OC, which is actually very interesting lol
> 
> 
> Spoiler: 3DMark Speed Way Bench
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 10 714 in Speed Way
> 
> 
> AMD Ryzen 7 5800X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> View attachment 2580197
> 
> 
> 
> 
> Though, now I'm wondering 3 things:
> 
> 1. For the Ryzen users out here (Ryzen 5000/7000), are you using Windows 11 or 10 for benchmarks? If it's 11, are you using a stripped W11 OS or just disabling the services for the benchs and such?
> 2. For people that have upgraded CPUs after getting the 4090, have you got better *graphics *scores after upgrading your CPU? Since for my CPU/GPU combo I feel I'm doing "good", but for 4090 in general, it seems pretty lackluster
> 3. How it is possible that for example, some people have high scores despite having lower core/mem clocks? Driver optimization? Example:
> View attachment 2580200





Miguelios said:


> That's due to Reported Clocks vs. Effective Clocks..
> 
> I have the Suprim Liquid X, and a friend of mine has the same model. He can pass Port Royal with reported boost clocks of 3105mhz, but mine can only get to 3060mhz on a good day (cold night, lol).. Yet, mine beats his card easily.
> 
> His HWinfo shows that while his core clock is reporting 3105mhz (in Afterburner, 3dmark, GPUZ, etc.), his effective clock is actually ~3007mhz.. Wild variance right?
> 
> When my card is at '3060mhz' the effective clock is ~3042mhz.. much closer.
> 
> (same cards, same vbioses)
> 
> I think that's why some are scoring higher with 'lower clocks'... it's that, or the VRAM bug boost. lol


Memory latency tweaking (RAM, not VRAM) doesn't show in the summary details but has a large impact on speed, so it might have more to do with it.


----------



## Damaged__

cletus-cassidy said:


> Yes. See my posts above. Can’t get close to 600 watts. 4090 Tuf under Bykski WB.


Try furmark


----------



## zhrooms

Benni231990 said:


> Has anybody a manuel to flash a 4090 bios? Or is it 1:1 the same way like the 3090 flash?
> 
> In the 3090 Thread i can remember on the first page was a tutorial with step by step for the flash maybe we can add this here to?


It "should" be the same.

But I removed it for now because I don't know if it has changed; I don't have a 30- or 40-series card. I made the tutorial on a 2080 Ti, so that was a while ago.


----------



## J7SC

PhuCCo said:


> I recently tested the QD3, QD4, NS4, NS6, Alphacool twistlock, Alphacool QD4 knock offs, etc. with a flow meter at different flow rates. The QD4 was such a low resistance that I couldn't measure a drop. NS6 was nearly as good, QD3 wasn't far behind contrary to popular belief. Everything else was a bit worse. The NS4 was the worst.


Yup, I have been running 8x Koolance QD4s in various loops (2x D5, 3x D5, 4x D5, 5x D5 pump) for years - resistance is a very minor issue. The reason you can't see through them is a flap, and depending on which way you blow into them, you might encounter 100% resistance, per design. Just make sure you cleanse and rinse the QD after you've blown into it...


----------



## motivman

J7SC said:


> Yup, I have been running 8x Koolance QD4s in various loops (2x D5, 3x D5, 4x D5, 5x D5 pump) for years - resistance is a very minor issue. The reason why you can't see through them is a flap, and depending which way you blow into them, you might encounter 100% resistance, per design. Just make sure you cleanse and rinse the QD after you blew into it...


I tried QD3 and QD4, and they always eventually started leaking. I use Alphacool right now, and they never leak; way more reliable than Koolance.


----------



## J7SC

motivman said:


> I tried QD3, QD4, and they always eventually started leaking. I use Alphacool right now, and they never leak, way more reliable than koolance.


...Never had a leak w/ any QD4 (up to a decade so far). The only issue was w/ one QD4 that had been set aside; the inner o-ring jammed the closing/opening mechanism (no leak when it was connected, though), but that was an easy fix.

With Zen4 / AM5 3D V-Cache expected to be announced at CES 2023, my collection of Koolance QD4s might come in handy soon .

---

In other news, two more tubes of Gelid GC Extreme arrived today for the 4090 water block install (I'm messy...); they join the collection below, left over from a previous build project (2 systems). Not shown is 100+ g of TG-10 thermal putty.


----------



## BTK

What is wrong with my 3DMark? I changed from a 12700K to a 13900K, and from 32GB 3600MHz CL16 RAM to 64GB 3600MHz CL18. CPU score is up, but GPU score is down from the 19k I got with my 12700K and 32GB. All stock.


----------



## changboy

Maybe a bad BIOS setting on your motherboard, because I just have a 10980XE and I score 18,277.








I scored 18 277 in Time Spy Extreme


Intel Core i9-10980XE Extreme Edition Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com





I think you should score over 20 000.


----------



## Arizor

BTK said:


> What is wrong with my 3dmark I changed from 12700k to 13900k and to 64gb 3600mhz cl18 ram from 32gb 3600mhz cl16. Cpu score up gpu down from 19k from with my 12700k and 32gb. All stock
> View attachment 2580948


Yeah, that's super low for that setup - I get 21k on the graphics score, and I think that's slightly limited by my 5900X.

Surely something in the BIOS has gone screwy. I'd check for BIOS updates and then go through line by line to make sure the PCIe lanes are operating, ReBAR is on, etc.


----------



## tubs2x4

Miguelios said:


> Side Note,
> Most are already doing the following to squeeze some extra perf:
> _tweaking Windows till it's barely an 'Operating System', special drivers (nvcleanstall), NV control panel settings, forcing rebar on with NVinspector, running older versions of 3dmark.. _


What prize do you get at the top of the score board?


----------



## BTK

Arizor said:


> Yeah that's super low for that set up - I get 21k on the graphics score and I think that's slightly limited by my 5900x.
> 
> Surely something in the BIOS gone screwy. I'd check for BIOS updates and then go through line by line and make sure you've got the PCIe lanes operating and reBAR on etc.


I flashed the BIOS again and tried my old RAM. Something is not communicating right. I heard of similar issues on Z690, but MSI does not provide an updated ME driver like ASUS did, so I'm not sure what to do.

Everything is updated, it's running at PCIe 4.0 x16, and I fresh-installed drivers. What do I do?


----------



## Arizor

BTK said:


> I flashed the bios again tried my old ram. Something is not communicating right. I heard of the similar issues on z690 but msi does not provide an updated me driver like asus did I’m not sure what to do.
> 
> everything is updated it’s running at pcie 4.0 x16 i fresh installed drivers what do I do


Have you got any games you can test for frame rate? Does it perform below par in those too, or is the issue isolated to 3DMark? It's been behaving a bit screwy with the 4090s at the moment.


----------



## BTK

Arizor said:


> Have you got any games you can test for frame rate? Does it perform below par on those too or perhaps this issue is isolated to 3DMark? It is behaving a bit screwy with the 4090s at the moment.


In games I get more or less the same FPS; my lows are better and my system is faster. But my Port Royal 3DMark score is 25k. On the 12700K my Time Spy Extreme GPU score was 19k+.

I'm running 4K 120.


----------



## changboy

BTK said:


> On games I get more less the same fps my lows are better my system is faster. But my port Royal 3d mark score is 25k. On 12700k my timespy extreme gpu score was 19k+
> 
> I’m running 4k 120


What is your psu ?


----------



## GQNerd

tubs2x4 said:


> What prize do you get at the top of the score board?


The adoration of a bunch of techies online, and Epeen bragging rights?..

Sarcasm aside, a potential sponsorship? Aka free stuff

Personally, I use benchmarks to gauge how my hardware is performing, and tune it just like I do my car.. If I end up landing on the leaderboard, even better.


----------



## tubs2x4

Miguelios said:


> The adoration of a bunch of techies online, and Epeen bragging rights?..
> 
> Sarcasm aside, a potential sponsorship? Aka free stuff
> 
> Personally, I use benchmarks to gauge how my hardware is performing, and tune it just like I do my car.. If I end up landing on the leaderboard, even better.


OK. Just wondering if there's a material prize offered or what.


----------



## dante`afk

Very impressed with my EKWB FE block. Insane temps. No reason to swap the stock pads for 12 W/mK pads if the max memory temp is 36C. I ordered Gelid Extreme and will ship it right back.

11C delta on stock, 17C delta on OC; max PT, max voltage, +250 GPU, +1400 memory.

10 minutes of the Bright Memory: Infinite benchmark
5120x1440
RTX: very high
DLSS: quality

stock:











OC


----------



## BTK

changboy said:


> What is your psu ?


Seasonic GX-1000, and I'm running everything at stock right now; the GPU is only 450W. I have a Thermaltake GF3 ATX 3.0 1200W on the way.


----------



## J7SC

BTK said:


> I flashed the bios again tried my old ram. Something is not communicating right. I heard of the similar issues on z690 but msi does not provide an updated me driver like asus did I’m not sure what to do.
> 
> everything is updated it’s running at pcie 4.0 x16 i fresh installed drivers what do I do


Are you on Win 10 or 11 ?


----------



## BTK

J7SC said:


> Are you on Win 10 or 11 ?


11. My 12700K was fine.


----------



## J7SC

BTK said:


> 11 my 12700k was fine


I take it you did the DDU process and all that - I know it was a CPU upgrade. Ditto for the Resizable BAR setting in the BIOS.


----------



## BTK

J7SC said:


> I take it you did the DDU process and all that - I know it was a CPU upgrade. Ditto for resizable_BAR setting in bios.


Reflashed the newest BIOS. Reseated the GPU. Tried old RAM. Fresh DDU. ReBAR is on. XMP is on. CPU score is up. GPU score is down. Where can I get ME firmware for MSI?


----------



## long2905

dante`afk said:


> very impressed with my EKWB FE block. insane temps. no reason to swap the stock pads for 12 w/mk pads if the max memory temp is 36c. I ordered gelid extreme and will ship them right back.
> 
> 11c delta on stock, 17c delta on OC, max PT, max voltage, 250+gpu, 1400+ memory
> 
> 10 minutes bright infinity benchmark
> 5120x1440p
> rtx: very high
> dlss: quality
> 
> stock:
> 
> View attachment 2580969
> 
> 
> 
> OC
> 
> View attachment 2580970
> 
> 
> View attachment 2580971
> View attachment 2580972
> View attachment 2580973
> View attachment 2580974


Still need active backplate? Whats your ambient like and the rad setup?


----------



## dante`afk

long2905 said:


> Still need active backplate? Whats your ambient like and the rad setup?


Probably don't need it; bought it because why not.

Dual MO-RA, 23C ambient.


----------



## mirkendargen

I went ahead and did it because why not lol.


----------



## KingEngineRevUp

Damaged__ said:


> Here is a few minutes of furmark with my 4090 FE block and as you can see the Core-Hotspot delta is consistent at around 10c even with near 600w load. I am unfortunately using the stock pads as I fear there aren't really a whole lot of alternatives right now and I'd be more keen to experiment once we have more data on the matter. The memory does get a bit hot when loading super memory intensive games while also pulling a reasonable amount of power to the core (Quake RTX for example) but there's not much that can be done about that at the moment.
> 
> My block did include the piece of paper basically telling you to ignore the online manual and I had a hell of a time determining which pads I should use, but all in all the results seem decent.
> View attachment 2580759


This is with which block? I'm assuming EKWB? Since they're the only ones that have released an FE block?


----------



## lassek1981

mirkendargen said:


> My Canadian Gigabyte Gaming OC made it home and got #22 single GPU on Port Royal. Seems like a keeper, boosts to 3120 then tapers down to 3090 as the temperature rises. Memory can only do +1400 though, +1500 crashes.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 202 in Port Royal
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


That's pretty great! Must say... I have the same GB Gaming OC 4090 with the stock air cooler, and so far my best Port Royal is 23388. This is on a Gen3 10900K though, with 32GB DDR4 @ 3333MHz and a 10-year-old 1000W PSU with the 4-to-1 adapter. In 3 weeks I'm putting it in my new rig, which is a 13900K Gen4 with 32GB DDR5-6000 and a 1350W TT ATX 3.0 PSU. And it's all going to be water cooled, including the 4090. I'm not sure if I can improve my score even more, though. I also had mine up around 3100 MHz, but at best it can only run benchmarks, not gaming torture (4K ultra, no vsync) for many hours, which I then consider unstable. My everyday use is +220 on the GPU and +1000 on the VRAM, effectively 3000-3030 MHz and 23 Gbps VRAM.
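The VRAM numbers above (Afterburner offset vs. effective data rate) follow a simple linear rule if, as posts in this thread imply, a +1000 MHz offset on a stock 21 Gbps 4090 works out to 23 Gbps (i.e. the offset applies to the half data-rate clock). A minimal sketch under that assumption:

```python
# Rough conversion from an MSI Afterburner memory offset to effective GDDR6X
# data rate on a stock 21 Gbps RTX 4090. Assumption (from posts in this
# thread, e.g. +1000 ~= 23 Gbps): each +1000 MHz of offset adds ~2 Gbps.

STOCK_GBPS = 21.0

def effective_gbps(offset_mhz: int) -> float:
    """Estimated effective data rate for a given Afterburner memory offset."""
    return STOCK_GBPS + 2.0 * offset_mhz / 1000.0

for off in (0, 1000, 1400, 1500, 1750):
    print(f"+{off:>4} MHz -> {effective_gbps(off):.1f} Gbps")
```

Under this convention the +1000 to +1750 offsets reported in the thread land at 23-24.5 Gbps, which lines up with the "best ones do 23-24 Gbps" claims.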


----------



## lassek1981

StreaMRoLLeR said:


> Anyone able to active force rebar for 4090 in 3dmark ?


It's a BIOS PCIe feature, so my guess is it operates at a low level. There's no forcing it on or off here and there; it's either on or off via BIOS, and I believe it's on all the time when enabled. It basically gives the CPU access to the entire VRAM framebuffer at once, instead of a kind of "serial connection" where it's a little at a time. I don't think apps and games need to be written for it; it's between the CPU, motherboard, and the card's VBIOS.


----------



## mirkendargen

lassek1981 said:


> Thats pretty great! Must say... I have same GB Gaming OC 4090 with stock air cooler, and so far my best Port Royal is 23388. This is on a Gen3 10900K though with 32gb DDR4 @ 3333Mhz and a 10 year old 1000w PSU with the 4-1 adapter. . In 3 weeks time in putting it in my new rig which is a 13900K Gen4 with DDR5-6000 32 gb. 1350w TT ATX 3.0 PSU. And its all gonna be water cooled including the 4090. I am not sure if I can improve my score even more though. I had mine up around 3100 mhz also but at best it can only run benchmarks and not gaming torture (4k ultra no v sync) for many hours which I then consider unstable. My everyday use is +220 on the GPU and +1000 on the Vram effectively being 3000-3030Mhz and 23Gpbs Vram.


Make sure you don't have a framerate limit enabled in NVCP. Port Royal will pass 120fps in most sections, some by quite a bit, so if you have a ~120fps limiter on it will take a few thousand off your score.


----------



## dk_mic

lassek1981 said:


> Its a BIOS PCI-E feature, so its operating at low level is my guess. There is no forcing on off here and there. It either on or off via BIOS- And I believe its on all the time when enabled. It basically give the CPU access to the entire VGA framebuffer at once instead of kind of like a "serial connection" where its little at a time. I don't think app and games need to be written to this its, between the CPU, MOBO and VGA CARD/BIOS.


Nvidia enables it only for specific applications via their driver. It's not enabled in 3DMark. You can use Nvidia Inspector to force it enabled


----------



## lassek1981

xcx xcxvgyt said:


> My gamerock non oc is also stable around 3120mhz and can bench mem+1750 but power limit is 450w+zero
> 
> So i can't get more than 28139 from port royal and waiting for bios mod
> 
> 3dmark.com without system info on that's the best i can with 450w
> 
> Plus, I can't update futuremark SystemInfo for some reason and unable to solve it yet


You expect us to believe that you're running your VRAM at 28 Gbps? That's impossible! They ship at 21 Gbps, and the best ones do 23-24 Gbps MAX. Also 3120 MHz within the stock power limit? I'm sorry, but I'd need to see that with my own eyes before I believe it. You're claiming your VRAM runs basically 300+ MHz over ALL other 4090s, within the power limit... it's too good to be true..

Also, when you say stable... is this for benchmarks and testing, or did you play the most demanding titles at 4K ultra, ray tracing, vsync off? A lot of users kind of bend the term "stable". I can also reach very high clocks on my 4090 GB Gaming OC, but running with the settings above for hours, with the GPU at 99-100%, is ALWAYS a different case. In that case I'm able to run between 3015-3050 MHz on the core and 23 Gbps memory (+1000), ROCK solid and stable with the stock air cooler (waiting for the EKWB LC block). That's also on a 10900K Gen3 / DDR4; in 3 weeks my 13900K Gen4 / DDR5 rig is ready. But my current rig with the 4090 only reaches these speeds with the full 133% power limit, maxing out at 520-560W, 1.1V. Of course no normal person runs like this daily, or at least I don't, but I do take it for a ride in at least two hours of torture-test gaming: Cyberpunk 2077 at 4K psycho settings, GTA V with Natural Vision Evolved at ultra (an excellent OC stability tester even though it's an old game), and other notoriously demanding titles, no vsync, to verify the OC. When benchmarking, or running less intense games without ray tracing, you can achieve higher clocks, but then in my eyes it's not 100% stable. Mostly I don't want the GPU at 100% or 500W+ at all times anyway; if a title is that demanding at 4K ultra, I actually limit the FPS from 120 to 85-100. That makes more sense, I think, especially in titles that aren't first person with a mouse. I also don't want to pay a fortune per hour in power bills to play a game. The 4090 is a beauty with ray tracing, and very impressive, but some titles drop like 50% in performance with RT enabled. In most games RT works great on the 4090, which is sweet to see, but in others it's horrible and I simply disable it. I think it's WAY too hyped for what it is and the performance cost. Same with DLSS: even though it's a nice feature to have, I avoid it at all costs. 4K ultra, razor sharp, is the way to go for me. Cyberpunk is the only game I use it in, mainly due to the extreme performance demand with DLSS off, and also because it's the only game I know of that looks worse with it disabled! lol ..

Besides, a BIOS mod, even a 1000W one, won't give you extra performance... and certainly not IF your GPU runs like you claim. You could probably just set the power limit to max and get the full potential. Even A-bin chips will hit the voltage ceiling, and I'm pretty sure yours will do that before 600W; you WILL hit a voltage limit / voltage starvation before 600W. If you want a BIOS mod to give ANY kind of benefit, you also HAVE to do a hardware voltage mod. der8auer made a video about it and did both: he reached 770W @ 1.2V, which is insane, and even he said he doesn't want that as a daily driver. He said if he added even more voltage, like 1.25V, then even on water cooling the temperature would go way over 80-90C.

I'm looking forward to water cooling mine in a new latest-gen Intel Raptor Lake rig and seeing if it performs even a little better, with temps obviously improved and Gen4.


----------



## lassek1981

dk_mic said:


> Nvidia enables it only for specific applications via their driver. It's not enabled in 3DMark. You can use Nvidia Inspector to force it enabled


Hmm, are you sure? It would seem more logical at a low level via BIOS; otherwise, why would the BIOS handle it if it were driver controlled? But I could be wrong, it's just a guess... I haven't read about it in detail, but I will dive in to clarify at some point. It's a new feature for me, coming from a 2080 Ti. And I don't even know if it actually gives any kind of performance boost, and if yes, how much? Could be another marketing gimmick, like ray tracing being basically useless and overhyped since its launch with the 2000 series... though in SOME titles the 4090 can now run it beautifully, but often still at a high cost in performance. I will try to download NV Inspector now as we speak to check it out. Thanks


----------



## lassek1981

mirkendargen said:


> Make sure you don't have a framerate limit enabled in NVCP. Port Royal will pass 120fps in most sections, some by quite a bit, so if you have a ~120fps limiter on it will take a few thousand off your score.


Yes, thanks for the heads up, because I often use RivaTuner Statistics Server to limit, but then it would be in the 75-100 FPS range, as vsync for me is 120. I'll recheck everything and do another Royal! haha.. You don't happen to have seen or tested whether this Resizable BAR actually works? And if yes, what are the improvements? Also, is there any specific way to test it? It's new to me, coming from a 2080 Ti.


----------



## lassek1981

KingEngineRevUp said:


> This is with which block? I'm assuming EKWB? Since they're the only ones that have released an FE block?


The thing is, they only announced it. They have not released it. It will be released mid to end of this month. I know because I made a preorder and am waiting for one myself. No one has them yet. The price is just ridiculous! It's the most expensive water block I have ever bought.


----------



## mirkendargen

lassek1981 said:


> Yes thanks for the head up because I often use RIVA STAT. to limit but then it would be 75-100fps range as vsync on for me is 120. Ill recheck everything a do another Royal! haha.. You dont know to happen to have seen or tested if this RESIZE BAR actually works? and if yes what the improvements? Also if there is any specific way to test it out. Its new to me coming from a 2080Ti.


It does, needs to be enabled with Nvidia profile inspector like previously mentioned. It's no giant change though, 100-200pts maybe, but that can be the difference of a lot of places on the leaderboard.


----------



## KingEngineRevUp

lassek1981 said:


> The thing is they only announced it. They have not released it. It will be released mid to end of this month. I know because I made a preorder and waiting for one myself. No one has them yet. Its just ridicolous the price ! Its the most expensive waterblock I have ever bought.


People have it already. I'm thinking of cancelling my order and waiting for Heatkiller, Alphacool or the Corsair one.

The EKWB one seems a bit pricey and the performance doesn't seem that great. Like +17-20C over water at just 450W.

But that user could have had a bad mount. I'll keep my order if someone else can report better results.


__
https://www.reddit.com/r/nvidia/comments/yml6dq


----------



## lassek1981

mirkendargen said:


> My Canadian Gigabyte Gaming OC made it home and got #22 single GPU on Port Royal. Seems like a keeper, boosts to 3120 then tapers down to 3090 as the temperature rises. Memory can only do +1400 though, +1500 crashes.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 202 in Port Royal
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


What are your afterburner settings on the core? +? if you are using that software that is. I have mine at +220 and mem +1000 daily use.


----------



## lassek1981

KingEngineRevUp said:


> People have it already. I'm thinking of cancelling my order and waiting for Heatkiller, Alphacool or the Corsair one.
> 
> The EKWB one seems a bit pricey and the performance doesn't seem that great. Like +17-20C over water at just 450W.
> 
> But that user could have had a bad mount. I'll keep my order if someone else can report better results.
> 
> 
> __
> https://www.reddit.com/r/nvidia/comments/yml6dq


Yeah, I noticed after I replied... It's strange though, because mine isn't shipping before the end of the month. Maybe because it's for a Gigabyte and not the FE. I know they did the FE first. But if I google EKWB FE, it still says pre-order on their site.. Anyway, doesn't matter, I can see you have it!  And the results look great, friend.. can't wait to get mine installed!!!! I ordered exactly the same for the Gigabyte Gaming OC and will try a riser cable for the first time, in a DistroCase 350P. But a month more is a long time to wait!  hehe..

And yes, it's RIDICULOUSLY pricey! The most expensive water block I have ever bought.


----------



## KingEngineRevUp

lassek1981 said:


> Yeah I noticed after I replied... Its strange though because mine are not shipped before end of month. Maybe because its a Gigabyte and not FE. I know they did the FE first. But if I google EKWB FE it still says pre-order on their site.. Anyway dosent matter I can see you have it!  and results look great friend.. cant wait to get mine installed!!!! I ordered exactly the same for the Gigabyte OC Gaming and will for the first time try with RISER cable in a DistroCase 350P. but a month more is a long time to wait!  hehe..
> 
> And yes its RIDICOLOUS pricey! the most expensive water block I have ever bought.



Those are not my results; they're someone else's, and the results actually don't look good.

Their water is at 25C and the card maxed out at about +20C over water. That's not that great, especially with EKWB charging so much.

I'm concerned and am thinking of cancelling my order and waiting for something else.


----------



## LordGurciullo

What's up guys! I'm not nearly as technical as all of you, but I've learned a ton and really appreciate y'all.
In any event - here are my thoughts/ramblings/overclock findings on the MSI Gaming Trio 4090


----------



## lassek1981

mirkendargen said:


> Make sure you don't have a framerate limit enabled in NVCP. Port Royal will pass 120fps in most sections, some by quite a bit, so if you have a ~120fps limiter on it will take a few thousand off your score.


Just did another Port Royal... and basically it crashes if I go over +200 core / +1000 mem. The max clock was 3000 MHz. It shows PerfCap PWR and VREL, and the hotspot got to 82.5C. I can reach higher clocks in other benchmarks or games, but not Port Royal, unfortunately. So I will keep it at +200 for now for full stability. I will have to wait for my new rig (13900K, DDR5, ATX 3.0 PSU with 16-pin, and a water block on the 4090) to see further improvements, I guess. I am using the latest hotfix driver, which GPU-Z says is BETA. I am also running the 10900K Gen3 now with the 4-to-1 adapter and 3333MHz DDR4.

EDIT: With fans at full I can get 3030-3045 MHz in PR. I need my water block now!


----------



## GQNerd

Homing in on my OC...

3075Mhz (~3056Mhz Effective)
+1600 Mem

Note:
The 28.8k PR I posted below is VERY repeatable! My highest score is 29.2k+ with no artifacts, but that was a very lucky run.


----------



## dk_mic

Miguelios said:


> Honing in my OC...
> 
> 3075Mhz (~3056Mhz Effective)
> +1600 Mem


Would be very interested to see if you can pass this test for 10 minutes without errors at +1600 MHz memory:








GitHub - GpuZelenograd/memtest_vulkan: Vulkan compute tool for testing video memory stability
github.com


----------



## yzonker

Miguelios said:


> Honing in my OC...
> 
> 3075Mhz (~3056Mhz Effective)
> +1600 Mem
> 
> Note:
> The 28.8k PR I posted below is VERY repeatable! My highest score is 29.2k+ with no artifacts, but that was a very lucky run.
> 
> View attachment 2580995
> 
> View attachment 2580994
> 
> View attachment 2580993
> View attachment 2580999


You can't get a "lucky run" that is 400pts higher without artifacting or something happening to boost the score. 400 pts is A LOT.


----------



## Nizzen

yzonker said:


> You can't get a "lucky run" that is 400pts higher without artifacting or something happening to boost the score. 400 pts is A LOT.


I just stopped looking at Port Royal, because the points are just all over the place. I still have a valid 31k in PR, LOL. Just hid it, because it's going to be deleted anyway.
The variance in points at the same clocks is just an epic fail. I think most runs over 28500 are flawed in one way or another.

example:

Result
www.3dmark.com

Points don't make sense.....

Looks cool to be #1 with a flawed result, but that's pretty much it....


----------



## yzonker

Nizzen said:


> I just stopped looking at Port Royal, because the points is jus all over the place. Still have valid 31k in PR LOL. Just hided it, because it's going to be deleted anyway.
> Variance of points with the same clocks is just epic fail. I think most runs over 28500 is flawed in one way or another.
> 
> example:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Result
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Point's don't make sense.....
> 
> Looks cool to be #1 with flawed result, but that's pretty much it....


Yea I agree. This notion that OC'ed CPU/RAM will get hundreds of points is incorrect as well. It does help for sure, but it's more double digit gains, not triple digit (100's).


----------



## yzonker

KingEngineRevUp said:


> People have it already. I'm thinking of cancelling my order and waiting for Heatkiller, Alphacool or the Corsair one.
> 
> The EKWB one seems a bit pricey and the performance doesn't seem that great. Like +17-20C over water at just 450W.
> 
> But that user could have had a bad mount. I'll keep my order if someone else can report better results.
> 
> 
> __
> https://www.reddit.com/r/nvidia/comments/yml6dq


Looks like [email protected] in the "OC" example. I tried to replicate that condition on my machine with the Bykski block (running PR stress test). Set 75% PL (Strix 500w default bios), +150 core, +1500 mem. I got right at [email protected] So possibly better. 

There are some variables that affect the delta though. Flow rate and temp sensor placement/accuracy are the 2 biggest. I have 2 D5's which helps 1-2C over a single D5. And loop temp can vary 1-2C even with fairly high flow rate, so if the temp sensors are located differently in the loops, this may introduce enough error to explain some of the difference as well.

Probably safe to say the Bykski at half the price is as good as the EK though. Possibly better.
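Since blocks end up being compared at different power draws, flow rates, and loop temps, one way to normalize the comparison is thermal resistance: core-to-water delta divided by dissipated power. A quick sketch with made-up numbers (not the redacted figures above):

```python
# Normalizing water block comparisons: the raw core-minus-water delta is
# misleading when cards are tested at different wattages, so divide by the
# dissipated power to get a thermal resistance in C/W that can be compared
# across runs. All numbers below are illustrative, not measured.

def thermal_resistance(core_c: float, water_c: float, watts: float) -> float:
    """Core-to-coolant thermal resistance in C/W."""
    return (core_c - water_c) / watts

block_a = thermal_resistance(45.0, 25.0, 450.0)  # ~0.044 C/W
block_b = thermal_resistance(46.0, 24.0, 500.0)  # 0.044 C/W
print(block_a, block_b)
```

On this metric the two hypothetical blocks are essentially identical, even though one shows a bigger raw delta; the remaining 1-2C of scatter from sensor placement and flow rate is exactly the error band described above.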


----------



## Benni231990

hi

I have a question: why is my effective clock 50-80 MHz below the clock I set in Afterburner?

Is this a vBIOS problem?


----------



## cletus-cassidy

Damaged__ said:


> Try furmark


Still cannot pull over 550W. TUF OC with Strix BIOS under a Bykski block, with 2x480 and 2x360 HWLabs Black Ice in a 1000D. Curious if anyone else has seen this, or if it's anything to worry about?










My 3DMark scores seem about right. TSE got me to 562W max. Does this look similar to others with Tuf (other than temps as I'm under water)?

TSE: 18.2k
PR: 28.3k


----------



## cletus-cassidy

LunaP said:


> Where did u apply the 0.2mm pads? Surprised fuji doesn't make any of these, since they're higher rated, are Artic/Honeywell good for long term?


I cut a piece to put on my GPU core (not mem; I'm just using the stock Bykski pads there because I see no difference in performance). 7950 seems to perform about the same as KPX, but the beauty is that in the long run it doesn't degrade, dry out, or pump out. I have it on a furnace of a mobile CPU and it performs well there, so this is a lower-temp extreme use case.


----------



## yzonker

cletus-cassidy said:


> Still cannot pull over 550W. Tuf OC with Strix bios under Bykski block with 2x480 and 2*360 HWLabs Black Ice in 1000D. Curious if anyone else has seen this or if it's anything to worry about?
> 
> View attachment 2581018
> 
> 
> My 3DMark scores seem about right. TSE got me to 562W max. Does this look similar to others with Tuf (other than temps as I'm under water)?
> 
> TSE: 18.2k
> PR: 28.3k


It's power limiting. Mine is the same. Depending on which flavor of Kombustor I run, it tops out anywhere from 520w to 580w. Seems like my card reports lower power running the same benchmark than other AIB cards. People have reported 550w in Port Royal on air, but I never saw more than 510w or so. Now it's just 450w with the water block. 

Bottom line, if you don't see that green graph in GPUZ (indicating power limits are being hit) while running games and benchmarks, then it doesn't matter at all.


----------



## yzonker

Benni231990 said:


> hi
> 
> i have a question why is my effective clock 50-80 mhz below that my clock what i set in afterburner?
> 
> I this a vBIOS problem?


No, it's normal for effective to be lower than reported. The card actually bounces between several of the VF points even when it is indicating a steady frequency/voltage. So effective clock is basically the average.
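The bouncing-between-VF-points behavior can be illustrated with a toy time-weighted average; the clock samples below are made up, but show why the averaged "effective" value lands below the reported one:

```python
# The "effective clock" HWiNFO reports is roughly a time-weighted average of
# the clocks the GPU actually ran at, while the reported clock is just the
# currently requested VF point. A card bouncing between VF points therefore
# shows an effective clock below its reported one.

def effective_clock(samples_mhz: list[float]) -> float:
    """Average of evenly spaced clock samples over an interval."""
    return sum(samples_mhz) / len(samples_mhz)

# Reported 3105 MHz, but the card dips to lower VF points under load
# (hypothetical samples):
samples = [3105, 3105, 3090, 3045, 3105, 3000, 3105, 3060]
print(f"reported 3105 MHz, effective ~{effective_clock(samples):.0f} MHz")
```

This also matches the Suprim comparison earlier in the thread: the card "reporting" 3105 MHz but averaging ~3007 MHz effective loses to one reporting 3060 MHz that actually holds ~3042 MHz.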


----------



## DokoBG

LordGurciullo said:


> Whats up guys! I'm not nearly as technical as all of you guys but I've learned a ton and really appreciate yall.
> In any event - Heres my thoughts/ramblings/overclock findings on the msi gaming trio 4090



With Suprim X air bios it lifts off planet earth !


----------



## KingEngineRevUp

yzonker said:


> Looks like [email protected] in the "OC" example. I tried to replicate that condition on my machine with the Bykski block (running PR stress test). Set 75% PL (Strix 500w default bios), +150 core, +1500 mem. I got right at [email protected] So possibly better.
> 
> There are some variables that affect the delta though. Flow rate and temp sensor placement/accuracy are the 2 biggest. I have 2 D5's which helps 1-2C over a single D5. And loop temp can vary 1-2C even with fairly high flow rate, so if the temp sensors are located differently in the loops, this may introduce enough error to explain some of the difference as well.
> 
> Probably safe to say the Bykski at half the price is as good as the EK though. Possibly better.
> 
> View attachment 2581010


Thanks for the test. That user has an active backplate too. I'm wondering if they installed it wrong. I know they have dual Mora so they should have 2 pumps for their setup.


----------



## Xavier233

I tried to update to the latest BIOS with the OC mode on, using the OC BIOS file. It says the BIOS mode does not match. Any ideas?

When I select the silent-mode BIOS file, with the switch on the card set to OC, it seems willing to continue the flash.


----------



## mattskiiau

LordGurciullo said:


> Whats up guys! I'm not nearly as technical as all of you guys but I've learned a ton and really appreciate yall.
> In any event - Heres my thoughts/ramblings/overclock findings on the msi gaming trio 4090


Thanks for the vid.
I went through it hoping to catch your HWINFO64 peaks but wasn't able to find them; may have missed it.

Are you able to provide a screenshot of HWINFO64 for the Trio under load?

Specifically, I'm trying to investigate why this NVVDD output power is spiking well past 600w.

I've seen it as high as 650w just playing COD MW2, not benching.


----------



## lassek1981

mattskiiau said:


> Thanks for the vid.
> Was going through hoping it to peak on your HWINFO64 but wasn't able to find it, may have missed it.
> 
> Are you able to provide a screenshot of HWINFO64 for the Trio under load?
> 
> Specifically trying to investigate why this NVVDD output power is spiking way into 600w.
> 
> I've seen it as high as 650w just playing COD MW2, not benching.
> 
> View attachment 2581024


Okay, insane! That should in theory be impossible lol! What kind of clocks are you seeing past 600w? Like many others here, mine tops out at around 561w. Is this water cooled?


----------



## lassek1981

mirkendargen said:


> It does, needs to be enabled with Nvidia profile inspector like previously mentioned. It's no giant change though, 100-200pts maybe, but that can be the difference of a lot of places on the leaderboard.


So I downloaded Profile Inspector, but I was not able to find that option/setting.


----------



## lassek1981

Nizzen said:


> I just stopped looking at Port Royal, because the points is jus all over the place. Still have valid 31k in PR LOL. Just hided it, because it's going to be deleted anyway.
> Variance of points with the same clocks is just epic fail. I think most runs over 28500 is flawed in one way or another.
> 
> example:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Result
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Point's don't make sense.....
> 
> Looks cool to be #1 with flawed result, but that's pretty much it....


WOW! That's the highest score I have seen so far. Very impressive! I just don't understand how you can achieve this with basically the same clocks as others, unless 3DMark reported incorrect clocks in the picture. Is it water cooled?

I am running mine now with a 10900K, which is Gen3, and DDR4-3333, and the 4090 is on air. In a few weeks I am getting my 13900K (Gen4) and DDR5, plus an EKWB waterblock for the 4090. It should boost my score a little, or at least I hope so haha. Gen4 and water cooling should make a difference compared to Gen3 and air.

My current score is around 27300 in PR.


----------



## Zero989

yzonker said:


> It's power limiting. Mine is the same. Depending on which flavor of Kombustor I run, it tops out anywhere from 520w to 580w. Seems like my card reports lower power running the same benchmark than other AIB cards. People have reported 550w in Port Royal on air, but I never saw more than 510w or so. Now it's just 450w with the water block.
> 
> Bottom line, if you don't see that green graph in GPUZ (indicating power limits are being hit) while running games and benchmarks, then it doesn't matter at all.


I'm using the 600W liquid bios and I hit the power limit around 530W-560W in Cyberpunk @ max graphics 4K no DLSS. Clocks throttle down because of it.


----------



## lassek1981

Miguelios said:


> Honing in my OC...
> 
> 3075Mhz (~3056Mhz Effective)
> +1600 Mem
> 
> Note:
> The 28.8k PR I posted below is VERY repeatable! My highest score is 29.2k+ with no artifacts, but that was a very lucky run.
> 
> View attachment 2580995
> 
> View attachment 2580994
> 
> View attachment 2580993
> View attachment 2580999


It's a mystery to me that some memory apparently can clock 200-300 MHz higher than others. I am pretty stretched out at +1375 MHz, or around 23 Gbps, and I think that's very common in that range. But yours is 225 MHz faster, which is insane because it's quite a step above average. Maybe I can improve mine once it's water cooled.


----------



## PhuCCo

I looked through Optimus' Twitter and saw that they were testing their 4090 Strix block, and they said they have a delta of 19C from water to core at 515W. I was shocked because that sounds terrible, but apparently the 4090 has more of its power going to the core than the 3090 did, like nearly twice the share. I would assume the extra power on the 3090 was going to the 24 memory chips and the PCIe slot, whereas the 4090 uses less power on both of those IIRC. More losses on the 3090, I guess. And the obvious architecture change.

__ https://twitter.com/i/web/status/1584950589393293312

__ https://twitter.com/i/web/status/1584971946558902272
That would explain why my 3090 pulling 400W only has a delta of 10C, while the results I am seeing on here for 4090s, and my own, are much higher at a "similar" wattage. I think I'll need to temper my expectations of how well water cooling does this gen. The stock coolers are even more impressive to me now.
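The comparison above is just a delta-over-power division. A quick sketch using the numbers quoted in this thread (the power figures are total card power as reported, not measured core-only power, so treat the results as rough):

```python
# Rough water-to-core thermal resistance (deg C per W) from the deltas
# quoted in the thread. Power values are reported card power, not a
# measured core-only figure, so these are ballpark comparisons only.

def thermal_resistance(delta_c: float, power_w: float) -> float:
    """Delta-T across the block divided by the heat load through it."""
    return delta_c / power_w

# Optimus' 4090 Strix prototype claim: 19 C delta at 515 W
r_4090 = thermal_resistance(19, 515)
# The 3090 example above: 10 C delta at 400 W
r_3090 = thermal_resistance(10, 400)

print(f"4090: {r_4090:.4f} C/W, 3090: {r_3090:.4f} C/W")
```

If the core really does take a much larger share of total board power on the 4090, the per-watt-of-core-heat performance of the two blocks is closer than the raw deltas suggest, which is the point being made above.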


----------



## lassek1981

Zero989 said:


> I'm using the 600W liquid bios and I hit the power limit around 530W-560W in Cyberpunk @ max graphics 4K no DLSS. Clocks throttle down because of it.


Hmm, okay, what's the difference with this BIOS? I'm just running stock and that's also 600W max. Mine also tops out at around 561-562w (Gigabyte Gaming OC).


----------



## Zero989

lassek1981 said:


> Hmm okey whats the difference with this BIOS? Because im just running stock at thats also 600W max.. Mine is also topping at around 561-562w (Gigabyte Gaming OC)


I suspect it's the drivers, they can do that... I also have a PCIE5 PSU so it's not related to the cable.

Difference with what BIOS? Suprim Liquid X ships at 520W max but there is a liquid BIOS that is 480W/600W (125% slider).


----------



## Nizzen

lassek1981 said:


> WOW! Thats the highest score I have seen so far. Very impressive! I just dont understand how you can archieve this with the same clocks basically as others? unless 3dmark reported incorect clocks on the picture. Is it water cooled?
> 
> I am running mine now with a 10900K which is Gen3 and DDR4 3333 .. And 4090 is on air... So in a few weeks I am getting my 13900K Gen4 and DDR5 + My 4090 get EKWB Waterblock.... it should boost my score a little, or at least I hope so haha.. The Gen4 and Water cooling should give a difference compared to Gen 3 and air.
> 
> My current score is around 27300 PR.


That's what we are talking about now in this very thread. High VRAM OC results in bugged runs. The results are valid, but they aren't "real". The 3DMark team knows about it and is deleting results manually. Right now there is not much else they can do, because the runs validate, so they have to compare results against other results with the same clocks.

For now, there is no sense in looking at Port Royal scores to compare overclocks. This may be the case in other 3DMark benchmarks too, but I haven't tested as much yet with the 4090.

My score is WORLD RECORD high, but a flawed run....


----------



## GQNerd

yzonker said:


> You can't get a "lucky run" that is 400pts higher without artifacting or something happening to boost the score. 400 pts is A LOT.


And yet, I just told you I did.. so here we are.

I figured some would be skeptical, which is why I called it out myself... I hit that score with no artifacts or weird lights/patterns. Was there something else that affected that score? Maybe, but not visibly...

So I posted my 28.8 instead because I can repeat that run ALL DAY LONG

Guess I glitched my way to all the other scores..




yzonker said:


> So effective clock is basically the average.


Wrong.

If you're talking about CPUs, yes, effective clock is the avg.

For GPUs (since Ampere), effective clock is a secondary clock, not an avg. reading.

For example, that's why Kingpin 3090/Ti owners used the Classified tool to tweak voltages, and got the best performance when the reported boost clock and effective clock were as close to matching as possible.


----------



## newls1

Is the Gigabyte Gaming OC a good choice to purchase? Is its board design of good quality, and is its BIOS 600W? I can snag one right now for 2k. Yes, I know it's a few hundred over MSRP, but it's brand new and I'm tired of waiting.


----------



## Zero989

newls1 said:


> is the gigabyte gaming OC a good choice to purchase? Is its board design of good quality and is its bios 600W? I can snag 1 right now for 2k, yes i know its a few hundred over msrp, but its brand new and imtired of waiting


It has one of the better coolers IMO. The memory temps are amazing. Originally I had Gigabyte OC on order. Gigabyte gets hated as a value brand but they make awesome cards with a 4 year "warranty". 

I still remember my Xtreme 1080 Ti from them that did 2.12Ghz.


----------



## newls1

Zero989 said:


> It has one of the better coolers IMO. The memory temps are amazing. Originally I had Gigabyte OC on order. Gigabyte gets hated as a value brand but they make awesome cards with a 4 year "warranty".
> 
> I still remember my Xtreme 1080 Ti from them that did 2.12Ghz.


So should I bite the bullet and get this? It'll be waterblocked as soon as someone releases a block. Is the BIOS 600w?


----------



## Zero989

newls1 said:


> so should i bite down and get this? it'll be waterblocked as soon as someone releases a block


Oh if you're going to waterblock then it doesn't matter which one you buy as long as the block is compatible, IMO.

And yea, the OC model is 450W up to 600W.


----------



## newls1

Zero989 said:


> Oh if you're going to waterblock then it doesn't matter which one you buy as long as the block is compatible, IMO.
> 
> And yea, the OC model is 450W up to 600W.


Well, looking at certain other vendors, some of their boards look like crap... Zotac for one... looked terribly cheap. I just don't want to get a gimpy card after spending nearly $2000. I want to get a solid card on the first try. Would love to get a Strix, but damn, you can't find them anywhere for decent prices.


----------



## motivman

Nizzen said:


> I just stopped looking at Port Royal, because the points is jus all over the place. Still have valid 31k in PR LOL. Just hided it, because it's going to be deleted anyway.
> Variance of points with the same clocks is just epic fail. I think most runs over 28500 is flawed in one way or another.
> 
> example:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Result
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Point's don't make sense.....
> 
> Looks cool to be #1 with flawed result, but that's pretty much it....


LMAO... didn't even realise that was my score!!!! I am flabbergasted...


----------



## dk_mic

lassek1981 said:


> So I tried to download the profile inspector, I was not able to find that option/setting











Here's How You Can Enable Resizable BAR Support in Any Game via NVIDIA Inspector


While only sixteen games have been officially whitelisted for Resizable BAR support by NVIDIA, there's a procedure to manually enable any.




wccftech.com


----------



## Gking62

newls1 said:


> so should i bite down and get this? it'll be waterblocked as soon as someone releases a block. is the bios 600w?


You speaking of the regular Gigabyte Gaming OC, not the Aorus etc.? If the former, there are a few on eBay for $2100.


----------



## newls1

Gking62 said:


> you speaking of the regular Gigabyte Gaming OC, not Aorus etc? if the former there are a few on ebay for $2100


I just bought the Gigabyte Gaming OC for $2000, I'm okay with this. It'll be here in a few days! I'm so damn stoked!!! Hoping for a fun toy to tinker with and to 3DMark the hell out of under water.


----------



## Gking62

good find, congrats!


----------



## yzonker

KingEngineRevUp said:


> Thanks for the test. That user has an active backplate too. I'm wondering if they installed it wrong. I know they have dual Mora so they should have 2 pumps for their setup.


Glad someone else posted that comment from Optimus. I had seen it before but couldn't find it again, so I didn't bother to mention it. But anyway, the block deltas we're seeing may be better than we think if the core power really is quite a bit higher than on the 3080 Ti/3090.

I'm actually seeing about the same core temps as on my 3080 Ti/3090, since power draw is lower at the same voltage. The 3080 Ti has an Alphacool block and the 3090 has a Heatkiller V. The HKV is the better of the two, but only by a couple of degrees.


----------



## newls1

Can someone please point me in the right direction as to what cable adapter to purchase so I don't have to use the 4x8pin Y-splitter adapter in the box for my incoming 4090. My PSU is an EVGA 1600 T2.


----------



## mirkendargen

newls1 said:


> so should i bite down and get this? it'll be waterblocked as soon as someone releases a block. is the bios 600w?


The Bykski block for the Gigabyte Gaming OC/Aorus Master is out. I have one on, and several other people here have one arriving shortly. Performance seems to be about the same as what Optimus claims they're getting with their Strix/Tuf prototype ¯\_(ツ)_/¯


----------



## MrTOOSHORT

newls1 said:


> is the gigabyte gaming OC a good choice to purchase? Is its board design of good quality and is its bios 600W? I can snag 1 right now for 2k, yes i know its a few hundred over msrp, but its brand new and imtired of waiting


Love mine. Very quiet, great cooling and no coil whine. Can't complain.

Adapter that I bought and is just being shipped:






CableMod E-Series Pro ModFlex Sleeved 12VHPWR PCI-e Cable for EVGA G7 / G6 / G5 / G3 / G2 / P2 / T2 – CableMod Global Store







store.cablemod.com


----------



## newls1

MrTOOSHORT said:


> Love mine. Very quiet, great cooling and no coil whine. Can't complain.
> 
> Adapter that I bought and is just being shipped:
> 
> 
> 
> 
> 
> 
> CableMod E-Series Pro ModFlex Sleeved 12VHPWR PCI-e Cable for EVGA G7 / G6 / G5 / G3 / G2 / P2 / T2 – CableMod Global Store
> 
> 
> 
> 
> 
> 
> 
> store.cablemod.com


Finally I made a good purchase decision then!! About damn time. Are you using the adapter that came with it?


----------



## MrTOOSHORT

Being shipped. Mean time I'm using this extension:









Fasgear PCI-e 5.0 Extension Cable 30cm/1ft 16 Pin(12+4) Male to PCIE 5.0 3x8Pin(6+2) Female Sleeved Extension Cable with 4 Cable Combs 12VHPWR 16AWG Cable Compatible for GPU 3090Ti & RTX 4080 4090 : Amazon.ca


Fasgear PCI-e 5.0 Extension Cable 30cm/1ft 16 Pin(12+4) Male to PCIE 5.0 3x8Pin(6+2) Female Sleeved Extension Cable with 4 Cable Combs 12VHPWR 16AWG Cable Compatible for GPU 3090Ti & RTX 4080 4090 : Amazon.ca



www.amazon.ca





Solid too.


----------



## J7SC

It is indeed possible to get a 'lucky' run w/o artefacts  ( - or, alternatively, one 'boosted by' artefacts ). With the former, it isn't really luck though - for example, Windows 10 and 11 muck about in the background depending on the run timing after boot-up etc. Hitting the right timing point with all else equal can yield an extra 150-200 points. That is also why some folks go through the trouble of loading a stripped-down 'bench' OS, in addition to great cooling - after all, most CPUs and GPUs have temp-sensitive boost algorithms, and the bigger your cooling capacity, the better. Beyond that, folks get into chillers, sub-zero etc. BIOS tweaks to various CPU and system RAM parameters can also play a big role in efficiency over an identical system running identical clocks, especially with PR. Then there is changing 'priority' in the Win OS on certain bench .exes, or running safe mode where possible (more for CPU benchies, though).

When I briefly tested my 4090 on air (w/ 5950X CPU), with a somewhat suspect factory cooler mount and 'that' 4-into-1 dongle, I got to the mid-28k in PR and 18.8k in TSE. Some runs started out well over 3.1x GHz and dropped to 3060 or 3045. Setup runs with HWInfo active showed 4-5 speed-bin drops, and hotspot at 95 C w/o full max-power OC settings used - naturally, I am so happy that my Bykski waterblock is on the way now (btw, US$123 before shipping and taxes), along with CableMod's 12VHPWR cable specific to my Seasonic PX1300.

I think folks should leave the door open to differences beyond clocks in explaining some (though certainly not all) results where clocks are the same but the efficiency was greater for a variety of different reasons. Simply saying it was bugged unless proven otherwise is annoying to me...for one, you can't really prove a negative, and assuming guilty unless proven innocent doesn't sit well. I fully realize that there are some 'suspect results' at 3DM, but unless honesty prevails, not much one can do. 3DM staff removing results by hand via parsing 'similar clocks' certainly could catch some bugged runs, but it also punishes those who managed to set up top-efficiency of the whole system at the same clocks. 4090s generally seem to have a fairly narrow 'bin range', so cooling and efficiency steps are even more important.

These days, I do my benchies on the 'daily' stock system - that is once the waterblock for the 4090 is mounted. It includes 1320x63 triple core rads, push-pull fans and 2x/3x D5s. I can drop ambient temps to some extent with heat off and open windows (after a hot & dry summer and fall, rain and snow has finally arrived here). That said, this is a steel-concrete high-rise condo building w/central heat and there is only so low I can go (apart from my family nagging me about the 'cold', i.e. 18C ). Folks with a back porch in a more extreme winter climate region of Canada and elsewhere have an advantage there...but then, they also have to shovel the driveway and footpaths (been there, including at - 30 C outside).

As posted before, years back, I did a lot of HWBot (Elite league / sub-zero) with very good results, in part because I also had some vendor support. But the sheer amount of time it took to set up a run via prepping for DICE and/or LN2 is very significant. I am glad I did HWBot for about two years, and also glad I stopped doing it once I had achieved certain goals. That said, even (or especially with) subzero, efficiency setups were and are still extremely important. These days, I enjoy benching just w/ my water-cooling system - mostly to find the outer limits of a new card I purchased, then dial it back for daily use / gaming etc. Then I try to max system efficiency - I found a few more MHz and other efficiency w/ the 5950X setup and I look forward to apply that to water-cooled 4090 runs - just for fun !


----------



## newls1

MrTOOSHORT said:


> Being shipped. Mean time I'm using this extension:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Fasgear PCI-e 5.0 Extension Cable 30cm/1ft 16 Pin(12+4) Male to PCIE 5.0 3x8Pin(6+2) Female Sleeved Extension Cable with 4 Cable Combs 12VHPWR 16AWG Cable Compatible for GPU 3090Ti & RTX 4080 4090 : Amazon.ca
> 
> 
> Fasgear PCI-e 5.0 Extension Cable 30cm/1ft 16 Pin(12+4) Male to PCIE 5.0 3x8Pin(6+2) Female Sleeved Extension Cable with 4 Cable Combs 12VHPWR 16AWG Cable Compatible for GPU 3090Ti & RTX 4080 4090 : Amazon.ca
> 
> 
> 
> www.amazon.ca
> 
> 
> 
> 
> 
> Solid too.


The CableMod one is out of stock, and the link you provided is 3x8pin... my OCD wants to use 4x8pin... is there such a thing?


----------



## Youngtimer

yzonker said:


> Glad someone else posted that comment from Optimus. I had seen that before but couldn't find it again so didn't bother to mention it. But anyway the block deltas we're seeing may be better than we think of the core power really is quite a bit higher than the 3080ti/3090.
> 
> I'm actually seeing about the same core temps as my 3080ti/3090 since power draw is lower at the same voltage. The 3080ti has an Alphacool block and the 3090 has a Heatkiller V. The HKV is the best of the 2,but only by a couple of degrees.


Thanks for sharing your watercooling data and deltas, this is what enthusiasts are looking for. My Asus 4090 TUF is actually in Braunschweig (Germany) at Alphacool HQ for 3D scanning. It takes part in the "send it in and get one free" program... Alphacool
It will take some time, but an ASUS TUF/Strix block will also be available from Alphacool. I like that they are chrome-coating their copper now, and the fin density has been increased by 33% since the Ampere blocks.

Greetings from Germany
Youngtimer

PS: My current setup, missing the 3080 Suprim... Cellar cooling for hot hours: Not everyday water cooling from our community | Construction diary | igor'sLAB


----------



## MrTOOSHORT

newls1 said:


> the cable mod one is out of stock, and the link you provided is 3x4pin... my OCD wants to use 4x4pin... is there such thing?


Not sure, but you could also use a direct from psu to 4090 cable, no extension:









Fasgear PCI-e Gen 5.0 Power Cable - 70cm 12VHPWR 16-pin (12+4 pin) Male to PCIE 3x8pin(6+2 pin) Male Sleeved Cable - 600W GPU Power Cord for RTX 3090 Ti & RTX 4080 4090 Graphics Card | Modular PSU : Amazon.ca: Electronics


Fasgear PCI-e Gen 5.0 Power Cable - 70cm 12VHPWR 16-pin (12+4 pin) Male to PCIE 3x8pin(6+2 pin) Male Sleeved Cable - 600W GPU Power Cord for RTX 3090 Ti & RTX 4080 4090 Graphics Card | Modular PSU : Amazon.ca: Electronics



www.amazon.ca


----------



## Panchovix

Miguelios said:


> Honing in my OC...
> 
> 3075Mhz (~3056Mhz Effective)
> +1600 Mem
> 
> Note:
> The 28.8k PR I posted below is VERY repeatable! My highest score is 29.2k+ with no artifacts, but that was a very lucky run.
> 
> View attachment 2580995
> 
> View attachment 2580994
> 
> View attachment 2580993
> View attachment 2580999


You got pretty high scores on 3DMark. I'm surprised, but I have a 5800X, so my scores are way lower (except on TSE, which is very similar).

Damn, didn't expect to get limited by my CPU on benchmarks now lol (PR and Speed Way), and I'm assuming you didn't get the VRAM bug.




newls1 said:


> is the gigabyte gaming OC a good choice to purchase? Is its board design of good quality and is its bios 600W? I can snag 1 right now for 2k, yes i know its a few hundred over msrp, but its brand new and imtired of waiting


Pretty good actually, it seems to OC a lot better than the other cards, also with better temps. I wish I had gone with the Gaming OC instead of the TUF tbh, my +1175 to +1200 on the VRAM hurts so much.


----------



## newls1

MrTOOSHORT said:


> Love mine. Very quiet, great cooling and no coil whine. Can't complain.
> 
> Adapter that I bought and is just being shipped:
> 
> 
> 
> 
> 
> 
> CableMod E-Series Pro ModFlex Sleeved 12VHPWR PCI-e Cable for EVGA G7 / G6 / G5 / G3 / G2 / P2 / T2 – CableMod Global Store
> 
> 
> 
> 
> 
> 
> 
> store.cablemod.com


I see you went through the "global" store site. I switched to the USA store and all of them are OOS. So using your link I just checked out and purchased it. No idea when I'll see it, but I'll just use the factory "Y" splitter until it shows up. I hope my house won't burn down in the 2-3 weeks it takes to get this!

Can I assume that this CableMod cable is of higher quality than the factory Y-splitter?


----------



## Panchovix

Managed to improve my TSE graphics score a little, but nowhere close to #1 with my CPU/GPU combo.










Also, managed to get into the top 100 on normal TS (leaderboard search mode) by graphics score. I'll probably be out of it soon, but it feels nice with a below-average card.


----------



## MrTOOSHORT

newls1 said:


> I see you went through the "global" store site. I switched to the USA store and all of em are OOS. So using your link I just checked out and purchased it. No idea when ill see it but ill just use the factory "Y" splitter until it shows up. I hope my house wont burn down in the 2-3weeks it takes to get this!
> 
> Can I assume that this cablemod cable is of higher quality then the factory y spiltter?


Feels and looks like quality. Did some benchies using it; pins and connectors look fine.

Edit: I'm using the Fasgear extension now. Great cable.

I hear the CableMod one is great too, but I've yet to use it as it was only shipped a couple of days ago. Will receive it Nov. 10th.


----------



## J7SC

MrTOOSHORT said:


> Feels and looks like quality. Did some benchies using it, pins and connectors look fine.
> 
> Edit, I'm using the Fasgear extension now, Great cable.
> 
> I hear the Cablemod one is great too. But yet to use it as it was just shipped a couple days ago. Will receive it Nov. 10th.


...same here re. my CableMod order - 'scheduled' for Nov 10 delivery, the same day my Bykski block is supposed to arrive. I am also getting an additional 12VHPWR cable (free, as I qualified for their 12VHPWR program ) from Seasonic for the PX1300... not sure yet when that will arrive. Both 12VHPWR cables have 4x PCIe 8-pin on the PSU end... CableMod is white, Seasonic is black - decisions, decisions...


----------



## newls1

Just hope I'll be okay using the factory adapter until my CableMod one comes in.... Really appreciate everyone's help here!


----------



## MrTOOSHORT

newls1 said:


> just hope ill be okay using the factory adapter until my cablemod one comes in.... Really appreciate everyones help here!



Just seat the factory cable into the connector nice and easy, all the way in and firm.

Then avoid bending it as best you can. Should be ok. I used mine for a couple of weeks before the extension; pins and connector look fine there too.


----------



## newls1

MrTOOSHORT said:


> Just stick the factory cable into the slot nice and easy, all the way in firm too.
> 
> Then don't bend it the best you can. Should be ok. I used mine for a couple weeks before the extension, pins and connector look fine there too.


Yes sir! That's the plan. I'll post back when the card gets in and settled into its new home!!! DAMN I'M STOKED... wasn't even this stoked waiting for the 3090 Ti!


----------



## MrTOOSHORT

newls1 said:


> yes sir! thats the plan. Ill post back when card gets in and settled into its new home!!! DAMN IM STOKED... wasnt even this stoked waiting for the 3090ti....!


You should be stoked, card is a real beast!


----------



## newls1

MrTOOSHORT said:


> You should be stoked, card is a real beast!


I'm getting anxiety already!


----------



## LuckyImperial

I received my ModDIY D8P-12VHPWR cable today and was a bit concerned when I saw that only 2 of the 4 sense pins were terminated. I'm in a bit of an edge case situation building in a Lian Li Q58 with an SFX EVGA SuperNOVA 850 GM small form factor power supply, which only has two 8pin VGA plugs. I was elated when I saw that ModDIY was releasing a _PSU specific_ *dual *8pin to 12VHPWR cable because it was ideal for my build...but I was curious what the reported power limit was going to be. The default power draw for my Liquid X is 480W so I was also concerned that even a 450W "sense" may not be sufficient to boot the card.

Good news! The BIOS reports a 110% power slider, which is the maximum for my card (480W default, 530W limit). I think it's safe to speculate that this cable would report a 600W power limit to other BIOSes. FWIW, I only run this card at 70% power slider (thermal reasons), but I figured I would report to the community that the dual 8-pin ModDIY cable seems to "sense" 600W to the BIOS.

I purchased the bare, soft silicone option, and the bend radius is in the millimeter range, maybe 10-15mm. The pins are only single-split and gold plated. Much better construction than the stock cable. Also worth noting: not all of the 8-pin ports are terminated and, again, only two sense pins are terminated. Curious.
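The two-of-four termination is less curious than it looks if the cable follows the published 12VHPWR sideband scheme: as commonly documented for PCIe CEM 5.0 / ATX 3.0, only the SENSE0/SENSE1 pair encodes the initial power budget (the other two sideband pins are CARD_PWR_STABLE and CARD_CBL_PRES#, which many cables leave unconnected). A sketch of that encoding, treating the exact pin assignments as my reading of the spec rather than anything this cable's vendor has confirmed:

```python
# 12VHPWR SENSE0/SENSE1 encoding of the initial cable power budget,
# as commonly documented for PCIe CEM 5.0 / ATX 3.0.
# "gnd" = pin tied to ground in the connector, "open" = left floating.

SENSE_TO_WATTS = {
    ("gnd", "gnd"): 600,    # both sense pins grounded -> full 600 W budget
    ("gnd", "open"): 450,
    ("open", "gnd"): 300,
    ("open", "open"): 150,  # unterminated/unplugged -> minimum budget
}

def initial_power_limit(sense0: str, sense1: str) -> int:
    """Initial permitted cable power the card reads at power-up."""
    return SENSE_TO_WATTS[(sense0, sense1)]

print(initial_power_limit("gnd", "gnd"))   # 600
```

So a cable that grounds both sense pins and skips the rest would plausibly advertise 600W, matching the full 110% slider observed above.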


----------



## dr/owned

I'm following up with the Bykski seller on AliExpress to see if they'll replace my 3090 block. That'll affect whether I go with them or Alphacool for the 4090 Trio. Cause this is a bit ridiculous after only 6 months:


----------



## changboy

dr/owned said:


> I'm following up with the Bykski seller on AliExpress to see if they'll replace my 3090 block. That'll impact whether I do them or Alphacool for 4090 Trio. Cause this is a bit ridiculous after only 6 months:
> 
> View attachment 2581091


Lol, I want to cancel my Bykski order for my 4090 Gaming OC after seeing this. Are all their blocks like this?


----------



## kairi_zeroblade

changboy said:


> Lol i want cancel my Bykski order for my 4090 gaming oc after see this, is all those block are like this ?


I have my sandwich block from them and have no issues like that (Distilled+Biocide (2 drops)+anti corrosion(2 drops) only for every 2 liters)


----------



## Gking62

dr/owned said:


> I'm following up with the Bykski seller on AliExpress to see if they'll replace my 3090 block. That'll impact whether I do them or Alphacool for 4090 Trio. Cause this is a bit ridiculous after only 6 months:
> 
> View attachment 2581091


Sure is. What coolant? I recently sold my 3080 Ti FTW3 Ultra; I'd been running the clear Cryofuel premix in my EK Vector since last Dec, and the block and coolant were as clear and clean as the day I filled it.


----------



## DokoBG

Wow that block looks awful - corroded mess.


----------



## zkareemz

*I have a question:*

is this a good score for
TUF 4090 OC + Strix bios
+195 CORE
+1300 RAM









I scored 33 328 in Time Spy


Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com


----------



## Baasha

Has anyone had issues with the adapter cable or plug on GPU PCB melting on the 4090 FE? The ones on Reddit all seem to indicate the affected samples are AIBs.

In other words, is the FE safe or not?


----------



## yzonker

Made a few runs. Since I didn't go to the effort of turning off HT, etc... I didn't see any gains in PR or Speedway. Just not enough extra clocks to make a difference with just the waterblock. Maybe if I fire up my chiller I can inch those up a little. Won't be much though. Did see gains in TS/TSE/FS. The good news is I was able to make runs in the +1775-1800 range for the VRAM. So no significant loss there. 

Edit: and I guess this is better than it seems as my previous runs were set by using my chiller to blow cold air through the case when the card was air cooled. So temps didn't really go down that much.

TS CPU score is a little weak due to DDR4. It really loves DDR5 for whatever reason.









I scored 37 450 in Time Spy


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com













I scored 20 038 in Time Spy Extreme


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com





And my combined score is weak for some reason in FS. Not sure if that's DDR4 or something else. I've had issues with it in the past running inconsistently on my 12900k/DDR5 system. Just had to keep running it until it gave me a good score. Didn't bother this time.









I scored 57 063 in Fire Strike


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com


----------



## Pollomir

Guys, check your connectors; maybe it's our fault.



Testing the Nvidia 12VHPWR adapter


----------



## Gking62

Pollomir said:


> Guys, check your conectors, maybe is our fault.
> 
> 
> 
> Testing the Nvidia 12VHPWR adapter


Man, nice find. I was thinking the same thing regarding the dielectric grease, as I have some handy (same brand as linked); it's probably best to apply it with a mini cotton swab.


----------



## Zero989

This was my TimeSpy run, not even at max GPU clocks. No E cores or hyperthreading so my CPU score is poop.









I scored 31 268 in Time Spy


Intel Core i7-13700KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## Baka_boy

dr/owned said:


> I'm following up with the Bykski seller on AliExpress to see if they'll replace my 3090 block. That'll impact whether I do them or Alphacool for 4090 Trio. Cause this is a bit ridiculous after only 6 months:
> 
> View attachment 2581091


Happens to EK blocks too, even with proper coolant. Running only distilled will sometimes cause this but it's hard to say if or when it will happen due to disparity between setups.


----------



## Zero989

Pollomir said:


> Guys, check your connectors; maybe it's our fault.
> 
> 
> 
> Testing the Nvidia 12VHPWR adapter


Wow, so I got jebaited into buying a PCIe 5.0 PSU because WoW addicts who live in their mom's basement can't plug in a connector properly. Huge L.


----------



## yzonker

Zero989 said:


> This was my TimeSpy run, not even at max GPU clocks. No E cores or hyperthreading so my CPU score is poop.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 31 268 in Time Spy
> 
> 
> Intel Core i7-13700KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


Almost feels like the bigger difference we're seeing is in the RT benchmarks. Same story for you and @KedarWolf. But so far neither of you have posted a raster only score that's well above.


----------



## Zero989

yzonker said:


> Almost feels like the bigger difference we're seeing is in the RT benchmarks. Same story for you and @KedarWolf. But so far neither of you have posted a raster only score that's well above.


We can do some Cyberpunk RT testing if you have it.


----------



## olrdtg

dr/owned said:


> I'm following up with the Bykski seller on AliExpress to see if they'll replace my 3090 block. That'll impact whether I do them or Alphacool for 4090 Trio. Cause this is a bit ridiculous after only 6 months:
> 
> View attachment 2581091


Similar thing happened to my Bykski 3090 FE block. When I put my 4090 in and moved the 3090 to my server, I had to take the whole thing apart and scrub it clean; I scrubbed off the nickel plating too, as it was just falling off by that point. I think it's just cheap nickel plating; the copper layer looked fine on mine.


----------



## ZealotKi11er

dr/owned said:


> I'm following up with the Bykski seller on AliExpress to see if they'll replace my 3090 block. That'll impact whether I do them or Alphacool for 4090 Trio. Cause this is a bit ridiculous after only 6 months:
> 
> View attachment 2581091


Same experience here with my Radeon VII and 6900 XT blocks. Never again; I'd rather spend the money on a good block.


----------



## PhuCCo

yzonker said:


> Looks like [email protected] in the "OC" example. I tried to replicate that condition on my machine with the Bykski block (running PR stress test). Set 75% PL (Strix 500w default bios), +150 core, +1500 mem. I got right at [email protected] So possibly better.
> 
> There are some variables that affect the delta though. Flow rate and temp sensor placement/accuracy are the 2 biggest. I have 2 D5's which helps 1-2C over a single D5. And loop temp can vary 1-2C even with fairly high flow rate, so if the temp sensors are located differently in the loops, this may introduce enough error to explain some of the difference as well.
> 
> Probably safe to say the Bykski at half the price is as good as the EK though. Possibly better.
> 
> View attachment 2581010


I got my EK 4090 FE single-sided block as good as it will get. Loaded up MW2 and cranked the resolution until I sat at 370W at the menu, and the best I could get out of the block was a 17C delta between water and core, with a 9C delta between core and core hotspot. The guy with the active-backplate version of the FE block is about 1.5C better on the core than mine, which makes sense as it cools the back of the core, and my backplate definitely gets warm. I think the EK block design just sucks, especially for the money. Thank you for posting your results.


----------



## th3illusiveman

Have any of these burnt adapters happened on the FE model? Just wondering if an AIB PCB defect might be causing it.

I have an FE coming in on Tuesday, and I have no intention of babysitting it or checking the cable every day; I paid way too much for this card for that BS. If it burns, Nvidia can deal with the RMA or refund, and with the bad PR.


----------



## newls1

dr/owned said:


> I'm following up with the Bykski seller on AliExpress to see if they'll replace my 3090 block. That'll impact whether I do them or Alphacool for 4090 Trio. Cause this is a bit ridiculous after only 6 months:
> 
> View attachment 2581091


That happened on my 3090 block too, only after a few months. I'll never purchase these cheap waterblocks again...


----------



## J7SC

changboy said:


> Lol, I want to cancel my Bykski order for my 4090 Gaming OC after seeing this. Are all those blocks like this?


I have a Bykski block on my 6900XT work machine - it is still absolutely pristine after 18 months of daily use. Same for the Phanteks Glacier on the 3090 Strix gaming setup, which in turn had replaced a disastrous EKWB Strix block and backplate (the latter's nickel coating peeled off right after taking it out of the box...there were other issues with the main block). I should add that I have a Heatkiller TR CPU block that also had nickel peeling off inside, opposite the flow inlet (after 12 months...), so it's all a bit of a crapshoot lately trying to rely on 'brand' alone. Finally, it also depends what kind of liquids/additives you run.


----------



## yzonker

Horizontal mount fixes the nickel issue.


----------



## newls1

So I have a quick question: is the RAM on the 4090 the EXACT SAME RAM used on the 3090 Ti? I only ask because I'm constantly seeing RAM clocks in the +1200 to +1500+ range in MSI AB. My 3090 Ti was completely unstable at anything over +800, and my 3090 (non-Ti) was only stable to +1000. So is the GDDR6X a new bin or something that OCs better?


----------



## GAN77

yzonker said:


> Horizontal mount fixes the nickel issue.


Details?)


----------



## Mad Pistol

I need someone to test this please.

Cyberpunk Benchmark:
4090 at 4K, RT Psycho, DLSS off, Max settings - 40.68 FPS.
4090 at 4K, RT off, DLSS off, Max settings - 44.10 FPS.

That's less than a 10% uptick in performance between crazy RT and no RT. Can someone confirm this?
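A quick sketch to sanity-check the gap between those two runs, using only the numbers posted above:

```python
# Sanity-check the RT-on vs RT-off gap from the posted benchmark numbers.
rt_psycho_fps = 40.68  # 4K, RT Psycho, DLSS off, max settings
rt_off_fps = 44.10     # 4K, RT off, DLSS off, max settings

uptick_pct = (rt_off_fps - rt_psycho_fps) / rt_psycho_fps * 100
print(f"RT-off uptick: {uptick_pct:.1f}%")  # ~8.4%, i.e. under 10% as stated

# Frame-time view: how many ms per frame does Psycho RT actually cost here?
rt_cost_ms = 1000 / rt_psycho_fps - 1000 / rt_off_fps
print(f"Psycho RT cost: {rt_cost_ms:.2f} ms/frame")
```

So the "less than 10%" claim checks out arithmetically; whether the settings actually applied is a separate question.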


----------



## Panchovix

Mad Pistol said:


> I need someone to test this please.
> 
> Cyberpunk Benchmark:
> 4090 at 4K, RT Psycho, DLSS off, Max settings - 40.68 FPS.
> 4090 at 4K, RT off, DLSS off, Max settings - 44.10 FPS.
> 
> That's less than a 10% uptick in performance between crazy RT and no RT. Can someone confirm this?


I can test it, but with the card at default settings or with OC?


----------



## th3illusiveman

Mad Pistol said:


> I need someone to test this please.
> 
> Cyberpunk Benchmark:
> 4090 at 4K, RT Psycho, DLSS off, Max settings - 40.68 FPS.
> 4090 at 4K, RT off, DLSS off, Max settings - 44.10 FPS.
> 
> That's less than a 10% uptick in performance between crazy RT and no RT. Can someone confirm this?


What's your GPU usage? Try turning down crowd density to check for a CPU bottleneck.


----------



## Mad Pistol

th3illusiveman said:


> What's your GPU usage? Try turning down crowd density to check for a CPU bottleneck.


GPU usage is 100% the entire time.


----------



## Arizor

Mad Pistol said:


> I need someone to test this please.
> 
> Cyberpunk Benchmark:
> 4090 at 4K, RT Psycho, DLSS off, Max settings - 40.68 FPS.
> 4090 at 4K, RT off, DLSS off, Max settings - 44.10 FPS.
> 
> That's less than a 10% uptick in performance between crazy RT and no RT. Can someone confirm this?


This could be a few things. The first is that Cyberpunk notoriously can fail to apply changed settings until you quit and restart the game.

The other, perhaps more likely, possibility is that the benchmark doesn't really take advantage of Psycho RT the way some areas of the game do (e.g. the Afterlife and the Moxxie bar).


----------



## Mad Pistol

Arizor said:


> This could be a few things. The first is that Cyberpunk notoriously can fail to apply changed settings until you quit and restart the game.
> 
> The other, perhaps more likely, possibility is that the benchmark doesn't really take advantage of Psycho RT the way some areas of the game do (e.g. the Afterlife and the Moxxie bar).


True.

Being morbidly curious, I reinstalled my old 3080 FE and tried it at the same settings.

RTX 3080 at 4K, RT Psycho, DLSS off, Max settings - 10 FPS
RTX 3080 at 4K, RT off, DLSS off, Max settings - 25 FPS.

The RT off settings confirm that the 44.1 FPS from the 4090 is more than likely correct. (still would like confirmation from another owner, though).

The shocker?

4090 is 4x faster in RT than a 3080... what the hell?!?!?!?! Is the 4090 just insanely fast at RT?
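Spelled out, the gen-on-gen ratios from those two sets of runs (a quick sketch using only the posted numbers; note the 3080's 10GB may be VRAM-limited at 4K Psycho RT, which would inflate the RT gap):

```python
# Gen-on-gen speedups from the posted 3080 vs 4090 runs (same settings).
fps = {
    ("4090", "rt"): 40.68, ("4090", "raster"): 44.10,
    ("3080", "rt"): 10.0,  ("3080", "raster"): 25.0,
}
rt_speedup = fps[("4090", "rt")] / fps[("3080", "rt")]
raster_speedup = fps[("4090", "raster")] / fps[("3080", "raster")]
print(f"RT speedup: {rt_speedup:.2f}x, raster speedup: {raster_speedup:.2f}x")
```

About 4.1x in RT versus 1.8x in raster, which is why the RT jump looks so outsized.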


----------



## dr/owned

olrdtg said:


> Similar thing happened to my Byski 3090FE block, when I put my 4090 in and moved the 3090 to my server, I had to take the entire thing apart and scrub it clean, scrubbed off the nickle plating too as it was just falling off by that point. I think it's just a cheap ass nickel plating, the copper layer looked fine on mine.





ZealotKi11er said:


> Same experience here with Radeon 7 and 6900 XT block. Never again. I rather spend the money for a good block.





newls1 said:


> that happened on my 3090 block too only after a few months. ill never purchase these cheap waterbocks again....


Funny part is, a 1080Ti Bykski block in the same loop (I have a giant loop that goes through all my computers) had no problems at all, running for years. But then my 3090 TUF and now 3090 EVGA blocks both look like this. And this is Mayhem X1 Clear, which is 90% pure water and only 10% glycol + corrosion inhibitor. So for sure they've got something wrong with their nickel plating supplier. So my message to them is twofold: 1) will they take responsibility and say "this isn't right, let's help you out", and 2) will they look at the people doing the nickel plating and figure out what they're doing?

Cleaned 1080Ti Block that ran for 3? years:












yzonker said:


> Horizontal mount fixes the nickel issue.


Yep, because I couldn't see it either until taking the block out. Cosmetically I don't really care, but I just don't want nickel flakes clogging up stuff downstream.


----------



## Panchovix

Mad Pistol said:


> I need someone to test this please.
> 
> Cyberpunk Benchmark:
> 4090 at 4K, RT Psycho, DLSS off, Max settings - 40.68 FPS.
> 4090 at 4K, RT off, DLSS off, Max settings - 44.10 FPS.
> 
> That's less than a 10% uptick in performance between crazy RT and no RT. Can someone confirm this?


Here are mine

4090 at 4K, RT Psycho, DLSS off, Max settings - 40.87 FPS
4090 at 4K, RT Off, DLSS off, Max settings - 62.59 FPS

RT Psycho:









RT Off:


----------



## th3illusiveman

Mad Pistol said:


> True.
> 
> Being morbidly curious, I reinstalled my old 3080 FE and tried it at the same settings.
> 
> RTX 3080 at 4K, RT Psycho, DLSS off, Max settings - 10 FPS
> RTX 3080 at 4K, RT off, DLSS off, Max settings - 25 FPS.
> 
> The RT off settings confirm that the 44.1 FPS from the 4090 is more than likely correct. (still would like confirmation from another owner, though).
> 
> The shocker?
> 
> 4090 is 4x faster in RT than a 3080... what the hell?!?!?!?! Is the 4090 just insanely fast at RT?


That's an insane jump in performance, lol. Nvidia did mention they improved RT performance in Ada, and I guess this is the result. They seem very focused on RT, so large gains in that department are not unexpected.


----------



## Mad Pistol

Panchovix said:


> Here are mine
> 
> 4090 at 4K, RT Psycho, DLSS off, Max settings - 40.87 FPS
> 4090 at 4K, RT Off, DLSS off, Max settings - 62.59 FPS
> 
> RT Psycho:
> View attachment 2581127
> 
> 
> RT Off:
> View attachment 2581128


Yea, something is wrong with my non-RT numbers. Let me go back and retest.


----------



## mattskiiau

I *think* I've figured out my problem with this random 600W-650W spike on a 480W Trio:











For whatever reason, setting core voltage to 100% makes NVVDD output randomly spike to 600W-650W. This happens both when simply opening COD MW2 and when running TSE.
I wonder if this is because my Trio has bad VRMs? I'm not too knowledgeable about this stuff.

That being said, I've decided to play around with undervolting, and it has significantly lowered that spike in the above two tasks.
Strangely, I'm still able to pass TSE and play COD MW2 for a few hours without crashing at 3000 MHz @ 1.025V.

Going to do more testing with undervolting under different loads to find a sweet spot.
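One way to quantify those transients during testing is to log power draw at a fast interval and flag excursions past a chosen limit. A minimal sketch (the sample wattages and the 530W threshold below are made up for illustration; real samples could come from an HWiNFO CSV log or from polling `nvidia-smi --query-gpu=power.draw --format=csv,noheader,nounits`):

```python
# Flag power-draw samples that spike past a chosen limit.
def find_spikes(samples_w, limit_w):
    """Return (index, watts) pairs for samples exceeding limit_w."""
    return [(i, w) for i, w in enumerate(samples_w) if w > limit_w]

# Hypothetical 100ms power samples in watts (illustrative only).
log = [455.2, 478.9, 481.3, 642.7, 470.1, 615.4, 468.8]
spikes = find_spikes(log, limit_w=530.0)
print(spikes)  # [(3, 642.7), (5, 615.4)]
```

Counting spikes before and after an undervolt would give a cleaner picture than eyeballing the overlay.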


----------



## BigMack70

ArcticZero said:


> Yeah TDR errors point to an unstable OC/UV if either are applied whether manually or by custom BIOS. Might have found your culprit then. If you were running stock I'd be more worried since that's usually a faulty GPU, assuming drivers and stuff are properly installed.


Well, my PC has been crashing repeatedly today - black screen + GPU fans to 100%. I tried the new Nvidia driver and that seems to be when it started. It's crashing without any OC and not generating any display driver / device related errors in the windows event log. 

My windows event log is FULL of other errors/warnings related to TPM and permissions though. I may try re-installing Windows to see if that solves the issue.


----------



## Mad Pistol

I think I figured out what's tanking performance on Cyberpunk 2077 at 4K max, no RT. Apparently, Screen Space Reflections set to "Psycho" is an absolute performance killer. Setting it to "Ultra" instead sends the framerates to an average of 80-ish.

It's weird that a single setting like that could cause the 4090 to buckle. I wonder why that is...


----------



## J7SC

Mad Pistol said:


> Yea, something is wrong with my non-RT numbers. Let me go back and retest.


Gamers Nexus had a comment on that Cyberpunk issue the other day re. some other published benches (shortly after the 4090 release): certain changes in settings didn't 'take' until a restart.


----------



## yzonker

Well the chiller helps some. I lost about +125 on the VRAM OC though. Had to run +1675 max. 














And once again Speedway likes mem more than core clock. Barely budged.









I scored 11 050 in Speed Way


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





I also did a quick run in TS, but it was with HT off and e-cores off, so a nerfed CPU score. Another minimal gain.









I scored 31 535 in Time Spy


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## motivman

yzonker said:


> Well the chiller helps some. I lost about +125 on the VRAM OC though. Had to run +1675 max.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And once again Speedway likes mem more than core clock. Barely budged.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 050 in Speed Way
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> I also did a quick run in TS, but it was with HT off, e cores off so nerfed CPU score. Another minimal gain.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 31 535 in Time Spy
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


I also noticed that colder temps lead to less stability on the VRAM. Why is that?


----------



## Panchovix

yzonker said:


> Well the chiller helps some. I lost about +125 on the VRAM OC though. Had to run +1675 max.


Can confirm this (you already said it some days ago)

Fans at 100% = about 100 MHz less on the VRAM for some reason. GDDR6X is pretty weird.


----------



## motivman

PhuCCo said:


> I got my EK 4090FE single sided block as good as it will get. Loaded up mw2 and cranked the resolution until I sat at 370W at the menu and the best I could get out of the block was a 17C delta between water and core. 9C delta between core and core hotspot. The guy with the active backplate version of the FE block is like 1.5C better on the core than mine, which makes sense as it cools the back of the core and my backplate definitely gets warm. I think the EK block design just sucks, especially for the money. Thank you for posting your results


Try Kingpin KPX; it should reduce your delta T to around 13C. I think I'm skipping EKWB this gen and going with the Corsair block for my FE.


----------



## yzonker

motivman said:


> I also noticed that colder temps lead to less stability on the VRAM. Why is that?


It's a behavior the LN2 guys ran into when the 3090 Ti was released. It's the 2GB chips. My 3080 Ti/3090 would keep gaining mem clocks all the way down to 3C water (the lowest my chiller goes). But those are 1GB chips.


----------



## yzonker

motivman said:


> Try Kingpin KPX; it should reduce your delta T to around 13C. I think I'm skipping EKWB this gen and going with the Corsair block for my FE.


That's what I used on my Bykski block. I've had good success with it on my CPU, and it appeared to slightly outperform Kryonaut Extreme.


----------



## chrismohsseni

29,943 in Port Royal
4090 Founders on LN2









I scored 29 943 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## KingEngineRevUp

yzonker said:


> Glad someone else posted that comment from Optimus. I had seen that before but couldn't find it again, so didn't bother to mention it. But anyway, the block deltas we're seeing may be better than we think if the core power really is quite a bit higher than on the 3080 Ti/3090.
> 
> I'm actually seeing about the same core temps as my 3080 Ti/3090 since power draw is lower at the same voltage. The 3080 Ti has an Alphacool block and the 3090 has a Heatkiller V. The HKV is the better of the 2, but only by a couple of degrees.


I read that Optimus post, and it's cool they shared that information with us. At this point, I guess water blocking is more of an aesthetic and noise thing. Well, I mean, it always was, but I will curb my expectations on OC gains and noise.


----------



## long2905

LuckyImperial said:


> I received my ModDIY D8P-12VHPWR cable today and was a bit concerned when I saw that only 2 of the 4 sense pins were terminated. I'm in a bit of an edge case situation building in a Lian Li Q58 with an SFX EVGA SuperNOVA 850 GM small form factor power supply, which only has two 8pin VGA plugs. I was elated when I saw that ModDIY was releasing a _PSU specific_ *dual *8pin to 12VHPWR cable because it was ideal for my build...but I was curious what the reported power limit was going to be. The default power draw for my Liquid X is 480W so I was also concerned that even a 450W "sense" may not be sufficient to boot the card.
> 
> Good news! The BIOS reports 110% power slider, which is the maximum for my card (480W default 530W limit). I think it's safe to speculate that this cable would provide a 600W power limit to other BIOS'. Fwiw, I only run this card at 70% power slider (thermal reasons) but I figured would report to the community that the Dual 8pin ModDIY cable seems to "sense" 600W to the BIOS.
> 
> I purchased the bare, soft silicone option and the bend radius is in the millimeter range, 10-15mm maybe. The pins are only single split and gold plated. Much better construction than the stock cable. Also worth noting...not all of the 8pin ports are terminated and again...only two sense pins are terminated. Curious.
> View attachment 2581082
> View attachment 2581085
> View attachment 2581083
> View attachment 2581084


Do some more research, mate. Only 2 of the 4 sense pins are populated; the other 2 have no function for now.
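For context, those two populated sense pins are what advertise the cable's power budget to the card. A sketch of the mapping as given in the ATX 3.0 / PCIe CEM 5.0 material (the two single-grounded rows are from memory and worth double-checking against the spec table; both-grounded = 600W and both-open = 150W are the well-known endpoints):

```python
# 12VHPWR sideband: SENSE0/SENSE1 pulled to ground or left open encode
# the initial power budget the PSU/cable advertises (ATX 3.0 / CEM 5.0).
# NOTE: the 450W/300W rows are from memory; verify against the spec.
SENSE_POWER_W = {
    ("gnd", "gnd"): 600,    # both grounded: full 600W budget
    ("open", "gnd"): 450,
    ("gnd", "open"): 300,
    ("open", "open"): 150,  # both open: minimum 150W budget
}
print(SENSE_POWER_W[("gnd", "gnd")])  # 600
```

Which is consistent with the dual-8-pin adapter above reporting the full power slider despite only two sense pins being terminated.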


----------



## yzonker

KingEngineRevUp said:


> I read that Optimus post, and it's cool they shared that information with us. At this point, I guess water blocking is more of an aesthetic and noise thing. Well, I mean, it always was, but I will curb my expectations on OC gains and noise.


Case is a lot less crowded, and it's not dumping all that heat into the case right on the connector. Feels worth it for me.


----------



## KingEngineRevUp

yzonker said:


> Case is a lot less crowded and it's not dumping all that heat in to the case right on the connector. Feels worth it for me.


Well, it seems like EKWB might be a dud this generation. Do you know when Heatkiller or Alphacool will release their blocks? Corsair is releasing theirs in about 2 weeks, it seems. I'm just not sure about the aesthetics of the Corsair block; not sure if I like or dislike it.


----------



## motivman

KingEngineRevUp said:


> Well seems like EKWB might be a dud this generation, do you know when Heatkiller or Alphacool will be releasing their blocks? I know Corsair is releasing theirs in 2 weeks it seems. I'm just not sure about the aesthetics of the Corsair block. Not sure if I like or dislike it.


Let's think about this, though... is a 17C delta T really that bad for a chip that dumps 500W-600W? Even the Optimus development block has a high delta T.

When I ran the EKWB block and active backplate on my shunt-modded 3090 Strix:

350W -- 10C delta T
450W to 525W -- 13C delta T
640W-670W (Time Spy Extreme) -- 17C to 20C delta T
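Normalizing those deltas by power makes them easier to compare across loads; a quick sketch using the points quoted above (midpoints taken for the quoted ranges):

```python
# Core-to-water thermal resistance from the quoted delta-T points;
# midpoints are used for the quoted power/delta ranges.
points = [(350.0, 10.0), (487.5, 13.0), (655.0, 18.5)]  # (watts, delta_T in C)
resistances = [round(dt / w * 1000, 1) for w, dt in points]  # mC/W
for (w, dt), r in zip(points, resistances):
    print(f"{w:6.1f} W -> {r} mC/W")
```

All three land around 27-29 mC/W, so a ~17C delta at ~600W is roughly the same block thermal resistance as 10C at 350W, not a block suddenly performing worse.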


----------



## PhuCCo

motivman said:


> Try Kingpin KPX; it should reduce your delta T to around 13C. I think I'm skipping EKWB this gen and going with the Corsair block for my FE.


Funny enough that's what I am using lol


----------



## J7SC

yzonker said:


> *Case is a lot less crowded*, and it's not dumping all that heat into the case right on the connector. Feels worth it for me.


...case a lot less crowded - whatever do you mean  ?

Below is one of my work-and-play-in-one builds which has to be very quiet even under full load of both systems. Big pic shows 6900XT on the left (w/Bykski block, btw), 3090 Strix w/Phanteks block on the right. I would have liked to get a Phanteks block for my 4090 Giga-G-OC, but nothing available from them yet (I emailed them, no ETA), and the Bykski block next door has performed beyond reproach. Inset pic on the lower right shows GPU speed and temps (while near-silent) and bottom left inset is how it currently looks w/ the air-cooled 4090 until I can rearrange things .

Obviously, this setup is _already_ based on watercooling both the work and play halves, so it was a no-brainer to put a waterblock on the 4090 as well. Besides the lack of noise and not wanting heat in the case, my 4090 gets too toasty on the hotspot when running full tilt on air; it has a lot of MHz headroom but cannot sustain it for long, temperature-wise. The waterblock is really there to get it to its best level in a regular OS build with room-temperature watercooling for daily use.


----------



## Paynal

BigMack70 said:


> Well, my PC has been crashing repeatedly today - black screen + GPU fans to 100%. I tried the new Nvidia driver and that seems to be when it started. It's crashing without any OC and not generating any display driver / device related errors in the windows event log.
> 
> My windows event log is FULL of other errors/warnings related to TPM and permissions though. I may try re-installing Windows to see if that solves the issue.


Oh man. Black screen + maxed GPU fans sounds a lot like the problems that plagued the RTX 3090 in 2020 and 2021, especially the early EVGA FTW3 ones. They would crash with a black screen, fans at max, and a red light over one or more of the 8-pin power plugs. Oftentimes the game or whatever was running at the time of the crash wouldn't terminate, so you'd continue to hear audio, but you'd have no video until a reboot. We tended to see this kind of crash when playing relatively low-load 3D games (Halo MCC was a repeat offender), or games that suddenly dropped from high load to low load for a second. If you check the EVGA forums and look for "red light of death" you'll find a lot of threads on the topic. I really hope this issue isn't back again; I thought it was fixed with the 3090 Ti...

What've you been playing or running when you've been seeing these blackscreen/screaming fan crashes?


----------



## Nico67

dr/owned said:


> I'm following up with the Bykski seller on AliExpress to see if they'll replace my 3090 block. That'll impact whether I do them or Alphacool for 4090 Trio. Cause this is a bit ridiculous after only 6 months:
> 
> View attachment 2581091


I reckon the EK 3090 reference block I have was showing wear after a month, and the 2080 Ti block was pretty bad after two years. Funnily, I never had any problems prior to those. I think the main cause is the steel jet plates causing galvanic corrosion, which could be why EK is using plexi inserts now?
I'm leaning toward Optimus myself, but would probably get an EK if they did an acetal + copper version.


----------



## KedarWolf

So, are memory clocks going to suffer running an Optimus water block with really low RAM temps?


----------



## BigMack70

Paynal said:


> Oh man. Black screen + maxed GPU fans sounds sounds a lot like the problems that plagued the RTX 3090 in 2020 and 2021, especially the early eVGA FTW3 ones. They would crash with a black screen, fans at max, and red light over one or more of the 8 pin power plugs. Often times, the game or whatever was running at the time of the crash wouldn't terminate, so you'd continue to hear audio, but you'd have no video until a reboot. We tended to see this kind of crash happen when playing relatively low load 3D games (Halo MCC was a repeat offender), or games that suddenly went from high load to a low load for a second. If you check the eVGA forums and look for "red light of death" you'll find a lot of threads on the topic. I really hope this issue isn't back again, I thought it was fixed with the 3090TI...
> 
> What've you been playing or running when you've been seeing these blackscreen/screaming fan crashes?


This is what I've been seeing. MCC is what I was playing. 

I'm hoping the newest driver, 526.61, was the problem, because I used DDU to roll back to 522.25 and the past few hours have had no problems. I also think I may need to do a fresh Windows install; my event log is full of random nonsense warnings and errors. The crash seemed like something to do with switching power states: the final black screen that got me to roll back my driver was just Windows going into sleep mode after 10 minutes of idling, which sent the PC into a crash.


----------



## mirkendargen

dr/owned said:


> I'm following up with the Bykski seller on AliExpress to see if they'll replace my 3090 block. That'll impact whether I do them or Alphacool for 4090 Trio. Cause this is a bit ridiculous after only 6 months:
> 
> View attachment 2581091


Email [email protected], I also had the plating fail on my first Strix 3090 block, and they sent me a new one via DHL Express, no questions asked. The replacement they sent still looks pristine today, 18 months later. I'm curious what EK did for people with this issue, after they were done blaming whatever fluid they used 😇

I have a Bykski CPU block that's been in use for 5 years now; the fact that it isn't losing its plating is how I knew the problem was the GPU block, not my coolant. If my 4090 block has plating issues, I'll compare it to the CPU block too.


----------



## KingEngineRevUp

motivman said:


> Lets think about this though... is 17C delta T really that bad for a chip that dumps 500W-600W? Even optimus development block has high delta T....
> 
> when I ran the ekwb block and active backplate on my shunt modded 3090 strix
> 
> 350W -- 10C delta T
> 450W to 525W -- 13C delta T
> 640W - 670W (Time Spy extreme) - 17C to 20C Delta T


But yzonker's Bykski block, at the same power draw, has a lower delta over water, beating an EKWB that also has an active backplate, for less money.

I think the EKWB block looks nice, but it's expensive and doesn't perform for what we're paying. I guess that's subjective, because maybe someone likes the aesthetics and that's why they pay that price.

Historically, from my research, Heatkiller and Alphacool usually perform as well or better at a lower cost.

It would be nice to see more data points, though. The Reddit guy could have made a mistake with mounting torque.


----------



## mirkendargen

KingEngineRevUp said:


> But yzonker has a byski block at the same power draw has a lower delta over water beating a EKWB that also has an active backplate for cheaper.
> 
> I think the EKWB block looks nice but it's expensive and doesn't perform for what we're paying for it, I guess that's subjective because maybe someone might like the aesthetics and that's why they pay that price.
> 
> Historically, from my research, heat kler and alpha cool usually perform as well or better at a lower cost.
> 
> It would be nice to see more data points though. The reddit guy could have made a mistake with mounting torque.


You have to take deltas reported by different people with a grain of salt. Coolant temp sensors (really, the motherboards that translate the resistance into a temp reading) aren't that accurate: I've had the same sensor in the same loop read several degrees differently after swapping the motherboard, and I've seen different sensors read a few degrees apart on the same motherboard in the same place in the same loop. Then where they sit in the loop (right after the rad(s), right before the rad(s), somewhere in between, etc.) will change the reading too. One person can reliably compare the delta of different blocks by swapping them in the same loop back to back, but a distributed forum of people can't compare results apples to apples. I think of any delta comparison as being something like +/-5C.
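That +/-5C grain of salt is roughly what falls out of propagating two imperfect sensors; a quick sketch assuming each reading carries a couple of degrees of independent random error (the 2.0C figures are assumptions for illustration):

```python
import math

# A block "delta" is core_temp - coolant_temp, so the random errors of
# the two readings add in quadrature (assuming independent errors).
core_err_c = 2.0     # assumed core-sensor uncertainty
coolant_err_c = 2.0  # assumed coolant-sensor uncertainty
delta_err_c = math.sqrt(core_err_c**2 + coolant_err_c**2)
print(f"delta uncertainty: +/-{delta_err_c:.1f} C")  # ~2.8 C
```

And that's before any systematic offsets (sensor placement, flow rate, mounting) are even considered, so cross-forum delta comparisons within a few degrees are basically noise.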


----------



## Arizor

PSA, lads, for those receiving Cablemod cables: remove the Cablemod "cover" over the connector (pictured) before you plug in.

Whilst it looks nice, the cover "slides" up and down with any pressure and can make it look like the cable is fully plugged in when it's not. This is especially true of vertical mounts, where this cover is all you can see.

Upon removing mine (usual paranoia), I noticed from the back there was a slight gap in the connection. You _really_ have to push these things to get them fully inserted.

Now, without this cover, I can clearly see whether my connector is fully in or not.


----------



## KingEngineRevUp

mirkendargen said:


> You have to take deltas reported by different people with a grain of salt. Coolant temp sensors (motherboards that translate the resistance into a temp reading actually) aren't that accurate, I've had the same sensor in the same loop read several degrees differently after swapping the motherboard. I've also seen different sensors read a few degrees differently on the same motherboard in the same place in the same loop. Then where they are in the loop (right after the rad(s), right before the rad(s), somewhere in between, etc) will change the reading in the same loop. One person can reliably compare the delta on different blocks putting them in the same loop back to back, but a distributed forum of people can't compare results apples-apples. I think of any delta comparison being something like +/-5c.


I agree with this, and I've said multiple times it would be nice to have more data points. But right now I've seen two people on reddit with an 18-20C delta while not even drawing 100% TDP.


----------



## Arizor

Port Royal seems so wonky with the 4090 (though perhaps it's also the drivers, perhaps it's 3DMark).

I can get 99.4-7% on all stress tests, run any game for hours, but as soon as I boot up Port Royal it will crash within seconds. So odd.

Used to be the one that would never crash on my 3090, meanwhile Cyberpunk with RT would crash in seconds. Now I can play maxxed out Cyberpunk for hours, but even the slightest overclock and Port Royal ****s the bed. So odd.


----------



## bmagnien

Anyone with bykski blocks - are there any thermal pads between the pcb and backplate? Their diagram seems to only show pads on the front, or am I missing something? If you did add back pads - where did you place them?


----------



## mirkendargen

bmagnien said:


> Anyone with bykski blocks - are there any thermal pads between the pcb and backplate? Their diagram seems to only show pads on the front, or am I missing something? If you did add back pads - where did you place them?


They didn't specify any, so I just copied what Gigabyte did on the stock cooler, plus added one behind the 12VHPWR connector. I had plenty of pads, so I wasn't worried about doing extra.

Here's what Gigabyte did. Nothing is torn off there; their placement was just...less than ideal lol.


----------



## msky73

bmagnien said:


> Anyone with bykski blocks - are there any thermal pads between the pcb and backplate? Their diagram seems to only show pads on the front, or am I missing something? If you did add back pads - where did you place them?


Just for comparison: the Alphacool layout of backplate thermal pads - page 3, in yellow.


----------



## Xarrion

Hi guys,
I'm trying to understand why my Gigabyte OC shows poor results in benchmarks. I don't have a 4K monitor (yet), just a 3440x1440, G-Sync off. Could it be I'm constrained by the CPU/RAM?
With stock Corsair 5600 RAM (which runs at 4800 MHz only! and caused chaos and crashes when EXPO was enabled), I got the following results:
Now I returned that really nasty Corsair RAM and installed Kingston 6000, which also doesn't want to run at 6000 MHz with EXPO, but it's stable at 5800 MHz.
The results went up a bit:
What's wrong? Should I return the card and get another? Port Royal especially shows very poor results.
Thanks for any help!


----------



## doom3crazy

Hey my fellow overclockers. I just picked up an MSI Gaming Trio 4090, and if I recall correctly, someone here posted about flashing the Suprim bios to this card and potentially getting a little more juice out of it. If so, could someone provide me with that bios? And does it matter whether the bios switch on the card is set to gaming or silent? Lastly, it's been some time since I've flashed any bios (back in the 980 Ti days) - is NVFlash still the tool everyone uses?


----------



## yzonker

KingEngineRevUp said:


> Well seems like EKWB might be a dud this generation, do you know when Heatkiller or Alphacool will be releasing their blocks? I know Corsair is releasing theirs in 2 weeks it seems. I'm just not sure about the aesthetics of the Corsair block. Not sure if I like or dislike it.


No I haven't seen any release dates for other blocks.


----------



## yzonker

KedarWolf said:


> So, are memory clocks going to suffer running an Optimus water block and RAM temps really low?


Probably some. The issue is that it cold-soaks at idle and then the memory doesn't want to run at as high a clock. An XOC bios might be enough to fix this, since those keep the mem running at full speed (or maybe prevent the GPU from idling?). I did consider using cheap pads because of this, but opted for the thermal putty. That might help a little when I'm intentionally heating the mem, but probably not for the idle cold-soak, is my guess.


----------



## doom3crazy

motivman said:


> I beg to differ. 600W bios on my 4090 trio, card draws over 570W using 3X8 cable
> 
> View attachment 2579319


you by chance have that bios you can share with me?  I am looking to flash my gaming trio msi 4090 with a better bios for higher TDP


----------



## PLATOON TEKK

J7SC said:


> @PLATOON TEKK - nice, ahem, dichotomy re. new 1,000 W XOC and existing 600 W connector melt posts here today


Haha it’s like the devil on one of your shoulders chanting “burn it down, more powah!”. Hopefully plugging it in fully and enough dielectric grease will do the trick. The HOF will be even more interesting since you’ll have to check two connectors lol


----------



## YouKnowSedri

Arizor said:


> Upon removing mine


Is it hard to remove?


----------



## Pi3.14

And the thing is, AMD could beat the 4090's fps in games for $999 - benchmarks coming soon.


----------



## jootn2kx

Pi3.14 said:


> And the thing about it is AMD could be better than 4090 fps in games for $999, benchmarks coming soon


Very unlikely. AMD has admitted the 7900 XTX will be a direct competitor to the 4080, not the 4090.
Also, the gap to the 4090 will be even wider when it comes to ray tracing performance.


----------



## Spiriva

I wouldn't be too worried about the connector. How many cases are there now worldwide of this problem? Like 20-24?
Nvidia has sold 100,000 4090s so far, and out of them 20-24 had bad cables. That's like 0.020%-0.024%.
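Quick sanity check on that math (the failure count and sales figure are the rough estimates from the post, not official numbers):

```python
# Back-of-napkin melted-connector failure rate (rough community estimates).
reported_failures = 24       # upper end of the reported case count
cards_sold = 100_000         # estimated 4090s sold so far
rate = reported_failures / cards_sold * 100
print(f"{rate:.3f}% of cards affected")
```

Even if the real report count were 10x higher, it would still be well under 0.3%.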


----------



## neteng101

Benchmark runs from my MSI 4090 Gaming Trio in case anyone is curious what a lesser 4090 does. [email protected] undervolt, +1500 memory, 480W max. Totally worth the upgrade, so much more efficient per frame compared to running the Galax bios on my old 3080 Ti.









I scored 17 277 in Time Spy Extreme (Intel Core i7-13700K, RTX 4090, 64-bit Windows 11) - www.3dmark.com

I scored 26 032 in Port Royal (Intel Core i7-13700K, RTX 4090, 64-bit Windows 11) - www.3dmark.com

I scored 10 214 in Speed Way (Intel Core i7-13700K, RTX 4090, 64-bit Windows 11) - www.3dmark.com


----------



## long2905

neteng101 said:


> Benchmark runs from my MSI 4090 Gaming Trio in case anyone is curious what a lesser 4090 does. [email protected] undervolt, +1500 memory, 480W max. Totally worth the upgrade, so much more efficient per frame compared to running the Galax bios on my old 3080 Ti.


now flash a 600w vbios on it and show us its true power


----------



## Zero989

long2905 said:


> now flash a 600w vbios on it and show us its true power


1000w*


----------



## motivman

doom3crazy said:


> you by chance have that bios you can share with me?  I am looking to flash my gaming trio msi 4090 with a better bios for higher TDP


Do a search man, that bios has been posted on here several times, plus my 4090 trio is long gone... so I deleted the bios from my pc.


----------



## motivman

Zero989 said:


> 1000w*


1000W on the 4090 is a waste, man. If Time Spy Extreme can't even pull anything close to 600W (the highest I've been able to pull is 577W at 1.1V), what do you need 1000W for? Until we can push these cards over 1.1V, 1000W isn't needed.


----------



## motivman

motivman said:


> change the extension to .rom, had to change to .pdf to upload here.


for those that need the 600W msi bios... people are too lazy to just do a SEARCH, SMH


----------



## Zero989

motivman said:


> 1000w on the 4090 is a waste man. If timespy extreme cannot even pull anything close to 600W (highest I have been able to pull is 577w at 1.1v), what do you need 1000W for? I guess until we can up the voltage of these cards to over 1.1V, 1000w is not needed.


If derbauer can get access to the tools anyone can

😁


----------



## neteng101

motivman said:


> for those that need the 600W msi bios... people are too lazy to just do a SEARCH, SMH


That's the version that came on the card before I flashed it. Still showing 450W, maximum 480W in GPU-Z - need to use a direct cable from the PSU for 600W or a different adapter?


----------



## cletus-cassidy

yzonker said:


> Well the chiller helps some. I lost about +125 on the VRAM OC though. Had to run +1675 max.
> 
> 
> Result not found - www.3dmark.com
> 
> And once again Speedway likes mem more than core clock. Barely budged.
> 
> I scored 11 050 in Speed Way (Intel Core i9-13900K, RTX 4090, 64-bit Windows 11) - www.3dmark.com
> 
> I also did a quick run in TS, but it was with HT off, e cores off so nerfed CPU score. Another minimal gain.
> 
> I scored 31 535 in Time Spy (Intel Core i9-13900K, RTX 4090, 64-bit Windows 11) - www.3dmark.com


Which chiller / power rating are you using out of curiosity?


----------



## J7SC

Xarrion said:


> Hi guys,
> I'm trying to understand the reason why my Gigabyte OC shows little results in the benchmarks. I don't have a 4K monitor (yet), just a 3440x1440, GSync off. Could it be I'm constrained by the CPU / RAM?
> With a stock Corsair 5600 ram (which runs at 4800 MHz only! and it caused chaos and crashes when EXPO is enabled), I got the following results:
> View attachment 2581213
> 
> View attachment 2581211
> View attachment 2581212
> 
> 
> Now I returned that really nasty Corsair ram and installed a Kingston 6000, which also doesn't want to work with 6000 MHz on Expo, but I see it stable at 5800 MHz.
> The results went up by a bit:
> 
> View attachment 2581214
> 
> 
> View attachment 2581218
> 
> View attachment 2581217
> 
> 
> What's wrong? should I return the card and get another? Especially Port Royal shows very poor results.
> Thanks for any help!


A couple of quick points / questions:
1.) Are you on the quiet or the OC vbios?
2.) Do you have MSI Afterburner installed? With the power and voltage sliders maxed (and your OC for GPU and VRAM dialed in)? Do run it a few times with HWInfo to check all temps (incl. VRAM).
3.) NVCP settings set to 'performance' per the usual steps?
4.) For now, roll the driver back to the earlier one (522.x).


----------



## bmagnien

bmagnien said:


> For anyone that's interested, I've compiled a downloadable collection of next-gen engine demos, available here: Demos - Google Drive
> 
> Included are:
> 
> UE5 City Demo (Matrix Awakens) - 19.1gb
> UE5 Broadleaf Forest - 1.7gb
> UE5 Connifer Forest (this one has awesome snow effects) -1.9gb
> Unity Enemies demo - 1.1gb
> Just download, extract, and run the executable. These are all real-time with various levels of interactivity (make sure to try the flashlight in the forest), and many quality settings to play with.
> 
> Enjoy!


Just added another:
UE5 Rainforest - 1.1GB

Now this really feels like 'can it run Crysis?'


----------



## yzonker

cletus-cassidy said:


> Which chiller / power rating are you using out of curiosity?


 https://a.co/d/e0Kp3hl


----------



## Gking62

yzonker said:


> https://a.co/d/e0Kp3hl


that's a nice unit, would you happen to have pics of your setup, i.e. build log by chance?


----------



## Panchovix

Arizor said:


> Port Royal seems so wonky with the 4090 (though perhaps it's also the drivers, perhaps it's 3DMark).
> 
> I can get 99.4-7% on all stress tests, run any game for hours, but as soon as I boot up Port Royal it will crash within seconds. So odd.
> 
> Used to be the one that would never crash on my 3090, meanwhile Cyberpunk with RT would crash in seconds. Now I can play maxxed out Cyberpunk for hours, but even the slightest overclock and Port Royal ****s the bed. So odd.


Same here. PR in my case really doesn't like 3045 MHz or more, while in all other benches and games 3075-3090 MHz is fine.

On my 3080, 2145-2160 MHz was fine for benches, including PR (in fact, PR was easier to pass than Time Spy, for example); in games I had to drop to 2115-2130 MHz.

Probably a few RT cores that are very sensitive to the PR bench in particular on my 4090, or something like that I guess lol


----------



## Sheyster

Arizor said:


> PSA lads receiving Cablemod cables: remove the Cablemod “cover” over the connector, pictured before you plug in.
> 
> whilst it looks nice, the cover “slides” up and down with any pressure, and can make it look like the cable is fully plugged in when it’s not. This is especially true of vertical mounts where this cover is all you can see.
> 
> Upon removing mine (usual paranoia), I noticed from the back there was a slight gap in the connection. You _really_ have to push these things to get them fully inserted.
> 
> Now without this cover I can clearly see my connector is fully in or not.


I have a Cablemod cable shipping soon. I assume this cover is there to help prevent bending of the cable close to the connector on the card? If so, I will probably leave it on but double-check that the plug is fully seated.


----------



## jediblr

Arizor said:


> I can get 99.4-7% on all stress tests, run any game for hours, but as soon as I boot up Port Royal it will crash within seconds. So odd.


You can run the Port Royal stress test for 20 loops at 99.7%, but can't run the Port Royal benchmark? For me, Port Royal is the best test for catching a failing OC on my 4090, so try lowering the memory OC.


----------



## Sheyster

motivman said:


> 1000w on the 4090 is a waste man. If timespy extreme cannot even pull anything close to 600W (highest I have been able to pull is 577w at 1.1v), *what do you need 1000W for*? I guess until we can up the voltage of these cards to over 1.1V, 1000w is not needed.


Hard mods I guess... I am not a member of that elite club.


----------



## motivman

neteng101 said:


> That's the version that came on the card before I flashed it. Still showing 450W, maximum 480W in GPU-Z - need to use a direct cable from the PSU for 600W or a different adapter?


You didn't flash the bios correctly then... I got 577W with this exact bios, using the 3x8-pin adapter that came with the card from the factory.


----------



## 4090

Sheyster said:


> I have a Cablemod cable shipping soon. I assume this cover is there to help prevent bending of the cable close to the connector on the card? If so, I will probably leave it on but double-check that the plug is fully seated.


No, the cover is to hide the ugly sense wire

From their homepage: "our new Cable Shroud is also included to help hide unsightly sense wires"


----------



## yzonker

Gking62 said:


> that's a nice unit, would you happen to have pics of your setup, i.e. build log by chance?


No, not really. I posted some pics in the 3090 thread a long time ago - buried in there. It's located remotely in another room with its own reservoir and pump.


----------



## Globespy

Is there really any point in these high VRAM OCs on the 4090 other than for benchmark pissing contests?
I've tried running +1500 on the mem vs +350 and the average FPS in games is the same, which makes me wonder: what's the point of adding additional stress and heat for minimal gains at best?


----------



## Panchovix

Globespy said:


> Is there really any point in these high VRAM OC's on the 4090's other than for benchmark pissing contests?
> I've tried running+1500 on the Mem vs +350 and the average FPS in games is the same which makes me wonder what's the point of adding additional stress and heat for minimal gains at best?


In my case, I got a good performance bump from +0 to +1000 MHz (about 5% or so). My VRAM sucks, so I can't do more than +1150 MHz, and from +1000 to +1150 I get maybe 0.5-1%, which is a lot for such a small jump.

I would guess the jump from +1000 to +1500 MHz is pretty nice, and from +0 to +1500 a noticeable one.

Though IMO it's up to each person whether to OC that much; since my Pascal card, I tend to undervolt (less voltage for the same stock clocks, maybe a little more) and OC the VRAM.

EDIT: to add, in my case, based on some game tests:

Stock (2760 MHz core, +0 VRAM, default voltage) = 100% (up to 450W in games)
OC (3045 MHz core, +1000 MHz VRAM, 1.1V) = 109-111% depending on the game (up to 550W in games)
UV + OC (2820 MHz core, +1000 MHz VRAM, 0.990V) = 104-106% depending on the game (up to 380W in games)
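For what it's worth, those three profiles make the perf-per-watt trade-off easy to quantify. A rough sketch, using the midpoints of the ranges above (game-dependent, so only indicative):

```python
# Rough perf-per-watt comparison from the three profiles reported above.
profiles = {
    "stock": {"perf": 100.0, "watts": 450},
    "oc":    {"perf": 110.0, "watts": 550},  # midpoint of 109-111%
    "uv+oc": {"perf": 105.0, "watts": 380},  # midpoint of 104-106%
}
for name, p in profiles.items():
    print(f"{name:6s}: {p['perf'] / p['watts']:.3f} %perf per watt")
```

The undervolt+OC profile comes out roughly 24% more efficient than stock while still being faster, which is why so many people run UV+OC daily and save the full OC for benching.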


----------



## Xarrion

J7SC said:


> A couple of quick points / questions:
> 1.) Are you on the quiet or the OC vbios?
> 2.) Do you have MSI Afterburner installed? With the power and voltage sliders maxed (and your OC for GPU and VRAM dialed in)? Do run it a few times with HWInfo to check all temps (incl. VRAM).
> 3.) NVCP settings set to 'performance' per the usual steps?
> 4.) For now, roll the driver back to the earlier one (522.x).


1) I'm on the OC bios. Just for the sake of trial I switched to the Silent bios and there wasn't much of a difference.
2) Yes, I have it. Power limit at 133% (maximum) and voltage at +0%; didn't touch that. Is it really safe to put voltage at +100%?? I thought we normally deal with ~+10% max here...
3) Yes.
4) Done that now.
Added some overclocking (+10% voltage on the GPU and 133% power target). Now I get past the first hump in Port Royal, but I'm still very far from the average...
Checked the temps like you suggested - they seem fine; max VRAM temp was 36 degrees.


----------



## Zero989

Globespy said:


> Is there really any point in these high VRAM OC's on the 4090's other than for benchmark pissing contests?
> I've tried running+1500 on the Mem vs +350 and the average FPS in games is the same which makes me wonder what's the point of adding additional stress and heat for minimal gains at best?


Nah, set manual V/F curve and be done. Overclocking to 3.1Ghz core and 1550Mhz on the memory though is 10% in gains.


----------



## doom3crazy

motivman said:


> Do a search man, that bios has been posted on here several times, plus my 4090 trio is long gone... so I deleted the bios from my pc.





motivman said:


> for those that need the 600W msi bios... people are too lazy to just do a SEARCH, SMH





neteng101 said:


> That's the version that came on the card before I flashed it. Still showing 450W, maximum 480W in GPU-Z - need to use a direct cable from the PSU for 600W or a different adapter?





motivman said:


> you didn't flash the bios correctly then... I got 577w with this exact bios, using the 3 pin adapter that came with the card from factory.


Not lazy, man. I did search and couldn't find the 600W bios. What you just posted is from your own post providing the original Gaming Trio bios, because someone jumped the gun and didn't save theirs before flashing. I believe we need the Suprim X bios, if I'm not mistaken?


----------



## bmagnien

Panchovix said:


> Same here, PR on my case really doesn't likes 3045Mhz or more, meanwhile on all other benchs, and games, 3075-3090Mhz is fine.
> 
> On my 3080, 2145-2160Mhz was fine for benchs, including PR (in fact, PR was easier to go through than TimeSpy for example), on games I had to go to 2130-2115Mhz.
> 
> Probably a few RT cores that are very sensible to the PR bench in particular on my 4090, or something like that I guess lol


Exact same experience here. On the 3090 I could push benches way higher without crashing, but had to drop to a more reasonable OC to be 100% stable across all major game benchmarks.

On the 4090, Port Royal will crash at 3075 but I can go way higher in all games without issue.


----------



## lawson67

Hi guys, I have just become the proud owner of the Zotac RTX 4090 AMP Extreme AIRO. I've noticed that in Afterburner I can't move the voltage slider - voltage is unlocked etc. in settings, but still no dice even after a restart. Has anyone else come across this?


----------



## doom3crazy

lawson67 said:


> Hi guys, I have just become the proud owner of the Zotac RTX 4090 AMP Extreme AIRO. I've noticed that in Afterburner I can't move the voltage slider - voltage is unlocked etc. in settings, but still no dice even after a restart. Has anyone else come across this?


Update Afterburner.


----------



## lawson67

doom3crazy said:


> Update after burner


I've got the latest version. Edit: there was a beta version - now it works, thanks mate!


----------



## mirkendargen

msky73 said:


> Just for comparison. Alphacool layout of backplate thermalpads - page 3 in yelllow.


That square frame "GPU pad" is interesting, Gigabyte didn't have anything even close to that from the factory. I wonder if other cards have something closer to that.


----------



## motivman

doom3crazy said:


> Not lazy man. I did search and could not find the 600w bios. What you just posted is from your own post providing the original gaming trio bios because someone jumped the gun and didn’t save their original before flashing. I believe we need the suprim x bios if I’m not mistaken?
> View attachment 2581295


Here you go, sorry I was not in a good mood this morning, lol.


----------



## Arizor

Sheyster said:


> I have a Cablemod cable shipping soon. I assume this cover is there to help prevent bending of the cable close to the connector on the card? If so, I will probably leave it on but double-check that the plug is fully seated.


It’s simply there to hide the look of the cables.

super easy to remove to those who asked - just slide it down and off.


----------



## neteng101

motivman said:


> Here you go, sorry I was not in a good mood this morning, lol.


You had me there - I could have sworn that was the original Gaming Trio bios. This one works: 600W unlocked.

Edit - damn impressive - I can hit 3GHz now and 27k+ in PR...









I scored 27 112 in Port Royal (Intel Core i7-13700K, RTX 4090, 64-bit Windows 11) - www.3dmark.com





And I was just going to be dumb and happy living with 480w. LOL. Probably going to keep this and just run full bore!


----------



## doom3crazy

motivman said:


> Here you go, sorry I was not in a good mood this morning, lol.


Totally okay man. I appreciate you! I have those days too. And I get it on the search thing. Sometimes I get annoyed with people being like “gimme gimme gimme” without taking any time to try and find a solution to the problem / find an answer etc haha


----------



## Zero989

doom3crazy said:


> Not lazy man. I did search and could not find the 600w bios.


could have just asked me lol


----------



## J7SC

Xarrion said:


> 1) I'm on OC bios. Just for the sake of trial I switched to Silent bios and there's not been much of a difference.
> 2) Yes, I have it. With power limit 133% (maximum) and voltage to +0%, didn't touch that. Is it really safe to put voltage +100%?? I thought we are normally dealing with max ~+10% here...
> 3) Yes.
> 4) Done that now.
> Added some overclocking (+10% voltage on GPU and 133% of target power level). Now I got over the first hump in Port Royal, but still I'm very far away from the average...
> Checked the temps like you suggested - they seem very fine, max temp of VRAM was 36 degrees.
> 
> View attachment 2581292
> 
> 
> View attachment 2581294


...try with the voltage slider maxed to the right - that kicks in an extra power envelope. On air at 20+ C ambient, my Gigabyte Gaming OC scores in the mid-28k range in PR (on a 5950X setup), and my GPU shows about 560W+ peak then. Just make sure to do a few test runs with HWInfo logging (because it also lists the VRAM temps). I would also add +1000 MHz to the VRAM - remember, this is GDDR6X, so the actual MHz gain is far less. Still, everything you do is at your own responsibility = legal fine print
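To put the "actual gain is far less" point in perspective, here's a rough bandwidth calculation. It assumes the Afterburner offset maps roughly 1:1 onto the effective transfer rate in MT/s, which matches common reports but isn't guaranteed for every card:

```python
# Rough 4090 GDDR6X bandwidth math: 384-bit bus, 21 Gbps (21,000 MT/s) stock.
# Assumption: an Afterburner +1000 offset ~= +1000 MT/s effective.
BUS_BITS = 384

def bandwidth_gbs(effective_mts):
    """Memory bandwidth in GB/s for a given effective transfer rate (MT/s)."""
    return effective_mts * BUS_BITS / 8 / 1000

stock = bandwidth_gbs(21_000)
plus_1000 = bandwidth_gbs(22_000)
print(f"stock {stock:.0f} GB/s -> +1000 offset {plus_1000:.0f} GB/s "
      f"({(plus_1000 / stock - 1) * 100:.1f}% more)")
```

So a +1000 offset is only about a 4.8% bandwidth bump, which is why big-looking memory offsets translate into fairly modest fps gains.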


----------



## doom3crazy

Zero989 said:


> could have just asked me lol


I made a post asking if anyone had it but I think people missed it - it got buried. Had I known who to ask, I would have totally asked you!  haha


----------



## Krzych04650

Finally managed to buy a 4090. It's the Inno3D reference-PCB model with a 450W power limit and a single BIOS, so that sucks, but it was actually a bit cheaper than the official 9699 PLN MSRP - I got it for 9500, and random models go for 11000+ now, with the most desirable ones like Asus and MSI going for 13000 - so it was a very good deal. Will see how it behaves. Alphacool already has an ES block for it, so it all makes sense.

Not exactly what I wanted, but I guess you have to make do with what's available in the current circumstances. I'm not going to sit around waiting for half a year; in fact, I just finished tuning the new platform - went from a 6900K to a 13900K, over 2x performance gain in games - and all that's left now is the GPU. The 2080 Ti is so comically underpowered for this system that I had to create custom 640p or even 540p resolutions just to bench it in some of the games I used for testing without hitting GPU limits.


----------



## TSportM

motivman said:


> Here you go, sorry I was not in a good mood this morning, lol.


Hello, is this bios only for the Gaming X Trio?

I would like to test 600W on my Suprim X.

Cheers


----------



## Zero989

TSportM said:


> hello is this bios only for the gaming x trio?
> 
> would like to test 600w on my suprim X
> 
> cheers


It's a special BIOS from select Liquid X cards (not all Liquid X cards have it, they go to 530W only). MSI is really annoying in this regard.






SuprimX4090G.rom - drive.google.com


----------



## doom3crazy

Zero989 said:


> It's a special BIOS from select Liquid X cards (not all Liquid X cards have it, they go to 530W only). MSI is really annoying in this regard.
> 
> SuprimX4090G.rom - drive.google.com


I am trying to save my original bios before I flash my Gaming Trio 4090, and it keeps saying "bios reading not supported". Am I missing something? Currently on the latest drivers, 526.47.


----------



## neteng101

doom3crazy said:


> I am trying to save my original bios before I flash on my gaming trio 4090 and it keeps saying "bios reading not supported" am I missing something? Currently on latest drivers 526.47


Get the latest Nvflash - yours is old.









NVIDIA NVFlash (5.792.0) Download - NVIDIA NVFlash is used to flash the graphics card BIOS on Ampere, Turing, Pascal and all older NVIDIA cards. (www.techpowerup.com)





Also it seems they flipped the naming by accident - nvflash32 is actually the 64-bit exe in the latest release.


----------



## doom3crazy

neteng101 said:


> Get the latest Nvflash - yours is old.


I forgot you can do it with nvflash. I was actually trying to save it using the latest GPU Z
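For anyone following along, the usual nvflash backup-then-flash sequence looks roughly like this (a sketch only - filenames are examples, and with DRY_RUN=1 the commands are just printed, nothing is flashed; cross-flashing is at your own risk):

```shell
#!/bin/sh
# Sketch of the typical nvflash backup-then-flash sequence.
# DRY_RUN=1 only prints each command instead of executing it.
DRY_RUN=1

run() {
    if [ "${DRY_RUN:-1}" = 1 ]; then
        echo "$@"            # dry run: show the command
    else
        "$@"                 # real run: execute it
    fi
}

run nvflash64 --save original_trio.rom   # ALWAYS back up the current vBIOS first
run nvflash64 --protectoff               # disable the EEPROM write protect
run nvflash64 -6 new_bios.rom            # flash; -6 overrides the subsystem ID mismatch prompt
```

Keep the saved .rom somewhere safe so you can always flash back if the new bios misbehaves.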


----------



## doom3crazy

neteng101 said:


> Get the latest Nvflash - yours is old.
> 
> Also it seems they flipped the naming by accident - nvflash32 is actually the 64-bit exe in the latest release.


At the risk of spreading misinformation, I would double-check that. Or maybe it depends on which server you select when downloading? I checked mine and everything checks out: 32-bit is 32 and 64-bit is 64.


----------



## Gking62

neteng101 said:


> Get the latest Nvflash - yours is old.
> 
> Also it seems they flipped the naming by accident - nvflash32 is actually the 64-bit exe in the latest release.


sorry, how did you figure this out?


----------



## neteng101

doom3crazy said:


> At the risk of spreading mis information, I would double check that. Or maybe it depends on which server you're selecting when downloading? I checked mine and everything checks out. 32bit is 32 and 64bit is 64.


It's possible they fixed it since I pulled it down.

I have 9/7/22 file dates on my exes.

Tried downloading again; it's right now. Maybe it's one of the download servers?


----------



## Xarrion

J7SC said:


> ...try with the voltage slider maxed to the right - that kicks in an extra power envelope. On air with 20+ C, my Gigabyte Gaming OC scores in the mid 28k range for PR (on a 5950X CPU setup), my GPU shows about 560W ++ peak then. Just make sure to do a few test runs just w/ HWI logging (because it also lists the VRAM temps). I would also add +1000 Mhz to the VRAM - remember, this is GDDR6X so the actual MHz gain is far less. Still, everything you do at your own responsibility = legal fineprint


Hah. Just crossed my fingers and did as you said - new hardware should survive a bit of mishandling!
I was able to gain another 500 points that way - still not very close to the typical result. The GPU power rails report a max of 520W, VRAM is at 58 degrees (edited: thought you wanted VRM temps), and the max GPU hot spot was 79.2. GPU-Z reports the Performance Cap reason as VRel the whole time. Could it be a PSU problem?

Are you running at 4K? I'm not sure how to push it further, and I'm tempted to blame my CPU for bottlenecking at 1440p.


----------



## Xarrion

I just ran 3DMark's PCIe bandwidth test and the results are strange. I got 3.39 GB/s - "I scored 0 in PCI Express". According to Google, a proper PCIe 4.0 x16 link should give *32 GB/s*. Could someone confirm they are getting ~32 GB/s in 3DMark's "PCI Express feature test"?
I'm using a vertical GPU mount and a riser cable from LinkUp - Extreme4+. Could this be a faulty cable?
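The theoretical per-direction numbers make it easier to see what a result like 3.39 GB/s points at. A quick sketch (gen 3/4 use 128b/130b encoding, gen 1/2 use 8b/10b; real-world test results land somewhat below these ceilings):

```python
# Theoretical PCIe per-direction bandwidth, including encoding overhead.
def pcie_gbs(gts_per_lane, lanes, overhead=128 / 130):
    """Bandwidth in GB/s: per-lane GT/s * lanes * encoding efficiency / 8 bits."""
    return gts_per_lane * lanes * overhead / 8

print(f"4.0 x16: {pcie_gbs(16, 16):.1f} GB/s")
print(f"3.0 x16: {pcie_gbs(8, 16):.1f} GB/s")
print(f"1.1 x16: {pcie_gbs(2.5, 16, 8 / 10):.1f} GB/s")  # 8b/10b encoding
```

~3.4 GB/s is right around the gen 1.1 x16 ceiling, so the link is likely training down to gen 1 speed rather than dropping lanes - consistent with a marginal riser cable or a mobo BIOS issue.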


----------



## Shaded War

Installed the newest 526.47 drivers and started having a bunch of flickering issues in MW2, similar to a bad display cable. I didn't have this issue on 522.25, though there were a few textures that would pop in. Hopefully rolling back to the older drivers fixes it.


----------



## Zero989

Shaded War said:


> Installed the newest 526.47 drivers and started having a bunch of flickering issues in MW2, similar to a bad display cable. Didn't have this issue on 522.25 but there was a few textures that would pop in. Hopefully rolling back to older drivers fixes it.


Use the hotfix 😐


----------



## Arizor

Xarrion said:


> I just ran 3DMark's PCIe bandwidth test and the results are strange. I got 3.39 GB/s - I scored 0 in PCI Express. According to google, a proper PCIe 4.0 x16 should give *32 GB/s*. Could someone confirm they are getting 32 GB/s in 3DMark's "PCI Express feature test"?
> I'm using a vertical GPU placement and a riser cable from LinkUp - Extreme4+. Could this be a faulty cable?


Yep I'm on a riser cable and I get 28GB/s on that test, I would definitely say it's your riser cable (I'm using the Coolermaster vertical mount V3 with its included PCIe 4.0 riser).


----------



## Panchovix

Xarrion said:


> Could someone confirm they are getting 32 GB/s in 3DMark's "PCI Express feature test"?


I did a quick test with a ton of things in the background lol (not my bench OS) and I got 26.19 GB/s, if that helps.


----------



## mirkendargen

Xarrion said:


> I just ran 3DMark's PCIe bandwidth test and the results are strange: I got 3.39 GB/s and scored 0 in PCI Express. According to Google, a proper PCIe 4.0 x16 link should give *32 GB/s*. Could someone confirm they are getting around 32 GB/s in 3DMark's "PCI Express feature test"?
> I'm using a vertical GPU mount and a riser cable from LinkUp - Extreme4+. Could this be a faulty cable?


Get the latest BIOS for your mobo; every X670E board has an update with something in the notes about "fix PCIe speed on 4090". I'm using a 20 cm LinkUp riser cable ("Ultra", not "Extreme+") with no issues.
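As a side note on the numbers in this sub-thread: the oft-quoted ~32 GB/s is the raw PCIe 4.0 x16 line rate after encoding overhead, which is why a healthy link still benches in the 26-28 GB/s range once packet overhead is included. A quick back-of-the-envelope sketch, using only the public PCIe 4.0 figures (nothing here comes from the posts themselves):

```python
# Theoretical PCIe 4.0 x16 bandwidth, one direction, before protocol overhead.
GT_PER_S = 16e9        # PCIe 4.0: 16 gigatransfers/s per lane
ENCODING = 128 / 130   # 128b/130b line coding eats ~1.5%
LANES = 16

bytes_per_s = GT_PER_S * ENCODING * LANES / 8  # 8 bits per byte
print(f"{bytes_per_s / 1e9:.2f} GB/s")  # ~31.51 GB/s ceiling
```

3DMark's feature test additionally pays TLP/DLLP packet overhead, so 26-28 GB/s is a normal pass, while 3.39 GB/s suggests the link trained at a lower generation or width.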


----------



## bradgraves86

Just got on here and only read the last couple of pages, so sorry for asking stupid questions. I have a Strix 4090, and in all the games I play and in 3DMark testing I haven't seen the card use more than 490 watts. It is overclocked to 3980 core and 2200 MHz memory. I know the memory can go higher, but I was wondering if it ever uses more watts than that normally. I am using a Thermaltake GF3 1650 W PSU with the native 600 W power cable. So far so good on the melting, I think.


----------



## Arizor

bradgraves86 said:


> Just got on here and only read the last couple of pages, so sorry for asking stupid questions. I have a Strix 4090, and in all the games I play and in 3DMark testing I haven't seen the card use more than 490 watts. It is overclocked to 3980 core and 2200 MHz memory. I know the memory can go higher, but I was wondering if it ever uses more watts than that normally. I am using a Thermaltake GF3 1650 W PSU with the native 600 W power cable. So far so good on the melting, I think.


I'm assuming you meant to type 3080, not 3980. The wattage will fluctuate based on the specific load; no game really wants all 600 watts (unless you're running something experimental or crazy like Quake 2 RTX).

What are you using to overclock? Most of us recommend the latest beta of Afterburner (AB).

If using AB, somewhere in the 3ghz range for your core is fine, and you should be able to add +1500mhz quite easily for your memory.

As myself and others have mentioned though, massively diminished returns here for running your day-to-day clocks and wattage so high.

These cards are really good undervolters. I run at 0.95V @ 2730mhz, +1650 on the memory, wattage regularly sits in the low 300s in most games and temps stay in the high 50s.


----------



## gamerMwM

Just got a FE 4090. Booted it up with MSI Afterburner. Should Unlock voltage control be set to Reference or Third Party in settings?

Sent from my SM-G975U using Tapatalk


----------



## bradgraves86

Arizor said:


> I'm assuming you meant to type 3080, not 3980  . The wattage will fluctuate based upon the specific load, no games really want all 600 watts (unless you're running something experimental or crazy like Quake 2 RTX).
> 
> What are you using to overclock? Most of us recommend the latest beta of Afterburner (AB).
> 
> If using AB, somewhere in the 3ghz range for your core is fine, and you should be able to add +1500mhz quite easily for your memory.
> 
> As myself and others have mentioned though, massively diminished returns here for running your day-to-day clocks and wattage so high.
> 
> These cards are really good undervolters. I run at 0.95V @ 2730mhz, +1650 on the memory, wattage regularly sits in the low 300s in most games and temps stay in the high 50s.


Right. I'm using MSI Afterburner. I only have the memory up 500 MHz, so I'll try a little higher than that. But in the Afterburner OSD, the highest GPU power draw I see is about 470 W. I was just hoping to try a different benchmark (maybe FurMark or something) to see if I can get it to go higher. I just wanted to make sure it's not being limited to 450 somehow. Thanks for the help.


----------



## Arizor

gamerMwM said:


> Just got a FE 4090. Booted it up with MSI Afterburner. Should Unlock voltage control be set to Reference or Third Party in settings?
> 
> Sent from my SM-G975U using Tapatalk


Either should work as long as you have the latest AB beta.



bradgraves86 said:


> Right. I'm using MSI Afterburner. I only have the memory up 500 MHz, so I'll try a little higher than that. But in the Afterburner OSD, the highest GPU power draw I see is about 470 W. I was just hoping to try a different benchmark (maybe FurMark or something) to see if I can get it to go higher. I just wanted to make sure it's not being limited to 450 somehow. Thanks for the help.


Download and run Quake 2 RTX on Steam (it's free). Turn on all the options and turn up the resolution render to 200% in the options. You'll quickly see how high you can go...!


----------



## cletus-cassidy

bmagnien said:


> Exact same experience here. 3090 benches I could push way higher without crash, but had to lower to a more reasonable OC to be 100% stable across all major game benchmarks.
> 
> on 4090 Port Royal will crash at 3075 but I can go way higher in all games without issue.


Same for me. Time Spy tripped my 3090 if it was over 2115 but PR could do 2160. Now it’s reversed. My effective clocks generally don’t get above 3000 much though.


----------



## cletus-cassidy

neteng101 said:


> It's possible they fixed it since I pulled it down.
> 
> I have 9/7/22 file dates on my exes.
> 
> Tried downloading again; it's right now. Maybe it's one of the download servers?


You aren’t crazy. The same happened to me. I had to use the 32-bit one because every time I used the 64-bit it said I was on a 64-bit OS. They must have flipped the file names.


----------



## Panchovix

Arizor said:


> If using AB, somewhere in the 3ghz range for your core is fine, and you should be able to add +1500mhz quite easily for your memory.
> 
> As myself and others have mentioned though, massively diminished returns here for running your day-to-day clocks and wattage so high.
> 
> These cards are really good undervolters. I run at 0.95V @ 2730mhz, +1650 on the memory, wattage regularly sits in the low 300s in most games and temps stay in the high 50s.


It hurts reading this with my +1000 to +1100 MHz VRAM 

How much more performance do you get with that UV + VRAM OC vs stock? I get about 5% or so.



neteng101 said:


> It's possible they fixed it since I pulled it down.
> 
> I have 9/7/22 file dates on my exes.
> 
> Tried downloading again; it's right now. Maybe it's one of the download servers?


I can confirm it, had to do the same when I downloaded it some days ago


----------



## J7SC

Xarrion said:


> Hah. Just crossed the fingers and did like you said - new hardware should survive a bit of mishandling!
> I was able to gain another 500 points that way - still not close to the average results. The GPU power rails report a max of 520 W, the VRAM is at 58 °C (edited: I thought you wanted VRM temps), and the max GPU hot-spot temp was 79.2 °C. GPU-Z reports the performance cap reason as VRel the whole time. Could it be a PSU problem?
> 
> Are you running at 4K? I'm not sure how to push it further and am tempted to blame my CPU for bottlenecking at 1440p.





Xarrion said:


> I just ran 3DMark's PCIe bandwidth test and the results are strange: I got 3.39 GB/s and scored 0 in PCI Express. According to Google, a proper PCIe 4.0 x16 link should give *32 GB/s*. Could someone confirm they are getting around 32 GB/s in 3DMark's "PCI Express feature test"?
> I'm using a vertical GPU mount and a riser cable from LinkUp - Extreme4+. Could this be a faulty cable?





Arizor said:


> Yep I'm on a riser cable and I get 28GB/s on that test, I would definitely say it's your riser cable (I'm using the Coolermaster vertical mount V3 with its included PCIe 4.0 riser).


...looks like you might have found the culprit / the far bigger part of the issue (PCIe), notwithstanding the 500-point pick-up. BTW, I am also using the LinkUp PCIe 4.0 cables (one with the 4090, another in a different build) with zero problems... I take it yours is a PCIe 4.0 cable (not 3.0)?

First, open _CPU-Z_ and check 'Mainboard' at the bottom of the screen. You might have to put a slight load on the GPU in case it 'downshifted' for power savings - a screengrab of that, please, under slight load. Next, you can try setting 'Gen3' for PCIe in the mobo BIOS. You can of course also test without the riser cable altogether if you can fit the monster card into your case. While at it, check the PCIe slot on the mobo for dust (it happens) and inspect the little gold-plated fingers on the card's own PCIe connector...


----------



## cletus-cassidy

yzonker said:


> https://a.co/d/e0Kp3hl


Couple of options there. Do you have the 1/4 HP? Asking because I bought a 1/2 HP about a year ago that is still sitting in the box, and I'm curious whether it could even cool these beasts - a 600 W GPU and a 300 W CPU.


----------



## Arizor

Panchovix said:


> It hurts reading this with my +1000-+1100 Mhz VRAM
> 
> How much more performance you get with that UV + VRAM OC vs stock? I get about 5% or so


Yeah, 5-7% depending on the game. I wouldn't worry - as long as you can sit around +1000 on the mem, it's diminishing returns the further you go.


----------



## J7SC

...want extra benchie scores ?


----------



## Xarrion

Thanks everyone so much!! @Arizor @Panchovix @mirkendargen @J7SC for your prompt reactions and advice.
Updated my mobo BIOS to the latest version - I'm getting normal bandwidth now, 26.78 GB/s.
CPU-Z reports proper bandwidth under load:









Now I'm getting much better scores in Port Royal - 26 459. Gotta try clocking my RAM to its advertised 6000 MHz; for now I ran it at the 4800 MHz stock speed, as I was getting some errors with EXPO enabled.

P.S. Now the glitches start - MSI Afterburner no longer sees the GPU sensors to display, and 3DMark says I'm running two video adapters, the second being the Microsoft Basic Display Adapter.


----------



## changboy

What do you think of these thermal pads for the price?

Thermalright Thermal Pad 12.8 W/mK, 85 x 45 x 2 mm, non-conductive, heat-resistant, high-temperature-resistant silicone thermal pads for laptop/GPU/LED cooler (2 mm) : Amazon.ca: Electronics

www.amazon.ca


----------



## th3illusiveman

Spiriva said:


> I wouldn't be too worried about the connector. How many cases are there now worldwide of this problem? Like 20-24ish?
> Nvidia sold 100,000 4090s so far, and out of them 20-24 had bad cables? That's like 0.020%-0.024%.


Nvidia reported all the 4090s they've sold; we don't know all the 4090s that have burnt up. The fact that this is an issue at all is quite pathetic on a $1,600 GPU - they just threw in the cheapest, crappiest adapter they could make to save some $$$.


----------



## yzonker

Anybody happen to know what kind of heater the LN2 guys use on the VRAM?


----------



## gamerMwM

Panchovix said:


> It hurts reading this with my +1000-+1100 Mhz VRAM
> 
> How much more performance you get with that UV + VRAM OC vs stock? I get about 5% or so
> 
> 
> I can confirm it, had to do the same when I downloaded it some days ago


Yeah my Nvidia FE 4090 max semi-stable (still testing, 1st day) OC is:
+225 Core
+1200 Vram
85% Fans

I think my core clocks are decent as I can get above 3000 Mhz Avg.

I also thought I'd get more on the VRAM, but my max stable memory OC averages around +1475 MHz or less. I see the best VRAM scores in 3DMark are around +1525 MHz or higher.

Guess my memory is decent. But I'm more satisfied with how the core clocks on my card.

Anyway, it's not like I'm gonna run a crazy OC for a daily driver while gaming. Just happy overall with what the 4090 can do!

Sent from my SM-G975U using Tapatalk


----------



## motivman

gamerMwM said:


> Yeah my Nvidia FE 4090 max semi-stable (still testing, 1st day) OC is:
> +225 Core
> +1200 Vram
> 85% Fans
> 
> I think my core clocks are decent as I can get above 3000 Mhz Avg.
> 
> I also thought I should get more on the VRAM, but my Max stable Memory OC I'm avg. around 1475 mhz or less. I see the best Vram scores in 3DMark are around 1525 mhz or higher.
> 
> Guess my memory is decent. But I'm more satisfied with how the core clocks on my card.
> 
> Anyway, it's not like I'm gonna run a crazy OC for a daily driver while gaming. Just happy overall with what the 4090 can do!
> 
> Sent from my SM-G975U using Tapatalk


my current FE is garbage:

best I can do is:
+150 core
+1200 Vram

hope my new FE overclocks better...


----------



## Zero989

changboy said:


> What do you think of these thermal pads for the price?
> 
> Thermalright Thermal Pad 12.8 W/mK, 85 x 45 x 2 mm, non-conductive, heat-resistant, high-temperature-resistant silicone thermal pads for laptop/GPU/LED cooler (2 mm) : Amazon.ca: Electronics
> 
> www.amazon.ca


This is better https://www.digikey.ca/en/products/detail/t-global-technology/tg-pp10-50/6204863


----------



## Gking62

...


----------



## Gking62

changboy said:


> What do you think of these thermal pads for the price?
> 
> Thermalright Thermal Pad 12.8 W/mK, 85 x 45 x 2 mm, non-conductive, heat-resistant, high-temperature-resistant silicone thermal pads for laptop/GPU/LED cooler (2 mm) : Amazon.ca: Electronics
> 
> www.amazon.ca


these are newer...

Thermalright Thermal Pad 14.8 W/mK Non Conductive Heat Dissipation Silicone Pad for PC Laptop Heatsink CPU/GPU/LED Graphics Card Motherboard Silicone Grease Multi-Size Pad (120x120x2.0mm) https://a.co/d/gtOkx1C


----------



## mirkendargen

Xarrion said:


> Thanks everyone so much!! @Arizor @Panchovix @mirkendargen @J7SC for your prompt reactions and advices.
> Updated my mobo BIOS to the latest version - getting the normal bandwidth now, 26.78 GB/s.
> CPU-Z reports proper bandwidth under load:
> View attachment 2581373
> 
> 
> Now I'm getting much better scores at Port Royal - 26 459 (I scored 26 459 in Port Royal). Gotta try to clock my RAM to its advertised 6000 MHz, now I ran it with 4800Mhz stock speed as I was getting a bit of errors with EXPO activated.
> 
> P.S. Now the glitches start - MSI Afterburner doesn't see the GPU sensors anymore to display, and 3DMark says I'm running with 2 video adapters, second being Microsoft Basic Video Adapter.


iGPU enabled?


----------



## changboy

Ok, I saw them on Amazon Canada; they're more expensive but the package includes more.
Old: 85x45x2 mm: $20.99
New: 120x120x2 mm: $54.21


----------



## gamerMwM

Any good water block recommendations for the FE?


----------



## motivman

gamerMwM said:


> Any good water block recommendations for the FE?


definitely not EKWB, lol


----------



## TSportM

Zero989 said:


> It's a special BIOS from select Liquid X cards (not all Liquid X cards have it, they go to 530W only). MSI is really annoying in this regard.
> 
> 
> 
> 
> 
> 
> SuprimX4090G.rom
> 
> 
> 
> 
> 
> 
> 
> drive.google.com


Ok, thanks, but is it compatible with all Suprim X cards? Mine is not the Liquid.

cheers


----------



## minitt_1216

Hulk1988 said:


> I had the Waterforce card and I cannot recommend it. The temperatures are similar compared to my Strix and Suprim X and the memory temperature was even higher on the Waterforce card.


Using this (GV-N4090AORUSX W-24GD), the temps have been excellent so far - about 10 °C cooler than air-cooled models, around 50-54 °C at full load in gaming at 460 RPM / 30% fan, which is very quiet. This is probably because the 4090 doesn't have any memory chips on the top side.


----------



## sugi0lover

bmagnien said:


> Anyone with bykski blocks - are there any thermal pads between the pcb and backplate? Their diagram seems to only show pads on the front, or am I missing something? If you did add back pads - where did you place them?


There are no thermal pads between the PCB and backplate~


----------



## Nico67

cletus-cassidy said:


> Couple of options there. You have the 1/4 HP? Asking because I bought a 1/2 HP about a year ago that is still sitting in the box and curious if it could even cool these beasts 600w GPU and 300 w CPUs.


1/2 HP is good for around 790 W; depending on your liquid capacity it can absorb more, just without getting temps as low. You are right on the limit of needing to go up to 1 HP, but they may scale power back a bit next gen? You kind of have to hope they'll get things back to more reasonable figures.


----------



## Xarrion

mirkendargen said:


> Igpu enabled?


Not sure how to disable it lol. Will try DDU.


----------



## Benni231990

Why do you all push the RAM only to +1000-1300? My RAM can do +1750.
Is less clock better?

I see the first flicker on screen at +2000, so I reduced to +1750 in the hope it doesn't error-correct and doesn't crash.


----------



## Azazil1190

https://www.reddit.com/r/nvidia/comments/yodzba


----------



## newls1

Just wondering, what are y'all's opinions on using a Bykski block on a GB Gaming OC 4090? I used one back on my 3090 and didn't have the best outcome, so I'm looking for opinions here. Thank you!


----------



## newls1

GB has an updated F2 BIOS on the site for the Gaming OC. Should I flash my new card to it when it comes in? Anyone know what it changes?


----------



## Rithina

sugi0lover said:


> There is no thermal pads bet. the pcb and backplate~


Yeah, I'm using the block now and a backplate pad is not necessary.


----------



## Xarrion

Does anyone know a way to raise the minimum clocks for the 4090? When it's idle (GPU clock 210 MHz, mem clock 405 MHz) I'm getting a really nasty coil whine. When it has some load, even light, it's much better.


----------



## Rithina

ShadowYuna said:


> Thank you so much. I put preoder on EK but their ETA is 30th Nov.
> Just ordered the Bykski block from Formula Mod as they can ship by DHL.
> Finally set up can be finish by next weekend.


The Bykski block's performance is good on my Galax SG 4090, except you need to get your own thermal pads, as their bundled sizes are way off.


----------



## Brandur

Anyone got a Inno3d iChill Black?


----------



## Zero989

TSportM said:


> Ok thanks, but its compatible with all suprimX cards ? mine is not the liquide
> 
> cheers


That's the same bios X trio users are using afaik, so yes


----------



## Panchovix

Benni231990 said:


> Why do you all push the RAM only to +1000-1300? My RAM can do +1750.
> Is less clock better?
> 
> I see the first flicker on screen at +2000, so I reduced to +1750 in the hope it doesn't error-correct and doesn't crash.


Luck of the draw. I would say +1500 MHz is probably the average; some VRAM, like mine, is below average and can only do +1100 stable, and some is above average and can do +2000 MHz stable without flickering.


----------



## dante`afk

motivman said:


> definitely not EKWB, lol


I like mine.

I'll even get a riser cable now and mount it vertically because it's so pretty


----------



## Tideman

Moddiy cable finally installed today. I chose the flexible silicone wires, 3x8 version.

To those wanting a custom cable fast, I can't recommend them enough. The cable was at my door on Friday, 5 days after ordering.


----------



## Gking62

dante`afk said:


> I like mine.
> 
> I'll even get a riser cable now and mount it vertically because it's so pretty
> 
> View attachment 2581439


Very nice; I too ordered the Plexi for my Strix and am looking forward to it.

When did you order? I ordered on Oct. 19 but it's still processing, no ship ETA yet...



Tideman said:


> Moddiy cable finally installed today. I chose the flexible silicon wires, 3x8 version.
> 
> To those wanting a custom cable fast, I can't recommend them enough. The cable was at my door on Friday, 5 days after ordering.
> 
> View attachment 2581443


concur, I already have the CableMod 4x8 arriving today, but yesterday ordered the ModDIY 3x8 HD Sleeved to see how it'll compare style wise, thanks for posting.

btw, how smooth was plugging it in?


----------



## bmagnien

Rithina said:


> Ya I am using the block now and backplate pad is not necessary


This is bizarre to me. So the bykski backplate doesn’t come in contact with the pcb at all? I’d hope not considering it’d be metal on metal contact with no pads. But if it doesn’t…then it’s providing 0 heat dissipation? That seems like a huge downgrade from the stock backplate which clearly dissipates a ton of heat given how hot it gets. What’s the point of the backplate at all then?


----------



## yzonker

bmagnien said:


> This is bizarre to me. So the bykski backplate doesn’t come in contact with the pcb at all? I’d hope not considering it’d be metal on metal contact with no pads. But if it doesn’t…then it’s providing 0 heat dissipation? That seems like a huge downgrade from the stock backplate which clearly dissipates a ton of heat given how hot it gets. What’s the point of the backplate at all then?


It doesn't dissipate heat so much as heat-soak unless you are actively cooling it. My TUF didn't have any pads under the original backplate. I didn't put any under the Bykski either.


----------



## bmagnien

yzonker said:


> It doesn't dissipate heat so much as heat-soak unless you are actively cooling it. My TUF didn't have any pads under the original backplate. I didn't put any under the Bykski either.


Giga OC has a ton of pads for its stock backplate. Literally any air movement over a backplate that size will dissipate heat. But it’s interesting that TUF doesn’t take advantage of that.


----------



## J7SC

Xarrion said:


> Not sure how to disable it lol. Will try DDU.


Mobo BIOS? At least that was the case with iGPU + dGPU systems before.


----------



## newls1

dante`afk said:


> I like mine.
> 
> I'll even get a riser cable now and mount it vertically because it's so pretty
> 
> View attachment 2581439


OMG that looks amazing! Nothing beats the looks of a FCWB on a gpu. Ill be waiting for EKWB to release one for mine...


----------



## Tideman

Gking62 said:


> very nice, I too ordered the Plexi for my Strix looking forward to mine.
> 
> concur, I already have the CableMod 4x8 arriving today, but yesterday ordered the ModDIY 3x8 HD Sleeved to see how it'll compare style wise, thanks for posting.
> 
> btw, how smooth was plugging it in?


It felt fine plugging it in. There's no audible click, so make sure to inspect it closely and confirm there is no gap whatsoever.


----------



## Panchovix

Long post ahead, gonna spoiler it and add an TLDR below.

Did some comparisons between 4 presets (Stock, UV + OC, OC and OC + OV) between 3 benches (SpeedWay, Port Royal and TSE)



Spoiler: Long Post with details



The presets are: (All except Stock also have +1000Mhz on the VRAM)

Stock: 2685-2700Mhz at 1.05V, stock power limit
UV + OC: 2800-2830Mhz at 0.99V, stock power limit
OC: 2915-2940 Mhz at 1.05V, stock power limit
OC + OV: 3000-3030Mhz at 1.1V, 600W power limit
The table looks like this


| RTX 4090 ASUS TUF (non-OC) | 3DMark SpeedWay | 3DMark Port Royal | 3DMark TimeSpy Extreme (Graphics Score) |
|---|---|---|---|
| Stock | 9951 (100%) | 25863 (100%) | 19643 (100%) |
| UV + OC | 10282 (103.32%) | 26777 (103.53%) | 20121 (102.43%) |
| OC | 10505 (105.56%) | 27409 (105.97%) | 20782 (105.79%) |
| OC + OV | 10639 (106.91%) | 27689 (107.06%) | 21038 (107.10%) |


And as an extra table: average between the 3 benches, power consumption and temps. (Outside the spoiler)

*Note: Used worst case scenario for power consumption, actual games do not get close to the power consumption here, but it works as comparison*



TL-DR table:


| RTX 4090 ASUS TUF (non-OC) | Average Performance | Power Consumption (Max on TSE) | Average Temp |
|---|---|---|---|
| Stock | 100% | 450W | 55°C |
| UV + OC | 103.09% (3% faster) | 370W | 53°C |
| OC | 105.77% (5.8% faster) | 450W | 57°C |
| OC + OV | 107.02% (7% faster) | 550W | 60°C |

Comparisons links:


SpeedWay: Link
PortRoyal: Link
TimeSpy Extreme: Link


----------



## J7SC

Quick tip for those installing new GPU waterblocks: first, make sure you've checked and tightened all screws on the block. My first custom blocks (about ten years ago, from EK) both had loose screws/bolts, and one of them leaked through the top section until I tightened them, which fixed it. Very few blocks I've gotten over the years since didn't have at least some loose screws/bolts (i.e. top, front cover). In addition, flush new blocks first with distilled/deionized water. Usually they're clean anyway - usually, but not always...

On the Bykski 4090 Giga-G-OC: mine is supposed to arrive in a few days, and I'll prep and mount it either on the weekend or the following week. With the Bykski I bought last year for another GPU, they had actually sent the wrong one relative to what was listed on their site (my GPU had extra phases and extra length), but my trusty Dremel fixed that; they offered a free exchange but had no ETA. The instructions were also wrong ('put this thermal pad on the VRAM and that _thermal pad_ _on the GPU die_')... still, once all that was addressed, it has been working flawlessly since, and the nickel coating still looks pristine 18 months later. Besides, the price is right.

Re: the backplate - these and other power-hungry GPUs generate a lot of heat, so the more you can do to transfer it away, the better. I used a bit of thermal putty under both the Bykski backplate and the Phanteks one below for another card. In addition, per below, both got big heatsinks on the backplates.


----------



## newls1

Would like to ask a very simple and quick question if y'all don't mind: what is currently the preferred driver? Is it the most current 526.47, or the 522.25 driver from a few weeks ago? Before I sold off my 3090 Ti, that 522.25 driver was the cat's ass! Does the most current one offer the same performance?


----------



## LunaP

The Bykski block arrives today, along with extra water fittings tomorrow. I did the DHL no-signature thing, but now it's asking me to set it up again (or is that just standard whenever it updates?).
Two weeks now with no response from CableMod, but I see them up on Amazon now, so I might just buy there. Support hasn't responded at all.


----------



## Panchovix

newls1 said:


> Would like to just ask a very simple and quick question if yall dont mind.... what is currently the preferred driver to use right now? is it the most current 526.47 or is it the 522.25 driver from a few weeks ago. Before I sold off my 3090Ti, that 522.25 driver was the cats ass! Does the most current one offer the same performance?


I'm using 522.25 with no issues so far; it seems the latest two drivers (including the hotfix) have some issues.


----------



## LunaP

dr/owned said:


> I'm following up with the Bykski seller on AliExpress to see if they'll replace my 3090 block. That'll impact whether I do them or Alphacool for 4090 Trio. Cause this is a bit ridiculous after only 6 months:
> 
> View attachment 2581091


Did you use any anti-corrosive in your loop, or just run it as-is? Also, did you clean the card out prior? Here's my 2080 Ti after nearly 4+ years in my loop.

I just use distilled + Dead Water (4 drops every 2 years) + silver plugs in my res bay.


----------



## lawson67

What's the average overclock on the 4090? The best I can do with my Zotac without artifacts is 2940 MHz - is that bad? Anything over that and I get artifacts. Memory is good though; it can do +1550 MHz.


----------



## Panchovix

lawson67 said:


> What's the average overclock on the 4090? The best I can do with my Zotac without artifacts is 2940 MHz - is that bad? Anything over that and I get artifacts. Memory is good though; it can do +1550 MHz.


Without more voltage (stock 1.05V), 2940 MHz is fine - average, I would say.
VRAM at +1550 MHz is probably also average.


----------



## J7SC

LunaP said:


> 2 weeks now and no response from cable mod but seeing them up on amazon now so might just buy there. Support hasn't responded at all.


...Should not be long now.

FYI, in addition to the free 12VHPWR cable from Seasonic, I had also ordered one from Cablemod (for future reference). See below; after checking with FedEx, it is in North America and on the way here (Nov 10th or before).









@LunaP ...I know that was recommended for years, but silver plugs would make me a bit nervous in multi-metal. Anyway, I typically use 2/3 deionized water and 1/3 Mayhems in a given loop.


----------



## lawson67

Panchovix said:


> Without more voltage (stock 1.05V) 2940Mhz is fine, average I would say.
> VRAM at +1550Mhz is probably also average.


Thanks for the reply. Unfortunately, setting the voltage slider to max in Afterburner does not help with the artifacts.


----------



## BigMack70

newls1 said:


> Would like to just ask a very simple and quick question if yall dont mind.... what is currently the preferred driver to use right now? is it the most current 526.47 or is it the 522.25 driver from a few weeks ago. Before I sold off my 3090Ti, that 522.25 driver was the cats ass! Does the most current one offer the same performance?


I had constant problems with black screen crashes on both 526.47 and the new hotfix driver. 522.25 seems stable.


----------



## WayWayUp

Pulled the trigger on a 4090 FE; just waiting for it to arrive.
I think I will skip the waterblock for now, even though I have a dual pump/res with a 3x 420 rad setup.

I want the waters to clear first on a potential 4090 Ti. Waiting 2 months for CES. If I hear of the Ti, the 4090 goes up for sale. If there are no rumblings/announcements, I'll grab the block.


----------



## Panchovix

lawson67 said:


> Thanks for the reply, voltage slider set to the max in afterburner does not help with the artifacts unfortunately


You have to move some points in the V/F curve to reach the newest voltage points (from 1.05V to 1.1V); that helped a lot in my case.


----------



## dante`afk

Gking62 said:


> very nice, I too ordered the Plexi for my Strix looking forward to mine.
> 
> when did you order? I ordered on Oct. 19 but still processing, no ship ETA yet...


ordered 10/25. shipped on 11/3 with estimated arrival for 11/7, was delivered via DHL on 11/4. 1 day shipping time from Slovenia to US is insane.


----------



## LunaP

J7SC said:


> ...Should not be long now.
> 
> 
> 
> @LunaP ...I know that was recommended for years, but silver plugs would make me a bit nervous in multi-metal. Anyway, I typically use 2/3 deionized water and 1/3 Mayhems in a given loop.


Hopefully just weird on the support email part.

As for the loop: yeah, it's why I've always avoided EK with their religious nickel push. I get there are mixed feelings on them, but I try to keep mine purely copper/chromium. Granted, I've had this loop since 2013 when I first built it, and I only clean it on occasion whenever I upgrade the board/CPU, which will be due in Feb/March IF we get HEDT; otherwise I'm grabbing Raptor Lake.

So far, no issues with the loop. Eventually I'll get around to adding a second pump. I'm surprised this one has kept going for as long as it has, lol.


----------



## Panchovix

BTW, unrelated to the latest comments (lol, I'm sorry): 1.1V degrades the card faster, right? Have you guys run a Turing/Ampere card at 1.1V most of the time and seen any degradation?


----------



## cheddardonkey

Xarrion said:


> Does anyone know a way to raise the minimum clocks for the 4090? When it's idle (GPU clock 210 MHz, mem clock 405 MHz) I'm getting a really nasty coil whine. When it has some load, even light, it's much better.


If you are using Afterburner you can lock the clock via the curve editor: select the desired clock/voltage point by clicking on the dot and press "L".


----------



## Xavier233

I have a weird buzz noise from the GPU when using V-sync in-game (at 165 FPS), but if I set an FPS limit of 164 or even 165 FPS with the Nvidia Control Panel, it's dead silent.

Would anyone know why?


----------



## LunaP

Panchovix said:


> BTW, unrelated to the latest comments (lol, I'm sorry): does 1.1V degrade the card faster? Have you guys used a Turing/Ampere card at 1.1V most of the time and seen any degradation?


Coming from Turing I've not experienced any; I've kept the same overclocks for the past few years whenever gaming, +1100 on mem and +110 on core with everything else maxed. The hardware feels pretty up to snuff, granted YMMV. Going over could definitely hurt, but 1.1V should be within the safety limits of the card. Don't take my feedback to heart though, I'll wait for someone more experienced to answer; just commenting on my experience, since I own 3 (2 in SLI on my main desktop and 1 single card on my VR desktop for our family room).



Speaking of VR, anyone with an Index or other SteamVR headset having issues with purple bars on their 4090, like it just can't stop stuttering at times? I'm wondering if it's driver related; I've tried multiple other settings, so far to no avail.


----------



## J7SC

LunaP said:


> Hopefully just weird on the support email part.
> 
> As for the loop, yeah, it's why I've always avoided EK with their religious nickel push. I get there's mixed feelings on them, but I try to keep mine purely copper/chromium. Granted, I've had this loop since 2013 when I first built it, and only clean it on occasion whenever I upgrade the board/CPU, which will be due in Feb/March IF we get HEDT, else I'm grabbing Raptor Lake.
> 
> So far no issues with the loop in all that time. Eventually I'll get around to adding a 2nd pump. I'm surprised this one's been going for as long as it has lol.


...yeah, HEDT is my preference also for work+play combos; the last dual build (2x X570 / 16c/32t) was the first time in 8 years that I didn't build HEDT. As to pumps, I started using D5s 9 years ago, and from the 9x D5s I have, only one died (because I accidentally killed it via dry-run). 

Our heatwave is definitely over - looking out the home-office window this morning...


Spoiler


----------



## newls1

newls1 said:


> GB has an updated F2 bios on the site for the Gaming OC; should I flash my new card to this when it comes in? Anyone know what it changes?


anyone?


----------



## d5aqoep

New vBIOS Asus RTX 4090 (covering Strix, Strix OC, TUF and TUF OC models)


https://dlcdnets.asus.com/pub/ASUS/Graphic%20Card/NVIDIA/BIOSUPDATE_TOOL/RTX4090/RTX4090_V2.exe


----------



## pt0x-

d5aqoep said:


> New Asus RTX 4090 (covering Strix, Strix OC, TUF and TUF OC models)
> 
> 
> https://dlcdnets.asus.com/pub/ASUS/Graphic%20Card/NVIDIA/BIOSUPDATE_TOOL/RTX4090/RTX4090_V2.exe


This is an updated bios flash tool. They put out a version that didn't work for lots of ppl. This new exe seems to flash correctly (I have not tried it), but as far as I understand it's not a newer bios.

Source: 4090 Strix OC vBios Update Error - Page 3

Also, please don't link directly to exe files on the web, it's kinda unsafe.


----------



## KingEngineRevUp

dante`afk said:


> I like mine.
> 
> I'll even get a raiser cable now and mount it vertically because it's so pretty
> 
> View attachment 2581439


Can you report back your results? Your temperatures vs. water temperature?


----------



## RageOfFury

newls1 said:


> anyone?


If it ain't broke don't touch it


----------



## Panchovix

d5aqoep said:


> New Asus RTX 4090 (covering Strix, Strix OC, TUF and TUF OC models)
> 
> 
> https://dlcdnets.asus.com/pub/ASUS/Graphic%20Card/NVIDIA/BIOSUPDATE_TOOL/RTX4090/RTX4090_V2.exe


Where did you get it? I can't see it on the ASUS page. (TUF 4090)




LunaP said:


> Coming from Turing I've not experienced any; I've kept the same overclocks for the past few years whenever gaming, +1100 on mem and +110 on core with everything else maxed. The hardware feels pretty up to snuff, granted YMMV. Going over could definitely hurt, but 1.1V should be within the safety limits of the card. Don't take my feedback to heart though, I'll wait for someone more experienced to answer; just commenting on my experience, since I own 3 (2 in SLI on my main desktop and 1 single card on my VR desktop for our family room).


Many thanks for the answer! I'm asking since I remember (can't find it now, so maybe I'm misremembering) that NVIDIA said using 1.1V cuts a card's lifespan from 5 years to one; I can't remember if it was the Pascal or Turing era, and can't find the article.

Just kinda scared to use 1.1V after remembering that, but at least in your experience you did not see degradation.


----------



## mirkendargen

Panchovix said:


> Many thanks for the answer! I'm asking since I remember (can't find it now, so maybe I'm misremembering) that NVIDIA said using 1.1V cuts a card's lifespan from 5 years to one; I can't remember if it was the Pascal or Turing era, and can't find the article.
> 
> Just kinda scared to use 1.1V after remembering that, but at least in your experience you did not see degradation.


Definitely not true, since you can run 1.1V on stock BIOS'es with no hardware modification and the cards all have 3-4 year warranties.


----------



## Panchovix

mirkendargen said:


> Definitely not true, since you can run 1.1V on stock BIOS'es with no hardware modification and the cards all have 3-4 year warranties.


That's what I suspected, that's a relief to be honest, thanks! (Will spend a looot of time benching at 1.1V)


----------



## LunaP

J7SC said:


> ...yeah, HEDT is my preference also for work+play combos; the last dual build (2x X570 / 16c/32t) was the first time in 8 years that I didn't build HEDT. As to pumps, I started using D5s 9 years ago, and from the 9x D5s I have, only one died (because I accidentally killed it via dry-run).
> 
> Our heatwave is definitely over - looking out the home-office window this morning...
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2581483


lmao, yeah, same D5 pump since 2013, been running strong. I don't leave my PC on overnight usually (here and there if doing projects, yeah), but def looking forward to Fishhawk Falls given the 8-channel memory and 80 PCIe 5.0 lanes; that's got me hella excited (if we get it). Plus the fact they reworked the mesh architecture design vs ringbus, meaning it should be way more up to par with ringbus (fingers crossed).


----------



## dante`afk

KingEngineRevUp said:


> Can you report back your results? Your temperatures vs. water temperature?











[Official] NVIDIA RTX 4090 Owner's Club

Have you got any games you can test for frame rate? Does it perform below par on those too, or perhaps is this issue isolated to 3DMark? It is behaving a bit screwy with the 4090s at the moment. In games I get more or less the same fps, my lows are better, my system is faster. But my Port Royal 3d...

www.overclock.net


----------



## GAN77

LunaP said:


> I just use distilled + dead water ( 4 drops every 2 years ) + silver plugs in my resbay


----------



## dr/owned

My PR score went up to 27300 from 26700 just from switching back to the original bios.


J7SC said:


> ...yeah, HEDT is my preference also for work+play combos; the last dual build (2x X570 / 16c/32t) was the first time in 8 years that I didn't build HEDT. As to pumps, I started using D5s 9 years ago, and from the 9x D5s I have, only one died (because I accidentally killed it via dry-run).


I had 1 D5 die while running, but they're very reliable pumps. Whenever I see someone ask "should I run 2 in series for redundancy": a) no, they're too reliable for that; b) nothing is going to go into nuclear meltdown just because a pump fails anyway.


----------



## Gking62

Tideman said:


> It felt fine plugging it in. There is no audible click, so obviously make sure to inspect it closely to confirm there is no gap there whatsoever.


thanks bud, btw ModDIY just shipped the custom cable I only ordered yesterday afternoon, should have it by end of week 😎


----------



## LunaP

GAN77 said:


>


jesus *** did u put in ur loop lmao (or forget to ? ) 


also my block just arrived shutting down to install


----------



## RageOfFury

Hmmm... All of the reported failures on reddit (as of 7 Nov) use the double split terminal design. Both the Nvidia adapter and the MSI ATX 3.0 cable use double split. Other cables such as CableMod and ModDIY use single split, and thus far there are no reported issues with those. I've yet to check the 3090 Ti adapter to see what terminal type it uses; I bet it's single split.

Thoughts, all?


----------



## LunaP

Ok, for the Gigabyte OC Bykski block owners: how did you disassemble your card? I got most screws off, but I can see they go the other way inside. How did you remove the fans?


----------



## LuckyImperial

WayWayUp said:


> pulled the trigger on a 4090 FE, just waiting for it to arrive.
> I think i will skip the waterblock for now even though i have a dual pump/res with 3x 420 rad setup.
> 
> I want the waters to clear first on a potential 4090ti. Waiting 2 months for CES. If I hear of the ti then the 4090 will go up for sale. If no rumblings/announcement then i will grab the block


Do yourself a favor and get a 12VHPWR cable on order so you don't have to use the adapter. I think the adapter is probably fine to use...mine survived two weeks no problem, but there's a lot of peace of mind not having soldered joints or double split pins. The adapter is also very bulky.



Xavier233 said:


> I have a weird buzz noise from the GPU when using V-sync in-game (at 165 FPS), but if I set an FPS limit of 164 or even 165 FPS with the Nvidia Control Panel, it's dead silent.
> 
> Would anyone know why?


That's pretty typical coil whine behavior. It's coil whine, and in your case it only occurs under VSync, which produces different GPU loads than an FPS limiter. I know it's counterintuitive, but VSync and FPS limiters operate differently.

I have no idea why, but my MSI Liquid X has coil whine when playing the stock Windows Solitaire game, and nothing else. Don't ask why a 4090 is playing Solitaire.



RageOfFury said:


> Hmmm....All of the reported failures on reddit (as of 7 Nov) use the double split terminal design. Both the Nvidia adapter and the MSI ATX 3.0 cable use double split. Other cables such as cablemod and moddyi use single split and thus far no reported issues with those. I've yet to check the 3090 Ti adapter to see what terminal type it uses. I bet it's single split.
> 
> Thoughts, all?


I think the double split design is sufficient, and one could argue it _may _provide more pin-to-socket *surface contact* than a single split terminal... but a single split terminal would have more contact *force*, as there's more metal to deform upon terminal-pin contact.

Personally, after inspecting my ModDIY cable, I believe the gold coating of the terminals/pins on the custom cables is more impactful than the split of the terminal.

Lastly: can you see that the adapter terminals are recessed into the socket 1-2mm? My ModDIY cable did not have that terminal/socket recession; the terminals were basically flush with the plastic.


----------



## Xavier233

LuckyImperial said:


> Do yourself a favor and get a 12VHPWR cable on order so you don't have to use the adapter. I think the adapter is probably fine to use...mine survived two weeks no problem, but there's a lot of peace of mind not having soldered joints or double split pins. The adapter is also very bulky.
> 
> 
> 
> That's pretty typical coil whine behavior. It's coil whine, and in your case it only occurs under VSync, which produces different GPU loads than an FPS limiter. I know it's counterintuitive, but VSync and FPS limiters operate differently.
> 
> I have no idea why, but my MSI Liquid X has coil whine when playing the stock Windows Solitaire game, and nothing else. Don't ask why a 4090 is playing Solitaire.
> 
> 
> 
> I think the double split design is sufficient, and one could argue it _may _provide more pin-to-socket *surface contact* than a single split terminal... but a single split terminal would have more contact *force*, as there's more metal to deform upon terminal-pin contact.
> 
> Personally, after inspecting my ModDIY cable, I believe the gold coating of the terminals/pins on the custom cables is more impactful than the split of the terminal.
> 
> Lastly: can you see that the adapter terminals are recessed into the socket 1-2mm? My ModDIY cable did not have that terminal/socket recession; the terminals were basically flush with the plastic.


That matches my conclusion. I know this card has mild coil whine starting at 400 FPS, but anything below 400 has zero coil whine, except when V-sync is on.

What's funny is that if I set the FPS limiter to 165 FPS (my monitor refresh rate), or anything below or above that (for example 164 FPS, or 200 FPS), there is no coil whine.


----------



## Xavier233

For those changing the thermal pads on the GPU: be very careful with the thickness of the new pads, as the wrong thickness can damage the GPU itself or the memory chips:


----------



## cheddardonkey

Rarely see these in stock @ MSRP, so here ya go if anyone is looking.

GIGABYTE AORUS GeForce RTX 4090 Video Card GV-N4090AORUSX W-24GD - Newegg.com


----------



## LunaP

cheddardonkey said:


> Rarely see IN stock available @ MSRP so here ya go if anyone is looking.
> 
> GIGABYTE AORUS GeForce RTX 4090 Video Card GV-N4090AORUSX W-24GD - Newegg.com


Best Buy has them in stock all over the place still, which is nice, but good to see others finally.


----------



## mirkendargen

LunaP said:


> Ok, for the Gigabyte OC Bykski block owners: how did you disassemble your card? I got most screws off, but I can see they go the other way inside. How did you remove the fans?


Remove all the screws you can from the backplate and a few from the IO slot area, and that allows you to get the heatsink off. It takes an uncomfortable amount of force because they used a ton of pads; I kept checking like 3 times because I thought I missed a screw somewhere, but I didn't. Once that is off you'll see some more screws attaching the PCB to the backplate that you remove.


----------



## mirkendargen

LuckyImperial said:


> Do yourself a favor and get a 12VHPWR cable on order so you don't have to use the adapter. I think the adapter is probably fine to use...mine survived two weeks no problem, but there's a lot of peace of mind not having soldered joints or double split pins. The adapter is also very bulky.
> 
> 
> 
> That's pretty typical coil whine behavior. It's coil whine, and in your case it only occurs under VSync, which produces different GPU loads than an FPS limiter. I know it's counterintuitive, but VSync and FPS limiters operate differently.
> 
> I have no idea why, but my MSI Liquid X has coil whine when playing the stock Windows Solitaire game, and nothing else. Don't ask why a 4090 is playing Solitaire.
> 
> 
> 
> I think the double split design is sufficient, and one could argue it _may _provide more pin-to-socket *surface contact* than a single split terminal... but a single split terminal would have more contact *force*, as there's more metal to deform upon terminal-pin contact.
> 
> Personally, after inspecting my ModDIY cable, I believe the gold coating of the terminals/pins on the custom cables is more impactful than the split of the terminal.
> 
> Lastly: can you see that the adapter terminals are recessed into the socket 1-2mm? My ModDIY cable did not have that terminal/socket recession; the terminals were basically flush with the plastic.


Gold isn't a good thing in this case. It's not a great conductor (worse than copper), and the plating is so thin it's irrelevant for reducing resistance. Gold-plated terminals exist to prevent corrosion/oxidation, not to improve conductivity. And gold can have a galvanic reaction with the tin that the GPU-side connector is plated with.


----------



## LunaP

mirkendargen said:


> Remove all the screws you can from the backplate and a few from the IO slot area, and that allows you to get the heatsink off. It takes an uncomfortable amount of force because they used a ton of pads; I kept checking like 3 times because I thought I missed a screw somewhere, but I didn't. Once that is off you'll see some more screws attaching the PCB to the backplate that you remove.












Thanks, I wasn't 100% positive, but got it. I was trying to pry the backplate off to unscrew the fan shroud.


----------



## Mad Pistol

The RTX 4090 is definitely an 8K120 card... as long as you render it way below 8K and add on some Frame Generation.


----------



## ArcticZero

Xavier233 said:


> For those changing their cooling pads on the GPU, be very careful of the thickness of the new pads, it can damage the GPU itself or the memory chips:


This is why I, like many others, decided to make the switch to thermal putty. I remember all the struggles, remounts and thermal shutdowns I had finding the right thickness/hardness pads for my PNY 3090's stock cooler.


----------



## mirkendargen

LunaP said:


> View attachment 2581518


Those 5 you can't access attach the PCB to the backplate. You should be able to remove the heatsink if you also got the couple around the display connectors.


----------



## neteng101

Anyone here have thoughts about running a 3-connector 12VHPWR adapter with a 600W bios and 1.1V long term?


----------



## LunaP

mirkendargen said:


> Those 5 you can't access attach the PCB to the backplate. You should be able to remove the heatsink if you also got the couple around the display connectors.


Yeah, wrong photo; it's off. Just removing the thermal pads on the front and back. Guessing Bykski only pads the front, not the back?


----------



## mirkendargen

neteng101 said:


> Anyone here has thoughts about running a 3-connector VHPWR adapter with a 600W bios and 1.1V long term?


A PCIe 8-pin has 3 +12V/ground wire pairs; a 12VHPWR cable has 6 +12V/ground pairs. If the wire in the 8-pin cable isn't undersized, it's more than fine; it will probably only use 2 +12V/ground pairs per connector.
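As a sanity check on those wire counts, here's a back-of-envelope sketch. It assumes a 12 V rail and perfectly even current sharing, which real connectors don't quite achieve:

```python
# Idealized per-wire current for a given load, rail voltage and wire count.
def amps_per_wire(watts: float, volts: float, wires: int) -> float:
    return watts / volts / wires

# 600 W through one 12VHPWR connector (6 +12V wires):
print(round(amps_per_wire(600, 12, 6), 2))  # 8.33 A per wire

# 600 W split over 3x PCIe 8-pin using all 3 +12V wires each (9 total):
print(round(amps_per_wire(600, 12, 9), 2))  # 5.56 A per wire
```

Either way the per-wire current stays modest, which is the point being made above.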




LunaP said:


> Yeah, wrong photo; it's off. Just removing the thermal pads on the front and back. Guessing Bykski only pads the front, not the back?


Bykski doesn't have anything on the back in the instructions; I just copied what Gigabyte did on the back, plus some behind the power connector, because why not. You couldn't hurt anything if you coated the entire back of the PCB in thermal pad lol.


----------



## doom3crazy

Xavier233 said:


> For those changing their cooling pads on the GPU, be very careful of the thickness of the new pads, it can damage the GPU itself or the memory chips:


My question is: is it even necessary? I feel like these things already do such a great job at cooling.


----------



## LunaP

mirkendargen said:


> A PCIE 8pin has 3 +12V/ground wires. a 12VHPWR cable has 6 +12V/ground wires. If the wire in the 8pin cable isn't undersized, it's more than fine, it will probably only use 2 +12V/ground per connector.
> 
> 
> 
> 
> Bykski doesn't have anything on the back in the instructions, I just copied what Gigabyte did on the back plus some behind the power connector because why not. You couldn't hurt anything if you coated the entire back of the PCB in thermal pad lol.


Those are what, 1.0 or 0.5?


----------



## neteng101

mirkendargen said:


> A PCIE 8pin has 3 +12V/ground wires. a 12VHPWR cable has 6 +12V/ground wires. If the wire in the 8pin cable isn't undersized, it's more than fine, it will probably only use 2 +12V/ground per connector.


Totally not worried about the PCIe 8-pin side; these 3 cables have been supplying a 3080 Ti with the Galax 1000w bios for the past year. It's just the 3x PCIe to 12VHPWR adapter side of the connection I'm asking about.


----------



## mirkendargen

LunaP said:


> Those are what, 1.0 or 0.5?


The stock ones looked like 1.0, but the standoffs on the Bykski backplate are a little taller; I used 1.5s and they make contact.




neteng101 said:


> Totally not worried on the PCIE 8-pin side - these 3 cables have been supplying a 3080Ti with the Galax 1000w bios for the past year... just the 3PCIE to 12VHPWR adapter side of the connection.


Honestly, nothing is melting anywhere except the 12VHPWR connection at the GPU. If people were pulling too much power through the 8-pins, they'd melt either at the PSU or at the connection to the adapter, which no one has reported.


----------



## LunaP

mirkendargen said:


> The stock ones looked like 1.0, but the standoffs on the bykski backplate are a little taller. I used 1.5's and they make contact.


Yeah, I just took them off the backplate and stuck them back on for now. Removing the front ones and using the included ones to see how they fare.


----------



## doom3crazy

Zero989 said:


> could have just asked me lol


Hey, I wanted to ask (cause I figured you might know): with the MSI Gaming Trio and the Suprim 600w bios, I am safe to use the three 8-pin adapter, right? I noticed some discussion about whether users should pick up a four 8-pin adapter (as technically that's what comes with the Suprim when you buy it), but if I recall correctly, some users reported pulling close to 600w on their Gaming Trio cards with the Suprim bios and the three 8-pin adapter that came with the Gaming Trio, with no issues. Just wanted to clarify that is the case.


----------



## Zero989

doom3crazy said:


> Hey I wanted to ask(cause I figured you might know) with the msi gaming trio and the suprim 600w bios, I am safe to use the three 8 pin adapter right? if I recall, I noticed some discussion about if some users should pick up a four 8 pin adapter(as technically thats what comes with the suprim when you buy it) but again if I recall correctly, I remember some users stated pulling close to 600w on their gaming trio cards with the suprim bios and the three 8 pin adapter that came with the gaming trio and having no issues. Just wanted to clarify that is the case


In theory, with the 3-way cable fully inserted, you can pull 600w without issue; 8-pins go way beyond 150w.
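To put a rough number on "way beyond 150w": assuming Mini-Fit Jr style terminals rated around 8 A per contact (the exact rating depends on the terminal and wire gauge, so treat this as illustrative):

```python
# Electrical headroom of PCIe 8-pin connectors at an assumed 8 A per +12V pin.
PINS_12V_PER_8PIN = 3
AMPS_PER_PIN = 8.0     # assumed terminal rating; varies by terminal/gauge
VOLTS = 12.0

watts_per_connector = PINS_12V_PER_8PIN * AMPS_PER_PIN * VOLTS
print(watts_per_connector)      # 288.0 W per 8-pin connector
print(3 * watts_per_connector)  # 864.0 W across three, vs a 600 W target
```

The 150 W figure in the spec is a conservative allocation, not the physical limit of the contacts.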


----------



## Sheyster

newls1 said:


> anyone?


PCIe slot compatibility with some motherboards, including some AMD boards. I updated and didn't notice any difference.


----------



## Tideman

Gking62 said:


> thanks bud, btw modDIY just shipped my custom cable I only ordered just yesterday afternoon, should have it end of week 😎


Nice! Yeah they don't mess around!


----------



## Sheyster

cheddardonkey said:


> Rarely see IN stock available @ MSRP so here ya go if anyone is looking.
> 
> GIGABYTE AORUS GeForce RTX 4090 Video Card GV-N4090AORUSX W-24GD - Newegg.com


Still in stock surprisingly, must be a large drop. I was tempted because I have room for it but decided not to. Super happy with the GB-G-OC I already have.


----------



## Pollomir

The EVGA 4090 (probably one of the few)


----------



## Panchovix

Pollomir said:


> The EVGA 4090 (probably one of the few)


Damn, that card went to +1900 on VRAM, I'm envious AF now LOL

Also it looks sooooooooo good. Man, what a bummer EVGA left (thanks NVIDIA)


----------



## J7SC

Xavier233 said:


> For those changing their cooling pads on the GPU, be very careful of the thickness of the new pads, it can damage the GPU itself or the memory chips:


I really shouldn't watch vids like that before I take my card apart for water-cooling (even though I use thermal putty for VRAM)... my water-block is scheduled for delivery here today (earlier than forecast). I'm a bit worried about prying the factory cooler off, given what @mirkendargen and others with the same card have reported.


----------



## Pollomir

Panchovix said:


> Damn, that card went to +1900 on VRAM, I'm envious AF now LOL
> 
> Also it looks sooooooooo good. Man, what a bummer EVGA left (thanks NVIDIA)


My strix goes up to +2000 but core is average, +195. Still on air.


----------



## mirkendargen

J7SC said:


> I really shouldn't watch vids like that before I take my card apart for water-cooling (even though I use thermal putty for VRAM) ...my water-block is scheduled for delivery here today (earlier than forecast). I'm a bit worried re. prying the factory cooler off, given what @mirkendargen and others with the same card have reported


Hahaha, it's not "I'm going to rip a GDDR6X IC off" level of force... just "am I missing a screw...?" level of force. Doing a proper "peel" motion, or warming the card up a tiny bit, would probably help too. I was doing it with the card at like 15C.


----------



## LunaP

Panchovix said:


> Damn, that card went to +1900 on VRAM, I'm envious AF now LOL
> 
> Also it looks sooooooooo good. Man, what a bummer EVGA left (thanks NVIDIA)


EVGA was partly to blame too, sadly; had they not overbought stock during the crypto hype they might have stuck around (might, that is).


K ready to block it up


----------



## LuckyImperial

mirkendargen said:


> Gold isn't a good thing in this case. It's not a great conductor, worse than copper and the plating is so thin it's irrelevant in reducing resistance. Gold plated terminals exist to prevent corrosion/oxidation, not to improve conductivity. And gold can have a galvanic reaction with the tin that the connector on the GPU is.



I disagree. Gold is more conductive than the tin used on the adapter, and it is relevant for reducing resistance.









What Plating Option Is Best For My Connector? - The Samtec Blog


Choosing the right plating option is crucial to the success of a connector system. Plating affects electrical performance, life cycle, quality, and cost.




blog.samtec.com


----------



## neteng101

Don't think it's worthwhile running at 1.1V daily; there's not much to be gained over 1.05V, but this is the peak I can manage for now...









I scored 18 002 in Time Spy Extreme
I scored 27 204 in Port Royal
I scored 10 729 in Speed Way

(Intel Core i7-13700K, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 11; scores on www.3dmark.com)







Pollomir said:


> The EVGA 4090 (probably one of the few)


Great seeing the card, but IMO JayZ loses all credibility at the end of that video. Fear-mongering much there?


----------



## Panchovix

neteng101 said:


> Don't think its worthwhile running at 1.1V daily, not much to be gained over 1.05V but this is peak I can manage now...
> 
> I scored 18 002 in Time Spy Extreme
> I scored 27 204 in Port Royal
> I scored 10 729 in Speed Way
> 
> www.3dmark.com


For some reason I gain like 105 MHz going from 1.05V to 1.1V (2925MHz to 3030MHz stable in PR, 2955MHz to 3060MHz in games), which is the most I have ever gotten from increasing voltage on a GPU.

Though the difference in performance is pretty low, so I agree it's not worth it, if that's also what you meant.
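For a rough idea of what that bump costs, dynamic power scales roughly with f·V²; plugging in the numbers above (a simplification that ignores leakage and power limits):

```python
# Approximate dynamic power scaling P ~ f * V^2 for 1.05 V/2925 MHz -> 1.10 V/3030 MHz.
f1, v1 = 2925, 1.05
f2, v2 = 3030, 1.10

clock_gain = f2 / f1 - 1
power_gain = (f2 * v2**2) / (f1 * v1**2) - 1
print(f"{clock_gain:.1%} more clock")  # 3.6% more clock
print(f"{power_gain:.1%} more power")  # 13.7% more power
```

A ~3.6% clock bump for ~14% more dynamic power is one reason the measured performance gain feels so small.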


----------



## neteng101

Panchovix said:


> Though, the difference in performance is pretty low, so I agree that's not worth it if you also meant that


Yeah: clocks go up, but the performance increase is pretty low! Might be effective clocks not boosting as much as the targets.


----------



## yzonker

J7SC said:


> I really shouldn't watch vids like that before I take my card apart for water-cooling (even though I use thermal putty for VRAM) ...my water-block is scheduled for delivery here today (earlier than forecast). I'm a bit worried re. prying the factory cooler off, given what @mirkendargen and others with the same card have reported


Yea, seems like no matter how many times I take one apart and/or mount, I still feel the pucker moment when I'm waiting for the bios splash screen to pop up.


----------



## changboy

My Bykski block for my Gigabyte 4090 Gaming OC is on its way here. In their plans they use 1.8mm thermal pads and I ordered 2.0mm pads; do you think I will be fine?

I ordered these: https://www.amazon.ca/gp/product/B09ZKCKTGB/ref=ppx_yo_dt_b_asin_title_o00_s00?ie=UTF8&th=1


----------



## LunaP

Lol, the comparison: 2080 Ti XC2 EVGA stock vs 2080 Ti and 4090, air vs block.


----------



## J7SC

yzonker said:


> Yea, seems like no matter how many times I take one apart and/or mount, I still feel the pucker moment when I'm waiting for the bios splash screen to pop up.


Yup, exactly that. I've been water-blocking various GPUs for about a decade now, but that first boot-up moment post-op is always, ahem, 'exhilarating'. Recently, I moved some things around the water-blocked 3090 Strix and upon boot-up: nothing, nada, no pic... gulp... until I realized that I had forgotten to plug the monitor cable back in...


----------



## dr/owned

J7SC said:


> I really shouldn't watch vids like that before I take my card apart for water-cooling (even though I use thermal putty for VRAM) ...my water-block is scheduled for delivery here today (earlier than forecast). I'm a bit worried re. prying the factory cooler off, given what @mirkendargen and others with the same card have reported


Pro tip: set the oven to the lowest temperature it'll go, like 170F. Turn it off for about 10 minutes to let the temperature come down a bit. Put the card in there and leave it for 30 minutes, along with the thermal pads you're going to apply. It'll come apart waaaay easier and go back together nicer when it's warm.


----------



## KingEngineRevUp

dante`afk said:


> [Official] NVIDIA RTX 4090 Owner's Club
> 
> 
> Have you got any games you can test for frame rate? Does it perform below par on those too, or perhaps is this issue isolated to 3DMark? It is behaving a bit screwy with the 4090s at the moment. In games I get more or less the same fps, my lows are better, my system is faster. But my Port Royal 3d...
> 
> 
> 
> 
> www.overclock.net


Thanks for replying, so the data points are:

1. 323.5 W board power, 249.34 W chip power draw, delta T = 13.7C
2. 409.709 W board power, 339.88 W chip power draw, delta T = 20.3C

Optimus claims their block will do 515 W chip power draw with a delta T of 19C.

Hmm
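Normalizing those points to degrees per kilowatt of chip power makes the comparison with the Optimus claim easier (a simple ratio, assuming delta T scales roughly linearly with power):

```python
# Effective thermal resistance (delta T per kW of chip power) for each data point.
points = [(249.34, 13.7), (339.88, 20.3)]  # (chip watts, delta T in C)

for watts, delta_t in points:
    print(round(delta_t / watts * 1000, 1), "C/kW")  # 54.9 then 59.7

# The Optimus claim of 515 W at 19 C delta implies:
print(round(19 / 515 * 1000, 1), "C/kW")  # 36.9
```

So the claimed block would need roughly a third less effective thermal resistance than these measurements show.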


----------



## mirkendargen

LuckyImperial said:


> I disagree. Gold is more conductive than the tin that is used on the adapter and is relevant for reducing resistance.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What Plating Option Is Best For My Connector? - The Samtec Blog
> 
> 
> Choosing the right plating option is crucial to the success of a connector system. Plating affects electrical performance, life cycle, quality, and cost.
> 
> 
> 
> 
> blog.samtec.com


The ~1 micron thickness of the gold plating won't impact electrical conductivity in the slightest. They even say in your link that gold is for oxidation prevention. Gold on gold is fine. Tin on tin is also fine. Gold on tin is a no-no (see "Gold or Tin versus Gold and Tin?").
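To see why a ~1 micron layer is electrically negligible, here's a rough series-resistance estimate using bulk gold resistivity and a hypothetical 1 mm² contact patch (the geometry is made up purely for scale):

```python
# Series resistance contributed by a thin gold plating layer over a contact patch.
RHO_GOLD = 2.44e-8   # ohm*m, bulk resistivity of gold
AREA = 1e-6          # m^2, assumed 1 mm^2 contact patch
THICKNESS = 1e-6     # m, ~1 micron plating

r_plating = RHO_GOLD * THICKNESS / AREA
print(f"{r_plating:.2e} ohm")  # 2.44e-08 ohm
```

Tens of nano-ohms, versus contact resistances in the milliohm range; the plating's job is corrosion prevention, not conduction.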


----------



## J7SC

dr/owned said:


> Pro tip: set oven to lowest temperature it'll go...like 170F. Turn it off for about 10 minutes to let the temperature come down a bit. Put card in there and leave it for 30 minutes along with the thermal pads you're going to apply. It'll come apart waaaay easier and go back together nicer when it's warm.


Thanks  - I think it's just healthy paranoia on my part...I have reflowed solder on a GPU in an oven before (obviously at higher temps) but I will follow the advice to warm it up first before I take the factory cooler off for the water-block...

...speaking of which, DHL delivered three days early...inspected the block and the rest of the package (a lot of stuff for US122 + shipping)...looks like good machining...it also passed my 'camera flash' test - cheap nickel coating becomes transparent then and you can see the copper below the coating if it wasn't done right.


----------



## mirkendargen

J7SC said:


> Thanks  - I think it's just healthy paranoia on my part...I have reflowed solder on a GPU in an oven before (obviously at higher temps) but I will follow the advice to warm it up first before I take the factory cooler off for the water-block...
> 
> ...speaking of which, DHL delivered three days early...inspected the block and the rest of the package (a lot of stuff for US122 + shipping)...looks like good machining...it also passed my 'camera flash' test - cheap nickel coating becomes transparent then and you can see the copper below the coating if it wasn't done right.
> View attachment 2581536


Yeah mine had visible machining marks, but nothing you can feel running a fingernail over them. The fin area is smaller than most other blocks, but bigger than the GPU die so functionally good enough.


----------



## dr/owned

J7SC said:


> ...speaking of which, DHL delivered three days early...inspected the block and the rest of the package (a lot of stuff for US122 + shipping)...looks like good machining...it also passed my 'camera flash' test - cheap nickel coating becomes transparent then and you can see the copper below the coating if it wasn't done right.


I emailed [email protected] to follow up, asking about the 4090 series plating. He said what I was suspecting: during covid they used a different supplier that sucked, and (allegedly) they are back to the original supplier that shouldn't have issues. My only other beef with Bykski is wishing they would use more than 4 screw holes around the GPU die for mounting. The stock cooler usually uses like 12 screws, with more along the edges.


----------



## mirkendargen

dr/owned said:


> I emailed [email protected] to follow up asking about the 4090 series plating. He said what I was suspecting: covid they used a different supplier that sucked, and (allegedly) are back to the original supplier that shouldn't have issue. My only other beef with Bykski was wishing they would use more than 4 screw holes around the GPU die for mounting. The stock PCB is usually like 12 screws and more along the edges.


I'm glad to hear he still works there in case I have any issues again, lol. I wish the main mounting screws went through the backplate so they were possible to tighten after a heat cycle without taking the backplate off, but oh well.


----------



## yzonker

J7SC said:


> Thanks  - I think it's just healthy paranoia on my part...I have reflowed solder on a GPU in an oven before (obviously at higher temps) but I will follow the advice to warm it up first before I take the factory cooler off for the water-block...
> 
> ...speaking of which, DHL delivered three days early...inspected the block and the rest of the package (a lot of stuff for US122 + shipping)...looks like good machining...it also passed my 'camera flash' test - cheap nickel coating becomes transparent then and you can see the copper below the coating if it wasn't done right.
> View attachment 2581536


I put mine on a heating pad before I started and just worked to pull the screws, etc.. with it on the pad. It did come apart fairly easily. The nice thing is the TUF has a couple of screws only common to the PCB and backplate, so I was able to leave the backplate on while pulling off the heatsink for a little more rigidity.


----------



## J7SC

...that 'fingernail test' is why I removed the 3090 EK Strix block and bought a Phanteks for it instead. Apart from the nickel peeling off the backplate in big chunks on the EK before it was even mounted, the nickel coating on the block also didn't pass my flash test (spoiler). Far worse with the EK was the actual touch-down point of the machine head right above the die area > you could feel the ridge with your fingernail (only a few other samples apparently had that). I am not talking about the machining grooves, either, but the actual 'start/end' position of the cutting head...

BTW, I'm not dissing EK wholesale; I have a lot of good water-cooling gear from them, and as mentioned, I also had a Heatkiller TR Pro IV CPU block w/coating issues (now also replaced with a Phanteks TR CPU one). The only Bykski product I had before this new 4090 one has been working perfectly for 1.5+ years.


Spoiler



camera 'flash test'...in pic on the left, both items are nickel-coated...


----------



## stahlhart




----------



## zhrooms

Added the unreleased FTW3 Ultra, just in case some people were curious, full card shown by GN and others


----------



## dr/owned

zhrooms said:


> View attachment 2581540
> 
> Added the unreleased FTW3 Ultra, just in case some people were curious, full card shown by GN and others


So basically a Zotac Amp Extreme that they probably would have charged $1900 for being "Ultra". 

Plus this ridiculously over-length PCB:










If their goal of putting this out to reviewers was to have a swell of "oh noes EVGA please come back we need you"...this ain't it.


----------



## J7SC

dr/owned said:


> So basically a Zotac Amp Extreme that they probably would have charged $1900 for being "Ultra".
> 
> Plus this ridiculously over-length PCB:
> 
> View attachment 2581545
> 
> 
> If their goal of putting this out to reviewers was to have a swell of "oh noes EVGA please come back we need you"...this ain't it.


On the one hand, it would be nice to have that 12VHPWR connector on the side (less bending & twisting); on the other hand, the pic shows how ridiculous that actually would have been - never mind that the behemoth air-cooled monsters are already too wide to fit into some cases, as they stick out way too far on the right compared to most of their accompanying PCBs (not ^that one though  ). There are also power delivery / trace-length and inherent PCB-flex concerns.


----------



## Benni231990

@zhrooms

Thank you so much for all the new BIOS files, but do you have a source for the Neptun OC 630W BIOS?

Also, about the flash manual I asked you for: it is super easy. The command is nvflash64 -6 XXX.rom - after that, confirm with Y twice and the flash is done.

XXX stands for the BIOS file name.

So if you want, you can add the command to the first page for all the others who don't know how to flash their 4090.
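Putting those steps together, a full session might look like this (XXX.rom stays a placeholder for your actual BIOS file, the --save backup step is optional but sensible, and flashing always carries some risk, so treat this as a sketch rather than a guarantee):

```shell
# Run from an elevated command prompt, in the folder containing
# nvflash64 and the BIOS file you downloaded.

# Optional: back up the card's current BIOS first.
nvflash64 --save backup.rom

# Flash the new BIOS; -6 allows overriding a PCI subsystem ID mismatch.
nvflash64 -6 XXX.rom

# Confirm the two prompts with Y, then reboot once the flash completes.
```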


----------



## mirkendargen

mirkendargen said:


> I went ahead and did it because why not lol.
> View attachment 2580981


I joked about putting thermal pads behind the power connector...but after adding a couple slow quiet fans to move air around the backplate I just left Quake 2 RTX running pegging the power limiter and it made a serious difference. The connector is now only warm, I'd say 40C or less. Before it wasn't too hot to hold, but significantly warmer.


----------



## Mad Pistol

mirkendargen said:


> I joked about putting thermal pads behind the power connector...but after adding a couple slow quiet fans to move air around the backplate I just left Quake 2 RTX running pegging the power limiter and it made a serious difference. The connector is now only warm, I'd say 40C or less. Before it wasn't too hot to hold, but significantly warmer.
> 
> View attachment 2581554


Is Quake II RTX considered the most taxing game to run on a 4090?


----------



## mirkendargen

Mad Pistol said:


> Is Quake II RTX considered the most taxing game to run on a 4090?


At 200% resolution scaling it keeps the power limiter pegged 100% of the time; you can't really do more than that.


----------



## Mad Pistol

mirkendargen said:


> At 200% resolution scaling it keeps the power limiter pegged 100% of the time, you can't really do more than that.


Dang, that's brutal. Very nice!


----------



## Arizor

Yep, Quake 2 RTX is the best way to test power and temps - it loads into a really taxing environment right away.


----------



## Sayenah

J7SC said:


> On the one hand, it would be nice to have that 12VHPWR connector on the side (less bending & twisting), on the other hand, the pic shows how ridiculous that actually would have been - never mind that the behemoth-air-cooled monsters are already too wide to fit into some cases as they stick out way too far on the right compared to most of their accompanying PCBs (not ^that one though  ). Also - power delivery / traces length and inherent 'pcb flex'


the whole bend-gate thing is overblown. Bending isn't the problem; not seating the connector all the way in has been the problem. Pure and simple.
I mean, sure, if someone is bending it like a crazy person then that is an issue, but I don't think that is true in the cases reported here.


----------



## Arizor

Sayenah said:


> the whole bend-gate thing is overblown. Bending isn’t the problem, not connecting the connector all the way in has been the problem. Pure and simple.
> I mean, sure if one is bending it like a crazy person then that is an issue, but I don’t think that is true in the reported cases here.


Yep I'm increasingly convinced this is the case. I think poor QC has let through some adaptors that are _really_ hard to push in all the way, and the small socket itself already makes it quite a firm _push_.

I can imagine a lot of people give it one good push, it looks pretty much in, so they don't want to risk pushing harder because, well, $2k graphics card. So they leave it, and whoosh.

They definitely need to redesign so it's a clearer, easier connection between plug and socket. More haptic feedback, perhaps some clear indentations or colored marking etc.


----------



## mirkendargen

Arizor said:


> Yep I'm increasingly convinced this is the case. I think poor QC has let through some adaptors that are _really_ hard to push in all the way, and the small socket itself already makes it quite a firm _push_.
> 
> I can imagine a lot of people give it one good push, it looks pretty much in, so they don't want to risk pushing harder because, well, $2k graphics card. So they leave it, and whoosh.
> 
> They definitely need to redesign so it's a clearer, easier connection between plug and socket. More haptic feedback, perhaps some clear indentations or colored marking etc.


This is where I'm leaning too. And the reason fewer 3rd party cables are melting is because the kind of people who are super aware and careful to plug their cable in all the way are also the people who instantly bought and are now using 3rd party cables instead of the adapter.


----------



## Arizor

mirkendargen said:


> This is where I'm leaning too. And the reason less 3rd party cables are melting is because the kind of people who are super aware and careful to plug their cable in all the way are also the people that instantly bought and are using 3rd party cables now instead of the adapter.


Yep precisely.


----------



## KingEngineRevUp

RageOfFury said:


> Hmmm....All of the reported failures on reddit (as of 7 Nov) use the double split terminal design. Both the Nvidia adapter and the MSI ATX 3.0 cable use double split. Other cables such as cablemod and moddyi use single split and thus far no reported issues with those. I've yet to check the 3090 Ti adapter to see what terminal type it uses. I bet it's single split.
> 
> Thoughts, all?


No FE has melted. It could be PCB-design specific. We don't know yet.


----------



## Damaged__

Mad Pistol said:


> Is Quake II RTX considered the most taxing game to run on a 4090?


Yes. It’s great for all around testing and great for memory temps/stability as well. I haven’t found a better game in that regard as of yet.

If your memory overclock can handle Quake RTX maxed at 4k, I’d wager it can just about pass in any other game.


----------



## ShadowYuna

My Bykski 4090 Gaming OC Block has arrived. Pretty good performance.


















+250 core / 1400 Memory Timespy Extreme









Core temp never exceeds 55 degrees. Very happy with the block; my Gaming OC also has no coil whine, which makes it even better.


----------



## Sayenah

I have a question: my Strix 4090 has never drawn more than 510W in Port Royal. My best score is 28124.

I thought I should be hitting above 550W, no?

And yes, MSI AB beta 2, OC locked at 210 (1.1V) and memory at 1710.


----------



## LunaP

ShadowYuna said:


> My Bykski 4090 Gaming OC Block has arrived. Pretty good performance.
> View attachment 2581565
> 
> 
> View attachment 2581566
> 
> 
> +250 core / 1400 Memory Timespy Extreme
> View attachment 2581567
> 
> 
> Core Temp never exceed 55 degree. Very happy with the block also my Gaming OC does not have coil whine so this is my best satisfaction.


Beautiful clean build on that, I still need to get around to redoing mine.

Getting 22-24C idle temps with my card, which is nice. Hopefully Amazon delivers my cable before Cablemod responds to my email from 2 weeks ago.


















Before and after made a huge diff lol


----------



## J7SC

LunaP said:


> Beautiful clean build on that, I still need to get around to redoing mine.
> 
> getting 22-24 idle temps w/ my card which is nice, hopefully amazon delivers my cable before cablemod responds to my email from 2 weeks ago.
> 
> View attachment 2581569
> 
> View attachment 2581568
> 
> 
> Before and after made a huge diff lol


HEDT, GentleTyphoon fans - what's not to like?!

---

...does anyone have any feedback on the new NVidia 526.47 driver, for example when compared to 522.25, re: performance and compatibility?


----------



## mirkendargen

J7SC said:


> ...does anyone have any feedback on the new NVidia 526.47 driver, for example when compared to 522.25, re. performance and compatibility ?


Unusable for me at least. My second monitor randomly stops getting a signal with it. The hotfix driver does the same thing.


----------



## KingEngineRevUp

LunaP said:


> Beautiful clean build on that, I still need to get around to redoing mine.
> 
> getting 22-24 idle temps w/ my card which is nice, hopefully amazon delivers my cable before cablemod responds to my email from 2 weeks ago.
> 
> View attachment 2581569
> 
> View attachment 2581568
> 
> 
> Before and after made a huge diff lol


What cable did you buy from Amazon?


----------



## ArcticZero

Looks like it's possible local suppliers here will get a restock of the Suprim X Liquid before the TUF/Strix, so I might just go with that since a 600w BIOS exists. My only question is if there are any custom blocks available for it yet other than the one EK has on preorder. I would love anything except EK at this point, honestly.


----------



## pt0x-

Has anyone received a waterblock for the Strix 4090 already?
And does anyone know the thermal pad thickness(es) for this card (stock ofc).


----------



## pfinch

Is there an elegant way to control the RGB of the FE version? Don't want to use bloatware like iCUE, X1 etc. 
Just using Afterburner


----------



## mattskiiau

pfinch said:


> Is there an elegant way to control the RGB of the FE version? Don't want to use bloatware like iCUE, X1 etc.
> Just using Afterburner


Haven't tried it but I believe OpenRGB added FE support recently if you wanna give it a go


----------



## alasdairvfr

J7SC said:


> ...does anyone have any feedback on the new NVidia 526.47 driver, for example when compared to 522.25, re. performance and compatibility ?


Maybe I'm in the minority - but I'm not noticing much. Then again, I'm not playing the games that are affecting people. Poised to roll back at the first sign of trouble.


----------



## sneida

vigorito said:


> disable HW accelaration in chrome,and go back to 522.25,new driver is a mess


Is the freezing-on-idle issue also related to the driver? Just experienced that one now too...

So, in summary: gaming is all great, but the PC freezes from time to time when watching YouTube videos and at idle.


----------



## rahkmae

Start replacement connectors now


----------



## Artjsalina5

Keep getting a Board ID Mismatch in nvflash - how do you guys get past this? 4090 FE


----------



## gamervivek

mattskiiau said:


> I *think *I've figured out my problem of this random 600w-650w spike on a 480w Trio:
> 
> View attachment 2581141
> 
> 
> 
> For whatever reason, setting core voltage to 100% makes NVVDD output randomly spike to 600w-650w. This happens to me in both simply opening COD MW2 or running TSE.
> I wonder if this is because my Trio has bad VRMs? I'm not too knowledgeable about this stuff.
> 
> That being said, I've decided to play around with undervolting and it has significantly lowered that spike when doing the the above two tasks.
> Strangely, I'm still able to pass TSE and play COD MW2 for a few hours without crashing when on 3000mhz @ 1.025v.
> 
> Going to do more testing with undervolting under different loads to find a sweet spot.
> 
> View attachment 2581142


Are you sure those are spikes? Nvidia pushed down the transients this gen, and I'm getting similarly high reported numbers (>500W) while the actual power use is limited to 300W.


----------



## yzonker

Artjsalina5 said:


> Keep getting Board ID Mismatch on nvflash, how do you guys get passed this? 4090 FE


You can't. The FE isn't flashable, just like the 30 series.


----------



## Artjsalina5

yzonker said:


> You can't. FE isn't flashable just like 30 series.


Damn, ok. Thanks


----------



## dante`afk

KingEngineRevUp said:


> Thanks for replying, so the data points are:
> 
> 1. For 323.5 W Board, Chip Power Draw 249.34, Delta T = 13.7C
> 2. For 409.709 W, Chip Power Draw 339.88, Delta T = 20.3C
> 
> Optimus claims their block is going to do Chip Power Draw 515W with Delta T = 19C.
> 
> Hmm


It all depends on the cooling. Over the weekend I managed to draw 570W, and the delta was 20C as well.


----------



## J7SC

dante`afk said:


> all depends on the cooling, over the weekend I managed to draw 570w and the delta was 20c as well.


Looking at hotspot temps as a _percentage_ over general GPU temp, rather than as absolute values, might also make sense, especially with cards capable of close to 600W.


----------



## N19htmare666

Benni231990 said:


> @zhrooms
> 
> Thank you so much for all the new BIOS but have you a source for the Neptun OC 630watt BIOS?


I too am after this. That BIOS with the MSI liquid suprim X would be what dreams are made of  




Damaged__ said:


> Yes. It’s great for all around testing and great for memory temps/stability as well. I haven’t found a better game in that regard as of yet.
> 
> If your memory overclock can handle Quake RTX maxed at 4k, I’d wager it can just about pass in any other game.


Memory = RAM or VRAM?


----------



## Sheyster

Artjsalina5 said:


> Damn, ok. Thanks


The FE has a 600w max PL BIOS. There is no need to flash it even if you put a block on it.


----------



## N19htmare666

Benni231990 said:


> @zhrooms
> 
> Thank you so much for all the new BIOS but have you a source for the Neptun OC 630watt BIOS?


If we can find the 630w BIOS I think I'll be having this MSI Suprim Liquid X RTX 4090 💦 | 14th Nov dispatch | eBay


----------



## newls1

N19htmare666 said:


> If we can find the 630w BIOS I think I'll be having this MSI Suprim Liquid X RTX 4090 💦 | 14th Nov dispatch | eBay


that was in stock at my local MC yesterday....


----------



## N19htmare666

newls1 said:


> that was in stock at my local MC yesterday....


Yep it feels like it's just the UK that's as dry as the desert. I've been looking every day for nearly a month for the liquid suprim or waterforce. It's painful.


----------



## changboy

*J7SC*
Did you pay import duty on your Bykski block?


----------



## KingEngineRevUp

We can get an idea of how the alphacool blocks will perform. 

410W board, 340W chip, Delta was 17-18C over water. He doesn't state the water temperature, but you can see the card is idling at 30C so the water was possibly at 28C-29C. So this is all a rough estimate.


----------



## mirkendargen

KingEngineRevUp said:


> We can get an idea of how the alphacool blocks will perform.
> 
> 410W board, 340W chip, Delta was 17-18C over water. He doesn't state the water temperature, but you can see the card is idling at 30C so the water was possibly at 28C-29C. So this is all a rough estimate.


Factory paste jobs can be questionable too.


----------



## cletus-cassidy

I have an extra Bykski Strix/Tuf waterblock (no RGB). Long story, but ordered an extra because our neighbors took the first package and only told me a week later. Looking to get what I paid for it ($127.75) and can ship immediately. PM me if interested.


----------



## PhuCCo

ShadowYuna said:


> My Bykski 4090 Gaming OC Block has arrived. Pretty good performance.
> View attachment 2581565
> 
> 
> View attachment 2581566
> 
> 
> +250 core / 1400 Memory Timespy Extreme
> View attachment 2581567
> 
> 
> Core Temp never exceed 55 degree. Very happy with the block also my Gaming OC does not have coil whine so this is my best satisfaction.


Looks great. Did you use the included thermal pads or different ones? 1.5mm? I am going to try one of these blocks


----------



## J7SC

...just waiting now for the Cablemod delivery - seems to be 'stuck' at Fedex' Memphis location, though still two days before promised delivery date. Then, I will mount the waterblock (hopefully on the weekend) as I have everything else ready to go...even rummaged around in our 'tech pantry' and found these two unused heatsinks for the backplate 🥶...


----------



## Sheyster

J7SC said:


> ...just waiting now for the Cablemod delivery - seems to be 'stuck' at Fedex' Memphis location, though still two days before promised delivery date.


Mine just left China via FedEx. For some reason it sat there for several days before FedEx picked it up. I'm hoping to have it by Friday.


----------



## KingEngineRevUp

mirkendargen said:


> Factory paste jobs can be questionable too.


He takes it apart, it doesn't look that bad.


----------



## GRABibus

Received today by FedEx:


----------



## yzonker

I made this run last night. It artifacted very briefly (maybe 5-10s) and the score jumped about 100pts. If I hadn't been watching carefully, I would not have even noticed.









Result not found







www.3dmark.com





So I see no way to really weed this out completely. I did see a comment on another forum saying UL is considering requiring ECC, but no idea how serious they are. Would probably mean deleting the entire database of 4090 scores.


----------



## Laithan

J7SC said:


> ...just waiting now for the Cablemod delivery - seems to be 'stuck' at Fedex' Memphis location, though still two days before promised delivery date.
> 
> EDIT: The Corsair cable is the wrong one.. it does not have sense pins so it should only be used for the 3090Ti. It still serves to show the cabe differences just know that if you have a 4090 you should order the version with the 2 sense pins.


Yup, they are rolling in - I just got mine. I was waiting to compare the Corsair solution to the CableMod solution. The CableMod has populated sense pins. Corsair = (2) PSU / CableMod = (4) PSU.


----------



## Gadfly

I am looking for a water block for the Gigabyte 4090 WindForce; anyone know of one?


----------



## TSportM

Hello 

what are the benefits of flashing the Suprim X Liquid 600W BIOS on my Suprim X?

cheers


----------



## dr/owned

Gadfly said:


> I am looking for a water block for the Gigabyte 4090 WindForce; anyone know of one?


You might be better off returning it if you want to waterblock. No blocks announced.









RTX 4090 Block Compatibility


RTX 4090 Models,Blocks Available/Upcoming,Is Reference,PCB ,Compatible Blocks,Dimensions (mm),Notes NVIDIA Founders Edition,Yes,No,<a href="https://www.techpowerup.com/review/nvidia-geforce-rtx-4090-founders-edition/4.html">PCB</a>,<a href="https://www.ekwb.com/shop/ek-quantum-vector2-fe-rtx-409...




docs.google.com


----------



## Sayenah

yzonker said:


> I made this run last night. It artifacted very briefly (maybe 5-10s). Score jumped about 100pts. If I hadn't been watching it carefully, would not have even noticed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Result not found
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> So I see no way to really weed this out completely. I did see a comment on another forum saying UL is considering requiring ECC, but no idea how serious they are. Would probably mean deleting the entire database of 4090 scores.


Is this under water?

Great score!! I am stuck at 28124


----------



## newls1

N19htmare666 said:


> Yep it feels like it's just the UK that's as dry as the desert. I've been looking every day for nearly a month for the liquid suprim or waterforce. It's painful.


Why? Get a normal card and put a real waterblock on it, not that AIO garbage.


----------



## yzonker

Sayenah said:


> Is this under water?
> 
> Great score!! I am stuck at 28124


Yea, although I only gained about 100 pts with the block. Unfortunately minimal gains for this gen.


----------



## mirkendargen

yzonker said:


> Yea, although I only gained about 100 pts with the block. Unfortunately minimal gains for this gen.


The gains are the sound of silence for me! Remote rads are the best.


----------



## yzonker

mirkendargen said:


> The gains are the sound of silence for me! Remote rads are the best.


Yea I'm enjoying the low noise and lots of space in my case again. Haven't given up on the chiller yet. I've got a couple of ideas. An XOC bios that keeps the mem at full speed would be nice.


----------



## changboy

I just paid $43.08 in duty for my Bykski block from China. I always thought we didn't have duty on goods from China, lool.


----------



## Sheyster

Laithan said:


> Yup, they are rolling in I just got mine. I was waiting to compare the Corsair solution to the CableMod solution. The CableMod has populated sense pins. Corsair = (2) PSU / CableMod = (4) PSU.
> 
> View attachment 2581654
> 
> 
> View attachment 2581656


Sooo which one are you actually going to use? 2 x 8-pin Corsair or 4 x 8-pin Cablemod?


----------



## GQNerd

Guess we're gonna have to start recording our benchmark sessions to prove no artifacting.. lol

Clean PR 28,866 Run






RIP Youtube compression


----------



## doom3crazy

Sheyster said:


> Sooo which one are you actually going to use? 2 x 8-pin Corsair or 4 x 8-pin Cablemod?


I feel like, regardless of sense pins, at least for now, if the Corsair is rated for 600W and has no issues, I'd take the Corsair simply because of less case clutter, haha.


----------



## doom3crazy

Speaking of cables, have we got any new word on the cablemod 90 degree/180 degree adapters yet as far as release?


----------



## LuckyImperial

newls1 said:


> why, get a normal card and put a real waterblock on it, not that AIO garbage


I didn't do a custom loop because more $$. The Liquid X actually has pretty dang good thermals with the 240mm rad, but it should have a 360mm like the Gigabyte Aurora. Then again, it's only a 530W max TDP vs the 600W Giga.

Still, I'm stoked this card exists because it was a drop in for my Lian Li Q58.

Another interesting note...it _seems_ like GDDR6X OCs a little better when warm, and this card has warm mem chips. I can't actually speak to the validity of GDDR6X overclocking better when hot, but this card does have warmer memory chips than some of the other cards...which certainly could be undesirable for some.


----------



## Glottis

Laithan said:


> Yup, they are rolling in I just got mine. I was waiting to compare the Corsair solution to the CableMod solution. The CableMod has populated sense pins. Corsair = (2) PSU / CableMod = (4) PSU.
> View attachment 2581653
> 
> View attachment 2581654
> 
> View attachment 2581655
> 
> View attachment 2581656
> 
> View attachment 2581657
> 
> View attachment 2581658
> 
> View attachment 2581659


Why did you buy the old Corsair 12-pin cable that was intended for the 3090Ti? It has no sense pins. Your GPU will be locked to 450W, or may even refuse to work at all.

You need 16 pin cable https://www.corsair.com/12vhpwr


----------



## mirkendargen

FYI, there are only two "sense pins". There are 4 "sideband pins", two of which are the sense pins. The other two are optional and the GPU doesn't care about them.






ATX 3.0 Power Supply Connectors and Pinouts


Intel's ATX 3.0 specification, connectors and pinouts. Diagram, pinout and wire colors of PCIe 12VHPWR auxiliary connector.




www.smpspowersupply.com
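To make that concrete: per the ATX 3.0 / PCIe CEM 5.0 table, each sense pin is either grounded or left open inside the cable, and the combination tells the card what the cable is rated for. A rough sketch of that mapping (values taken from the public spec table; double-check against the spec itself before relying on them):

```python
# 12VHPWR sense-pin encoding, per the ATX 3.0 / PCIe CEM 5.0 spec table.
# Each sense pin is grounded ("gnd") or not connected ("open") inside the
# cable; the combination advertises the cable's sustained power rating.
SENSE_TABLE = {
    ("gnd", "gnd"): 600,   # full-rated 600 W cable
    ("gnd", "open"): 450,
    ("open", "gnd"): 300,
    ("open", "open"): 150,
}

def rated_watts(sense0: str, sense1: str) -> int:
    """Return the sustained power rating advertised by (SENSE0, SENSE1)."""
    return SENSE_TABLE[(sense0, sense1)]

# A cable with both sense pins grounded advertises the full 600 W:
print(rated_watts("gnd", "gnd"))
```

Both pins open maps to the lowest tier, which is why a cable with unpopulated sense pins limits how much power the card will draw.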


----------



## Glottis

mirkendargen said:


> FYI, there are only two "sense pins". There are 4 "sideband pins", two of which are the sense pins. The other two are optional and the GPU doesn't care about them.
> 
> 
> 
> 
> 
> 
> ATX 3.0 Power Supply Connectors and Pinouts
> 
> 
> Intel's ATX 3.0 specification, connectors and pinouts. Diagram, pinout and wire colors of PCIe 12VHPWR auxiliary connector.
> 
> 
> 
> 
> www.smpspowersupply.com


Is this post intended for me? I was just pointing out that person bought incorrect Corsair cable not compatible with the 4090.


----------



## mirkendargen

Glottis said:


> Is this post intended for me? I was just pointing out that person bought incorrect Corsair cable not compatible with the 4090.


No I just mean in general, multiple people have said "only 2 of the 4 sense pins are wired" when talking about various cables/adapters.

It needs to be a 12VHPWR connector with the 4pin sideband area or it plain won't physically fit, regardless of what's wired or not. You're totally right about that.


----------



## Zero989

LuckyImperial said:


> I didn't do a custom loop because more $$. The Liquid X actually has pretty dang good thermals with the 240mm rad, but it should have a 360mm like the Gigabyte Aurora. Then again, it's only a 530W max TDP vs the 600W Giga.
> 
> Still, I'm stoked this card exists because it was a drop in for my Lian Li Q58.
> 
> Another interesting note...it _seems _like GDDR6X OC's a little better when warm, and this card has warm mem chips. I can't actually speak to the validity of GDDR6X hot OC's, but this card does have warmer memory chips than some of the other cards....which certainly could be not desirable for some.


The Waterforce 4090 does not have a fan on the GPU. 360mm means less compatibility.


----------



## ShadowYuna

PhuCCo said:


> Looks great. Did you use the included thermal pads or different ones? 1.5mm? I am going to try one of these blocks


I used the thermal pads it came with. Nothing wrong with them.
I've heard that 1.5mm pads from other brands should work as well.


----------



## bmagnien

Folks with the Giga OC Bykski block: @J7SC @mirkendargen @LunaP - what are your deltas between water temp and core temp at full load (like 550W) with an 'every-day' fan curve? I'm at about 20C, with 35C water and 55C core. That is with LM on the core, thermal putty on the front VRMs and mem, putty/pads on the back, and an mp5works cooler on the backplate. At full pump/fans I can get the water-to-core delta down to 15C at max wattage. Mem temps are stupidly low (mem junction at 40C) and the hotspot delta is good too, max 10C above core. I just wasn't expecting a 20C delta considering my 3090 with an EK block was about 10C - but maybe the 4090s are just more thermally dense? I did gain about 3-4 bins on average across all benchmarks and improved my scores across the board, so I'm not upset; I just want to make sure that delta isn't indicative of a bad mount.


----------



## LuckyImperial

Zero989 said:


> The Waterforce 4090 does not have a fan on the GPU. 360mm means less compatibility.


Yeah, I'm grateful for the 240mm rad for its compatibility. 

The TechPowerUp teardown has some good snapshots of the AIO block. The block goes over the memory chips, but doesn't have the kind of water coverage an aftermarket block would. The VRMs are cooled by the single board fan.

I'm curious what the Giga Waterforce block looks like. I bet it's more of a full-coverage block over the VRMs.


----------



## Panchovix

KedarWolf said:


> You can't change the Negative LOD Bias settings, all the rest you can.
> 
> Try Nvidia Settings these in the Imgur link. The Imgur link is just a preview. You need to actually click on the thumbnail to see everything.
> 
> And use Nvidia Inspector, Run As Admin, enable unknown settings in the top menu, do those tweaks too.
> 
> Releases · Orbmu2k/nvidiaProfileInspector
> Contribute to Orbmu2k/nvidiaProfileInspector development by creating an account on GitHub.
> github.com
> 
> http://imgur.com/a/HuzyQj2


Hyper late for this, but thanks for the tweaks, there are a ton of them that I didn't know lol


----------



## J7SC

bmagnien said:


> folks with the giga oc bykski block: @J7SC @mirkendargen @LunaP what are your deltas between water temp and core temp at full load, like 550w, with a ‘every-day’ fan curve. I’m at about 20c with 35 water and 55 core. That is with LM on core, thermal putty on front vrms and mem, putty/pads on back and an mp5works cooler on the backplate. At full pump/fans I can get the delta between water and core down to 15c at max wattage. Mem temps are stupidly low (mem junction at 32c) and hotspot delta is good also max 10c above core. Just wasn’t expecting a 20c delta considering my 3090 with ek block was about 10c - but maybe the 4090s are just more thermally dense? I did gain about 3-4 bins on average across all benchmarks, and improved my scores across the board - so I’m not upset, just want to make sure that delta isn’t indicative of a bad mount.


...fyi, I haven't installed my waterblock yet; it just arrived yesterday...also waiting for new 12VHPWR cable(s)...probably get to it on the weekend (I hope)


----------



## doom3crazy

Glottis said:


> Why did you buy the old Corsair 12pin cable which was intended for 3090Ti ? It has no sense pins. Your GPU will be locked to 450W or even refuse to work at all.
> 
> You need 16 pin cable https://www.corsair.com/12vhpwr


That's the one I'm waiting on. It's been out of stock for what feels like forever now. Apparently Corsair will send me one for free under my PSU warranty, but they couldn't give me any idea of when they'll be back in stock.


----------



## dr/owned

bmagnien said:


> folks with the giga oc bykski block: @J7SC @mirkendargen @LunaP what are your deltas between water temp and core temp at full load, like 550w, with a ‘every-day’ fan curve. I’m at about 20c with 35 water and 55 core. That is with LM on core, thermal putty on front vrms and mem, putty/pads on back and an mp5works cooler on the backplate. At full pump/fans I can get the delta between water and core down to 15c at max wattage. Mem temps are stupidly low (mem junction at 32c) and hotspot delta is good also max 10c above core. Just wasn’t expecting a 20c delta considering my 3090 with ek block was about 10c - but maybe the 4090s are just more thermally dense? I did gain about 3-4 bins on average across all benchmarks, and improved my scores across the board - so I’m not upset, just want to make sure that delta isn’t indicative of a bad mount.


A bad mount would probably be way more blatant than that. You should probably tighten up the screws a little more now that everything has heat cycled.


----------



## Laithan

Glottis said:


> Why did you buy the old Corsair 12pin cable which was intended for 3090Ti ? It has no sense pins. Your GPU will be locked to 450W or even refuse to work at all.
> 
> You need 16 pin cable https://www.corsair.com/12vhpwr


Thank you for pointing this out, this was my mistake. I actually had the correct cable at first, but since I had upgraded the PSU cables to the sleeved version, I was looking for one with sleeves so they would match. At least I can use it for my 3090 Ti.


----------



## mirkendargen

bmagnien said:


> folks with the giga oc bykski block: @J7SC @mirkendargen @LunaP what are your deltas between water temp and core temp at full load, like 550w, with a ‘every-day’ fan curve. I’m at about 20c with 35 water and 55 core. That is with LM on core, thermal putty on front vrms and mem, putty/pads on back and an mp5works cooler on the backplate. At full pump/fans I can get the delta between water and core down to 15c at max wattage. Mem temps are stupidly low (mem junction at 40c) and hotspot delta is good also max 10c above core. Just wasn’t expecting a 20c delta considering my 3090 with ek block was about 10c - but maybe the 4090s are just more thermally dense? I did gain about 3-4 bins on average across all benchmarks, and improved my scores across the board - so I’m not upset, just want to make sure that delta isn’t indicative of a bad mount.


18C for me, but where you measure matters and all that so I'd say we're in the same place. Considering Optimus says their block is [email protected], I think it's just a fact of life with Ada dies and we're doing everything right.


__ https://twitter.com/i/web/status/1584950589393293312


----------



## GRABibus

Laithan said:


> Yup, they are rolling in I just got mine. I was waiting to compare the Corsair solution to the CableMod solution. The CableMod has populated sense pins. Corsair = (2) PSU / CableMod = (4) PSU.
> View attachment 2581653
> 
> View attachment 2581654
> 
> View attachment 2581655
> 
> View attachment 2581656
> 
> View attachment 2581657
> 
> View attachment 2581658
> 
> View attachment 2581659


Which one do you recommend, then?


----------



## Zero989

yzonker said:


> I made this run last night. It artifacted very briefly (maybe 5-10s). Score jumped about 100pts. If I hadn't been watching it carefully, would not have even noticed.
> 
> 
> Result not found
> www.3dmark.com
> 
> So I see no way to really weed this out completely. I did see a comment on another forum saying UL is considering requiring ECC, but no idea how serious they are. Would probably mean deleting the entire database of 4090 scores.


Artifacts are fun 

I scored 29 135 in Port Royal
Intel Core i7-13700KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
www.3dmark.com


----------



## bmagnien

mirkendargen said:


> 18C for me, but where you measure matters and all that so I'd say we're in the same place. Considering Optimus says their block is [email protected], I think it's just a fact of life with Ada dies and we're doing everything right.
> 
> 
> __ https://twitter.com/i/web/status/1584950589393293312


Word. Yah if I’m getting optimus’ own advertised delta from a bykski I’ll call that a win.


----------



## J7SC

bmagnien said:


> Word. Yah if I’m getting optimus’ own advertised delta from a bykski I’ll call that a win.


Especially for US$ 122 (+shipping)


----------



## Azazil1190

My second 4090 arrived today.
This is for my second build.
It's a good sample, much better than my TUF OC:
+315 core / +1650 memory is stable in Superposition 8K.
More tests tomorrow 

So happy with the Giga OC.


----------



## Antsu

God damnit you guys with your Bykski blocks. My card runs more than well on air and I was 100% sure I'd patiently wait for Aquacomputer, but I just finished cleaning + re-doing my GPU loop, the itch to get a card in there is real and all these watercooling posts are not helping that. 
If I didn't have to pay import fees I'd just pull the trigger, but after shipping and import tax it's about ~180€, which is probably not that far from a no-backplate AC block... It would be cool to be able to compare both blocks on the same loop, especially since I personally don't believe that Bykski can match Aquacomputer / Optimus cooling performance, but I'd be glad to be proven wrong. The whole 2GB GDDR6X temperature scaling makes it even more complicated, as one of the reasons that I am such an Aquacomputer fanboy is their direct contact memory cooling, but if (when) that doesn't matter one bit this generation, the Bykski might not be that bad... Decisions, decisions...


----------



## Mad Pistol

The more time passes, the more the RTX 4090 appears to be the 8800 GTX of our time.

A Ray Tracing patch was released today for Forza Horizon 5, so I decided to run the benchmark at 4K, max settings, no DLSS.

And then you've got the RTX 3090, which gets 75 FPS at 4K, but that's with DLSS at Quality.

Forza Horizon 5 Benchmark Runs at [email protected] with Ray Tracing and DLSS on an RTX 3090
Forza Horizon 5 recently added ray tracing during races, DLSS 2.0 support, and more. How does it all run on a GeForce RTX 3090?
wccftech.com

The RTX 4090 is an absolute brute of a GPU.


----------



## Laithan

GRABibus said:


> Which one do you recommend then ?


I really don't have any recommendations at this point (I needed the 16-pin Corsair cable), but judging them side by side, the quality of the CableMod is clearly at the next level. Since my PSU is made by Corsair, I am sure Corsair provides a very suitable solution despite not "looking" as nice.

The main consideration may come down to using (*4*) 8-pin PSU connections (CableMod) vs. (*2*) 8-pin PSU connections (Corsair). I am sure neither is "wrong"; perhaps they are just two ways to provide the same end result. Corsair makes the PSU, so it is hard to question or criticize their decision to use only (2) connectors...


----------



## bmagnien

Azazil1190 said:


> My second 4090 arrived today .
> This is for my second build.
> Its a good sample much better than my tuf oc
> +315core +1650memory its stable to superposition 8k .
> More tests tomorrow
> View attachment 2581692
> 
> View attachment 2581694
> 
> So happy with giga oc.


GigaOCs are great cards. Just FYI, 1080p Extreme is gonna expose unstable overclocks much more than 8K, and Port Royal even more so. Topped 16k in 8K last night while testing out the next block


----------



## bmagnien

Some more numbers from the first night with the card under water:

8k SP:

1080P Extreme SP:

4k SP:
----------



## bmagnien

Mad Pistol said:


> The more time passes, the more the RTX 4090 appears to be the 8800 GTX of our time.
> 
> A Ray Tracing patch was released today for Forza Horizon 5, so I decided to run the benchmark at 4K, max settings, no DLSS.
> 
> View attachment 2581693
> 
> 
> 
> 
> And then you've got the RTX 3090 which gets 75 FPS at 4K, but that's with DLSS at Quality.
> 
> Forza Horizon 5 Benchmark Runs at [email protected] with Ray Tracing and DLSS on an RTX 3090
> Forza Horizon 5 recently added ray tracing during races, DLSS 2.0 support, and more. How does it all run on a GeForce RTX 3090?
> wccftech.com
> 
> The RTX 4090 is an absolute brute of a GPU.


Nice call - this could be a great benchmark for real world OC comparisons. Especially since it shows the settings used in the ending screenshot.


----------



## bmagnien

One last post, apologies for the flurry. This one shows the efficacy of the Bykski block: 16c cooler, but only 1 bin and ~100pts higher. It "feels" more consistently stable at that higher bin, but who knows.

Before:

After:
----------



## motivman

Azazil1190 said:


> My second 4090 arrived today .
> This is for my second build.
> Its a good sample much better than my tuf oc
> +315core +1650memory its stable to superposition 8k .
> More tests tomorrow
> View attachment 2581692
> 
> View attachment 2581694
> 
> So happy with giga oc.


****, that card is certified GOLDEN. How are people so lucky??? My two 4090's so far have been complete DUDS


----------



## Azazil1190

motivman said:


> ****, that card is certified GOLDEN. How are people so lucky??? my two 4090's so far have been complete DUDS


I risked it one more time, and truth is, I was thinking about you.
Finally I'm fully satisfied with this sample on the stock cooler


----------



## J7SC

Azazil1190 said:


> My second 4090 arrived today .
> This is for my second build.
> Its a good sample much better than my tuf oc
> +315core +1650memory its stable to superposition 8k .
> More tests tomorrow
> View attachment 2581692
> 
> View attachment 2581694
> 
> So happy with giga oc.


Another happy Gigabyte Gaming OC customer...And those hotspot temps at 573 W max  !


----------



## KedarWolf

Laithan said:


> I really don't have any recommendations at this point (I needed the 16-pin corsair cable) but judging side by side the quality of CableMods is clearly at the next level. Since my PSU is made by Corsair, I am sure that Corsair provides a very suitable solution despite not "looking" as nice.
> 
> The main consideration may come down to using (*4*) 8-pin PSU connections (CableMods) VS (*2*) 8-pin PSU connections (Corsair). I am sure neither is "wrong" and perhaps just two ways to provide the same end result. Corsair makes the PSU so it is hard to question or criticize their decision to use only (2) connectors...


"So my concern for max power per 8-pin connection is unfounded if this is accurate. With the quality Corsair AX1600i that I am using, each 8-pin can supply 288W for a max of 576W + the PCIe slot at 75W. I’m underestimating the capacity of the cable. It appears to be able to supply much more juice than I had originally anticipated. I just wanted confirmation of this before proceeding with any change in how I’m powering the card."


----------



## KedarWolf

Antsu said:


> God damnit you guys with your Bykski blocks. My card runs more than well on air and I was 100% sure I'd patiently wait for Aquacomputer, but I just finished cleaning + re-doing my GPU loop, the itch to get a card in there is real and all these watercooling posts are not helping that.
> If I didn't have to pay import fees I'd just pull the trigger, but after shipping and import tax it's about ~180€, which is probably not that far from a no-backplate AC block... It would be cool to be able to compare both blocks on the same loop, especially since I personally don't believe that Bykski can match Aquacomputer / Optimus cooling performance, but I'd be glad to be proven wrong. The whole 2GB GDDR6X temperature scaling makes it even more complicated, as one of the reasons that I am such an Aquacomputer fanboy is their direct contact memory cooling, but if (when) that doesn't matter one bit this generation, the Bykski might not be that bad... Decisions, decisions...


I'm holding out for an Optimus block.

My 3090 block was absolutely incredible. 🐺


----------



## doom3crazy

KedarWolf said:


> "So my concern for max power per 8-pin connection is unfounded if this is accurate. With the quality Corsair AX1600i that I am using, each 8-pin can supply 288W for a max of 576W + the PCIe slot at 75W. I’m underestimating the capacity of the cable. It appears to be able to supply much more juice than I had originally anticipated. I just wanted confirmation of this before proceeding with any change in how I’m powering the card."


Talking to a Corsair rep, they assured me that their high-power cable with the two 8-pin to 16-pin is more than capable of safely delivering 600W.


----------



## dansi

It seems the Nvidia, MSI and Asus 4090s have the best power stages because of the MP parts used? 

The UP9xxx power stages were not that good back in the 3000-series days, IIRC. They were semi-digital and had smaller tweaking ranges.

But do good power stages make a difference for the 4090?

I am thinking of a Zotac AE and Bykski block; that matchup is widely available now.


----------



## Panchovix

dansi said:


> It seems Nvidia, MSI and Asus 4090 have the best power stages because of the MP used?
> 
> The UP9xxx power stages were not that good during 3000 days IIRC. These were semi-digital and have lesser tweaking ranges.
> 
> But does a good power stages make a difference for 4090?
> 
> I am thinking of a Zotac AE and Byski block, this matchup is widely available now.


The VRM probably doesn't matter much. In my case, my TUF (non-OC) is a really, really bad overclocker, so I wouldn't recommend it (the non-OC at least; it has "good" VRMs but really poor binning. I think the OC one is fine though).

The Gigabyte Gaming OC seems to be one of the best overclockers for the price at the moment (even with "weak" VRMs); price aside, the Suprim X Liquid and Strix also seem really good (those two do have really good VRMs, though).


----------



## motivman

Azazil1190 said:


> I risk one more time and i was thinking about you the truth is.
> Finally im fully satisfied from the sample on stock cooler


congratulations. Hoping my 2nd 4090 FE that is coming in tomorrow is a silicon winner 🤞


----------



## mirkendargen

KedarWolf said:


> "So my concern for max power per 8-pin connection is unfounded if this is accurate. With the quality Corsair AX1600i that I am using, each 8-pin can supply 288W for a max of 576W + the PCIe slot at 75W. I’m underestimating the capacity of the cable. It appears to be able to supply much more juice than I had originally anticipated. I just wanted confirmation of this before proceeding with any change in how I’m powering the card."


Even if the cable has 4 connectors, I bet it's still the exact same number of wires and pins; they're just leaving most of them unpopulated in each connector. Meaning the exact same current is traveling through each wire and pin.
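Quick sanity math on why the connector count wouldn't change per-wire load, assuming the usual six 12V conductors in a 12VHPWR cable (the wire count is the standard connector layout; the load split is idealized as perfectly even):

```python
# If a 2-connector and a 4-connector adapter both terminate in the
# same six 12V conductors, the per-wire current is identical either way.
def amps_per_wire(watts: float, volts: float = 12.0, wires: int = 6) -> float:
    """Idealized per-conductor current, assuming an even split."""
    return watts / volts / wires

print(f"{amps_per_wire(600):.2f} A per conductor at 600W")  # ~8.33 A
```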


----------



## Antsu

KedarWolf said:


> I'm holding out for an Optimus block.
> 
> My 3090 block was absolutely incredible. 🐺


That's what I'm (supposed to be) doing, but unfortunately there's no Optimus here in the EU. Aquacomputer blocks have worked out great for me the last two generations, and they seem to have a new design for this gen, so I will have to try and resist the Bykski temptation.


----------



## bmagnien

Mad Pistol said:


> The more time passes, the more the RTX 4090 appears to be the 8800 GTX of our time.
> 
> A Ray Tracing patch was released today for Forza Horizon 5, so I decided to run the benchmark at 4K, max settings, no DLSS.
> 
> View attachment 2581693
> 
> 
> 
> 
> And then you've got the RTX 3090 which gets 75 FPS at 4K, but that's with DLSS at Quality.
> 
> Forza Horizon 5 Benchmark Runs at [email protected] with Ray Tracing and DLSS on an RTX 3090
> Forza Horizon 5 recently added ray tracing during races, DLSS 2.0 support, and more. How does it all run on a GeForce RTX 3090?
> wccftech.com
> 
> The RTX 4090 is an absolute brute of a GPU.


I'd propose running DLAA as the AA solution, as it takes advantage of the NV architecture and is arguably better visually than MSAA or TAA. That said, all other settings are the same as yours: no DLSS, everything maxed, 4K. I'd be interested in others posting their screenshots if they have the game (with the update), especially to see how it performs on some other CPUs:


----------



## mirkendargen

What kind of voltage drop are people seeing under max load (550W+)? I drop from pretty much right at 12V to 11.6V on an AX1600i.
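One way to put droop like that in perspective is to back out an effective supply-path resistance. This is crude: it assumes all the power flows through the 12VHPWR path and lumps PSU regulation in with cable and connector losses.

```python
# Estimate the effective supply-path resistance implied by an
# observed voltage droop under load. Rough approximation only:
# real droop mixes PSU load regulation with cable/connector losses.
def path_resistance_mohm(v_idle: float, v_load: float, power_w: float) -> float:
    """Effective resistance (milliohms) from idle/load voltages and load power."""
    amps = power_w / v_load                 # current at the loaded voltage
    return (v_idle - v_load) / amps * 1000  # ohms -> milliohms

print(f"{path_resistance_mohm(12.0, 11.6, 550):.1f} mOhm")  # ~8.4 mOhm
```

A few milliohms is in the ballpark of a long modular cable plus two connectors, so a 0.4V droop at ~47A isn't automatically a fault.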


----------



## GRABibus

Azazil1190 said:


> My second 4090 arrived today .
> This is for my second build.
> Its a good sample much better than my tuf oc
> +315core +1650memory its stable to superposition 8k .
> More tests tomorrow
> View attachment 2581692
> 
> View attachment 2581694
> 
> So happy with giga oc.


That’s a golden sample Gaming OC.
Mine takes a maximum of +240MHz on core in PR.

How much does your sample take in PR?


----------



## schoolofmonkey

Panchovix said:


> Probably VRM doesn't matter much, in my case my TUF (non-OC) it's a really, really bad overclocker so wouldn't recommend (the non-OC at least, it has "good" VRM but really poor binning) (I think the OC one is fine though)


My Galax 4090 SG can manage +225 on the core; I haven't tried pushing the memory past +1500 yet. It's meant to be one of the cheaper, slightly weaker cards with a 510W BIOS, but I did see 3GHz, so it's not too bad.


----------



## EEE-RAY

mirkendargen said:


> What kind of voltage drop are people seeing under max load (550W+)? I drop from pretty much right at 12V to 11.6V on an AX1600i.


I have a Strix 4090 and an AX1600i, and with the furry donut at 600W the "GPU Misc0" voltage is 11.8 and the "GPU unknown" voltage is 11.96. I am not sure which one is the 12VHPWR cable. Voltage at idle is 12.1


----------



## mirkendargen

EEE-RAY said:


> I have a strix 4090 and an AX1600i and with furry donut at 600W the "GPU Misc0" voltage is 11.8 and the "GPU unknown" voltage is 11.96. I am not sure which one is the 12VHPWR cable. Voltage at idle is 12.1


I'm pretty sure it's "unknown rail voltage", since "unknown rail power" is the big wattage. That's interesting that you don't drop nearly as much as me, what 12VHPWR cable are you using?


----------



## EEE-RAY

mirkendargen said:


> I'm pretty sure it's "unknown rail voltage", since "unknown rail power" is the big wattage. That's interesting that you don't drop nearly as much as me, what 12VHPWR cable are you using?


I am using the worst possible combination - the included octopus adapter with 4 PCIe 8-pins, some daisy-chained.


----------



## TSportM

TSportM said:


> Hello
> 
> what are the benefits of flashing the Suprim X Liquid 600W BIOS on my Suprim X?
> 
> cheers


any inputs please ?

cheers


----------



## SilenMar

Too bad - the Gigabyte Waterforce could've had the best performance compared to the other cards.


----------



## Azazil1190

GRABibus said:


> That’s a golden sample Gaming OC.
> Mine take maximum +240MHz on core in PR.
> 
> hiw much does your sample take in PR ?


Today I'm gonna try PR and the other 3DMark tests.
I guess in PR I can't go as high as in Superposition.
I think I'll be somewhere around +270-285 on core, but we'll see later for sure.
Anyway, the Gigas... yes, they are binned higher


----------



## Brandur

I have gotten my Inno3D 4090 iChill Black with full-cover waterblock, and the temperatures are insanely good. At full load the GPU sits at 51°, the VRAM at 60° and the hotspot at 61°. The card has zero coil whine, just some slight buzzing when under load. The AIO pump is dead silent, and the fans spin at 200rpm at idle and 600rpm at load. Very impressive card so far. I had a 4090 Suprim X, which had terrible coil whine and one fan that made terrible noises! I will look out for the Inno3D 4090 Ti, since I've never had a product from this manufacturer before, but the cooperation between them and Arctic is very nice and I am very happy. Was very disappointed with MSI though.


----------



## EEE-RAY

Sounds awesome! Just wondering - isn't the buzzing the definition of coil whine?

What is the longevity of these AIOs? I was very tempted by the MSI one, but went for an air cooler in the end just out of durability concerns. Where do you mount your AIOs for CPU and the GPU respectively?


----------



## N19htmare666

newls1 said:


> why, get a normal card and put a real waterblock on it, not that AIO garbage


A custom loop would not fit in the computer this is going in.


----------



## Brandur

EEE-RAY said:


> Sounds awesome! Just wondering - isn't the buzzing the definition of coil whine?
> 
> What is the longevity of these AIOs? I was very tempted by the MSI one, but went for an air cooler in the end just out of durability concerns. Where do you mount your AIOs for CPU and the GPU respectively?


I guess when you mount the radiator the right way and the pump doesn't get air, the pump should last a couple of years. Since I only keep my GPUs 1-2 years max, the longevity doesn't really bother me.

Edit: Mounted the CPU radiator at the top of the case and the GPU radiator at the side, since I have a Corsair 7000D, which has a lot of space for radiators.


----------



## Glottis

EEE-RAY said:


> Just wondering - isn't the buzzing the definition of coil whine?


Yup, buzzing is just coil whine at lower frequency (it changes based on power usage and framerate).


----------



## alasdairvfr

@Mad Pistol @bmagnien 

You guys have the 5800X3D so maybe a bit of an edge on the CPU side; here's me with the 5950X:

I was running a pretty steep OC on the GPU, +255 core and +1400 memory, so well over 3000 on core the whole time. Not sure how much this game responds to GPU OC. I'll have to test more.

For sure I have something causing stutters, possibly need to drop OC and try again


----------



## Azazil1190

motivman said:


> congratulations. Hoping my 2nd 4090 FE that is coming in tomorrow is a silicon winner 🤞


Hope this time to get lucky mate! 💪


----------



## alasdairvfr

Looks like Cyberpunk patch 1.6.1 came out, no DLSS 3 though. Looking forward to try it out on this game


----------



## dante`afk

20 minutes quake II RTX with Ekwb block


----------



## LuckyImperial

TSportM said:


> any inputs please ?
> 
> cheers


Not much. The 70W increase provides a small performance boost. It's worth it if you're chasing benchmark leaderboards, but pointless for gaming.


----------



## bmagnien

dante`afk said:


> 20 minutes quake II RTX with Ekwb block
> 
> View attachment 2581798


~17c delta between water and core - nice. That MO-RA is doing work though, as your water temp barely rises at all. But your wattage should be much higher if you're maxing out resolution with unlocked fps - closer to 575W instead of the 436 max / 325 average you're showing here.

Run furmark for 20 mins and let's see that delta and water temp 😆


----------



## bmagnien

alasdairvfr said:


> @Mad Pistol @bmagnien
> 
> You guys have the 5800X3D so maybe a bit of an edge on the CPU side, here's me with the 5950x:
> 
> View attachment 2581786
> 
> 
> I was running a pretty steep OC on the GPU, +255 core and +1400 memory so well over 3000 on core the whole time. Not sure how much this game responds to GPU OC. I'll have to test more.
> 
> For sure I have something causing stutters, possibly need to drop OC and try again


Nice! Yeah, mine was run with my max stable gaming OC as well; definitely think it's helping, plus the 5800X3D doesn't hurt either.


----------



## dante`afk

bmagnien said:


> ~17c delta between water and core - nice. That mora is doing work though as your water temp barely raises at all. But your wattage should be much higher if you’re maxing out resolution with unlocked fps - closer to 575w instead of the 436max,325 average you’re showing here.
> 
> run furmark for 20mins and let’s see that delta and water temp 😆


settings for furmark?


----------



## bmagnien

dante`afk said:


> settings for furmark?


Make sure gsync/vsync is off, use the settings below and then click 'GPU stress test'.
I'd be curious to see both full-blast fans/pumps and 'everyday sound-tolerable'


----------



## Benni231990

TSportM said:


> any inputs please ?
> 
> cheers


You get 600W instead of 520W like all the others such as the Strix, TUF, etc.


----------



## dante`afk

bmagnien said:


> Make sure gsync/vsync is off, use settings below and then click 'gpu stress test'
> i'd be curious to see both full blast fans/pumps and 'everyday sound-tolerable'
> View attachment 2581802


55c, I don't wanna kill my card


----------



## bmagnien

dante`afk said:


> 55c,i don't wanna kill my card


Lol yeah don’t blame you. 55c is right at where most people have seemed to hit at max wattage under water. Good mount and thermals


----------



## 8472

__ https://twitter.com/i/web/status/1590373773609410560


----------



## Panchovix

alasdairvfr said:


> @Mad Pistol @bmagnien
> 
> You guys have the 5800X3D so maybe a bit of an edge on the CPU side, here's me with the 5950x:
> 
> View attachment 2581786
> 
> 
> I was running a pretty steep OC on the GPU, +255 core and +1400 memory so well over 3000 on core the whole time. Not sure how much this game responds to GPU OC. I'll have to test more.
> 
> For sure I have something causing stutters, possibly need to drop OC and try again


Now RT is available in the benchmark? Welp, time to download it.

I'm doing some benchmarks on games at stock, undervolts, OCs, etc., so it's nice to have another one. (Though if my 5800X is too much of a bottleneck in Forza, the presets won't be that different from each other.)


----------



## LunaP

Antsu said:


> God damnit you guys with your Bykski blocks. My card runs more than well on air and I was 100% sure I'd patiently wait for Aquacomputer, but I just finished cleaning + re-doing my GPU loop, the itch to get a card in there is real and all these watercooling posts are not helping that.
> If I didn't have to pay import fees I'd just pull the trigger, but after shipping and import tax it's about ~180€, which is probably not that far from a no-backplate AC block... It would be cool to be able to compare both blocks on the same loop, especially since I personally don't believe that Bykski can match Aquacomputer / Optimus cooling performance, but I'd be glad to be proven wrong. The whole 2GB GDDR6X temperature scaling makes it even more complicated, as one of the reasons that I am such an Aquacomputer fanboy is their direct contact memory cooling, but if (when) that doesn't matter one bit this generation, the Bykski might not be that bad... Decisions, decisions...


A few people already posted reviews of it getting the same results as the Optimus blocks (over the past 15 pages at least). Given how the required parts for cooling have changed, I'm running way cooler on my Bykski than I was on my 2080 Ti with my XSPC blocks.


Also, I didn't realize cablemod.com was a Chinese site? Just got an update this morning that my cables are prepping to ship from China, though they might be delayed a week+ due to COVID-19. That sucks, but at least I finally got everything else. I had to use a PCIe riser to attach my USB card, since my 2080 Ti was a 3-slot and blocked the remaining port, so the USB card is just hanging outside the case....


----------



## bmagnien

Panchovix said:


> Now RT is available for the benchamrk? Welp, time to download it.
> 
> I'm doing some benchmarks on games on stock, undervolts, ocs, etc, so it's nice to have another one. (Though if my 5800X is too much of a bottleneck on Forza, each preset won't be that different of each other)


RT is available on your own car in game now instead of just in the viewing room. Gotta set all settings to absolute max though


----------



## bmagnien

8472 said:


> __ https://twitter.com/i/web/status/1590373773609410560


These are gorgeous - especially the white one. Doesn't look like they're actually up for sale on Phanteks' store though. Would go great with my all-white build, but I can't justify swapping out the Bykski at this point.


----------



## 8472

Oh boy...


__
https://www.reddit.com/r/nvidia/comments/yqbt95


----------



## bmagnien

8472 said:


> Oh boy...
> 
> 
> __
> https://www.reddit.com/r/nvidia/comments/yqbt95


Looks like it's the tolerances being too tight on either the cable or the port. My Nvidia 4-headed monster adapter didn't make any audible click when fully inserted in the GigaOC. My SilverStone native cable did make a barely audible, but repeatable, click. They need to adjust the tolerances so that all connections give audible/haptic feedback to let the user know it's fully inserted.


----------



## Glottis

When is it realistic to expect *Rev. 2* of these GPUs with connector issues resolved? Early 2023?


----------



## Pollomir

8472 said:


> Oh boy...
> 
> 
> __
> https://www.reddit.com/r/nvidia/comments/yqbt95


The only change is the protection for the sideband wires. Nothing to see here.


----------



## KingEngineRevUp

MX-6 new thermal paste looks pretty good.






It replaces MX-5. It's supposed to do well with pump-out. I just bought a tube. Kryonaut dries out too quickly for me; I'm looking to set a TIM and forget it.

TL;DW: MX-6 does 1-1.5C better than MX-5 and 5C better than MX-4.









Review : Arctic MX-6 Thermal paste - Page 4 of 5 - Overclocking.com


Test / Review of the MX-6 thermal paste from Arctic. A high performance thermal paste worthy successor to the MX-4.




en.overclocking.com


----------



## J7SC

mirkendargen said:


> What kind of voltage drop are people seeing under max load (550W+)? I drop from pretty much right at 12V to 11.6V on an AX1600i.


...11.6 V under max load is close to borderline, I was told. I had that with an older PSU when I put the 3090 Strix together and tested it, and then added a new Seasonic PX1300 which only ranges from 12.1 to 11.9 V. That said, if 11.6 V doesn't cause any issues for you, then I would not worry about it but re-test every once in a while.
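For context on how close that is to the limit, the sag is easy to put in percentage terms. A quick sketch (the ±5% tolerance on the 12V rail is from the ATX spec, and the 11.40V floor follows from it):

```python
# Rail droop as a percentage of nominal. ATX allows +/-5% on the 12V rail,
# so the floor is 11.40V; 11.6V under load is in spec but with thin margin,
# while a PSU holding 11.9V keeps far more headroom.
def droop_pct(nominal: float, loaded: float) -> float:
    """Percent the rail sags below nominal under load."""
    return (nominal - loaded) / nominal * 100

print(f"11.6V -> {droop_pct(12.0, 11.6):.1f}% droop")
print(f"11.4V -> {droop_pct(12.0, 11.4):.1f}% droop (ATX -5% floor)")
```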



N19htmare666 said:


> A custom loop would not fit in the computer this if going in.


...I have the opposite problem 🥴, per my original size-checking with my work-play build (which isn't the 'Tardis' kind). Besides, a w-block will help with the high hotspot under the big loads & high clocks this card is capable of...


----------



## Krzych04650

Just received my Inno3D X3 OC, so here are some first impressions. I got it second hand but it was slightly below MSRP and from an actual person, not a scalper. 

The card actually looks very nice. I am going to waterblock anyway, so I didn't think much about it, but they did a very good job and it is evident that some effort was put into styling and not making it look like a giant block of fins, it has a very nice shroud and looks quite good even despite its size. It is mostly plastic, but still, for an MSRP reference board model, this is good. 

And now here comes the best thing about it, and that is coil whine. It is just so, so much better than every GPU I've ever had. I am extremely sensitive to coil whine and even kept my PC outside of my room for a few years in the past, but this card is absolutely amazing in this regard. Sure if you put it at 1000FPS 99% load it is going to whine, no going around that, but as soon as FPS gets to reasonable levels, like 100-300, it is barely audible even with open case. Given that every single GPU I ever saw had massive coil whine by my standards and it was an absolute bane of mine in terms of building quiet system, I am absolutely shocked by this, I can build an actually quiet system now. I guess the reports of no coil whine from cards with this particular choice of components, so Inno3D and Zotac, turned out to be true. 

Performance wise it is obviously massive upgrade over 2080 Ti, I will post exact numbers against 380W 2080 Ti and 2080 Ti SLI later.

Power limit is 450W/450W unfortunately, so this needs to be flashed. Has anyone tried it yet on Inno3D cards? I've found only one report in this thread so far.

Also, a few things I want to mention for the 4090 in general: forcing SGSSAA with NVIDIA Inspector still works, Prismatik's screen grabber for my ambilight that uses a custom "illegal" NvFBC integration also still works, and refresh rate and Adaptive Sync range editing with CRU also still works for my monitor. So smooth sailing as far as the transition goes.


----------



## stahlhart

I'm not even sure what "socery" is.


----------



## mirkendargen

J7SC said:


> ...11.6 V under max load is close to borderline, I was told. I had that with an older PSU when I put the 3090 Strix together and tested it, and then added a new Seasonic PX1300 which only ranges from 12.1 to 11.9 V. That said, if 11.6 V doesn't cause any issues for you, then I would not worry about it but re-test every once in a while.


Yeah I didn't really like it either, that's why I was trying to see what "normal" is. I also didn't check what my 3090 did on this PSU beforehand, oh well.


----------



## WayWayUp

no 4080 thread?

looks like it OCs very well








NVIDIA GeForce RTX 4080 overclocking and 4K gaming benchmarks leaked - VideoCardz.com


NVIDIA RTX 4080 with OC We have some new synthetic tests featuring overclocked RTX 4080 graphics cards. The RTX 4080 will gain up to 9% performance, with overclocking and power limit increased. That’s a good score considering the test was on a semi-custom RTX 4080 model (with NVIDIA reference...




videocardz.com


----------



## Krzych04650

Flashed my Inno3D X3 OC from the 450W BIOS to the Gigabyte Gaming OC 600W one. It was semi-successful: the card still works properly, but the actual power limit seems to be 550W, not 600W; it starts throttling at 550.

Too bad, because the core on this card looks great. It can do almost 3200 at 1100mV if not power limited, and 3000 is stable at 1025mV. Memory rolls over past 23.5Gbps though.

3120 at 1075mV seems to be optimal for 550W. Not much of a gain over 3000 though: stabilizing 3000 and overclocking the memory brings around 8-9% vs the stock ~2770, but 3100 is only 1% on top of 3000.


----------



## GRABibus

Tested the Cablemod on my Giga Gaming OC on stock air cooler.
I also upgraded from Windows 10 to windows 11 this week end.
So fresh install.

=> +240MHz is my max "benchable" offset in PR (whether with the dongle that came with the card or the CableMod).
=> +285MHz is my max "benchable" offset in Superposition 8K.


Here is my best PR run this evening @ 21°C :

















I scored 28 929 in Port Royal


AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





Max power draw = 578W.

I noticed an improvement in the score of 70 points, which may be due to W11 instead of W10.

My Gaming OC is not as golden as the ones from @Azazil1190, but not too bad a sample, I would say 

I will receive the Corsair cable next week.


----------



## Nizzen

GRABibus said:


> Tested the Cablemod on my Giga Gaming OC on stock air cooler.
> I also upgraded from Windows 10 to windows 11 this week end.
> So fresh install.
> 
> => +240MHz is my max "Benchable" offset on PR (whatever dongle coming with the card or Cablemod).
> => +285MHz is my max "Benchable" offset on Superposition 8K.
> 
> 
> Here is my best PR run this evening @ 21°C :
> View attachment 2581842
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 929 in Port Royal
> 
> 
> AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> I noticed an improvment of the score by 70 points, which is maybe due to W11 instead of W10.
> 
> My gaming OC is not a Golden as the one's from @Azazil1190, but not too bad sample I would say
> 
> I will receive the Corsair cable on next week.


Did you try GPU scaling in PR with stock memory?


----------



## Zero989



GRABibus said:


> Tested the Cablemod on my Giga Gaming OC on stock air cooler.
> I also upgraded from Windows 10 to windows 11 this week end.
> So fresh install.
> 
> => +240MHz is my max "Benchable" offset on PR (whatever dongle coming with the card or Cablemod).
> => +285MHz is my max "Benchable" offset on Superposition 8K.
> 
> 
> Here is my best PR run this evening @ 21°C :
> View attachment 2581842
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 929 in Port Royal
> 
> 
> AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Max power draw = 578W.
> 
> I noticed an improvment of the score by 70 points, which is maybe due to W11 instead of W10.
> 
> My gaming OC is not a Golden as the one's from @Azazil1190, but not too bad sample I would say
> 
> I will receive the Corsair cable on next week.


Your 4090 is better than mine. Not golden but better than avg.


----------



## GRABibus

Nizzen said:


> Did you try gpuscaling in PR with stock memory?


What do you mean ?

EDIT : to increase only Core clock with Mem @ stock and see if the score follows ?


----------



## Nizzen

GRABibus said:


> What do you mean ?


How high is your PR score with core OC only?
Like 2950MHz-3000MHz-3050MHz-3100MHz.


----------



## GRABibus

Nizzen said:


> How high PR score with core oc only
> Like 2950mhz-3000mhz-3050mhz-3100mhz.


Not yet.
I will when I have more time.


----------



## newls1

My Giga Gaming OC 4090 made it to the house today, but I'm stuck at work till tomorrow, so no play time yet. Stoked to get home tomorrow!!! Quick question before I jump right into OCing it: would it be smart to start by OCing the mem, repeating Time Spy until I see scores decline, then back off on the memory and start OCing the core?


----------



## motivman

2nd 4090 FE, better than the first, but I lost the silicon lottery on the core. The core crashes at anything above 3015MHz, and it can do +1650 in Afterburner on the memory before visible artifacts. At this point I will hold off on getting a waterblock for the FE and try, for the 4th and last time, for either a Strix or a Gigabyte Gaming OC...


----------



## LunaP

8472 said:


> Oh boy...
> 
> 
> __
> https://www.reddit.com/r/nvidia/comments/yqbt95


Just a small shroud change










Looks like +1800 is my limit on MEM before Time Spy randomly just closes (no artifacts), unless it's related to something else. Would it auto-stop if the card gets too hot? I saw it jump to 71C for a split moment, so I'm assuming I don't have the block tightened all the way; it never even went that high on air, especially compared to everyone else with the same block here lol. 23-25C idle otherwise.


----------



## Panchovix

newls1 said:


> My Giga Gaming OC 4090 made it to the house today but im stuck at work till tomorrow, so no play time yet but stoked to go home tomorrow!!! Quick question before I jump right into OCing it.. would it be smart for me to start OCing the mem and use Time Spy repeating until I see scores decline, then back off on memory and then start OCing the core?


IMO Speed Way (if you bought it) is pretty good for testing mem only, since it's a very memory-intensive benchmark (or you can also try interactive mode).

If not, Port Royal seems to stress the VRAM more than Time Spy (in my case at least).

The method you describe is good though; it's the way to test the VRAM IMO.


----------



## Krzych04650

Yeah, not much scaling past 3000.

Ryse
2770/21Gbps - 91.5
3000/23.5Gbps - 99.6
3105/23.5Gbps - 101.7

SOTTR
2770/21Gbps - 176
3000/23.5Gbps - 187
3105/23.5Gbps - 189

AC Origins

2770/21Gbps - 115
3000/23.5Gbps - 125
3105/23.5Gbps - 125
3180/23.5Gbps - 126

This is at 5120x2160 ultrawide, Ryse is at 3840x1600 2x2 so 7680x3200.
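Turning those numbers into percentage gains over the stock 2770/21Gbps runs (pure arithmetic on the results above) makes the point concrete: the last ~105MHz buys only a couple of percent at most.

```python
# Percent gain over stock (2770/21Gbps) for each game, from the numbers above.
stock = {"Ryse": 91.5, "SOTTR": 176, "AC Origins": 115}
oc = {
    "Ryse":       {"3000/23.5": 99.6, "3105/23.5": 101.7},
    "SOTTR":      {"3000/23.5": 187,  "3105/23.5": 189},
    "AC Origins": {"3000/23.5": 125,  "3105/23.5": 125, "3180/23.5": 126},
}
for game, runs in oc.items():
    for cfg, fps in runs.items():
        gain = (fps / stock[game] - 1) * 100
        print(f"{game:10} {cfg}: +{gain:.1f}%")
```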


----------



## GRABibus

Panchovix said:


> IMO SpeedWay (if you bought it) it's pretty good to test mem only, since it's very memory intensive benchmark (or you can also try on interactive mode)
> 
> If not, Port Royal seems to stress more the VRAM than TimeSpy (on my case at least)
> 
> The method that you say though is good, it's the way to test the VRAM imo


Heaven benchmark is also perfect to see artifacts.


----------



## GRABibus

GRABibus said:


> Tested the Cablemod on my Giga Gaming OC on stock air cooler.
> I also upgraded from Windows 10 to windows 11 this week end.
> So fresh install.
> 
> => +240MHz is my max "Benchable" offset on PR (whatever dongle coming with the card or Cablemod).
> => +285MHz is my max "Benchable" offset on Superposition 8K.
> 
> 
> Here is my best PR run this evening @ 21°C :
> View attachment 2581842
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 929 in Port Royal
> 
> 
> AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Max power draw = 578W.
> 
> I noticed an improvment of the score by 70 points, which is maybe due to W11 instead of W10.
> 
> My gaming OC is not a Golden as the one's from @Azazil1190, but not too bad sample I would say
> 
> I will receive the Corsair cable on next week.


By the way, I played Spider-Man at 4K.
45% to 60% GPU usage in some heavy RT + DLSS scenes... soooo CPU-bound!

Time to upgrade my 5950X, most probably with coming 7000 X3D.


----------



## Azazil1190

GRABibus said:


> Tested the Cablemod on my Giga Gaming OC on stock air cooler.
> I also upgraded from Windows 10 to windows 11 this week end.
> So fresh install.
> 
> => +240MHz is my max "Benchable" offset on PR (whatever dongle coming with the card or Cablemod).
> => +285MHz is my max "Benchable" offset on Superposition 8K.
> 
> 
> Here is my best PR run this evening @ 21°C :
> View attachment 2581842
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 929 in Port Royal
> 
> 
> AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Max power draw = 578W.
> 
> I noticed an improvment of the score by 70 points, which is maybe due to W11 instead of W10.
> 
> My gaming OC is not a Golden as the one's from @Azazil1190, but not too bad sample I would say
> 
> I will receive the Corsair cable on next week.


Haha, mine isn't either... but thanks.
Your sample is great; I wish mine were like yours, at least in PR.
Later it's time to give PR a try, but I don't expect too much from my 10900K.
I need to put the Giga on my 5950X and the TUF on the 10900K, but it doesn't fit in that case.


*** The Giga goes in the Corsair 540  the TUF is impossible to fit. I'd need a Lian Li Dynamic at least.


----------



## Azazil1190

GRABibus said:


> Heaven benchmark is also perfect to see artifacts.


Did you try superposition 8k?


----------



## neteng101

GRABibus said:


> Heaven benchmark is also perfect to see artifacts.


Nothing beats Heaven for simple GPU OC validation. The newer memory error correction can't be seen there, though; you have to test and retest the memory OC in increments with 3DMark instead.
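The increment-and-retest loop is mechanical enough to sketch. This is illustrative only: `run_bench` stands in for launching a real 3DMark run and reading back the score, and the toy scoring model below (peaking at +1500) is made up for the example, not measured.

```python
# Sketch of the increment-and-bench method for GDDR6X: past a point the error
# correction silently retries instead of artifacting, so the score peaks and
# then falls. Step the offset up until the score stops improving.
def find_mem_offset(run_bench, step=250, start=0, stop=2000):
    best_offset, best_score = start, run_bench(start)
    for offset in range(start + step, stop + step, step):
        score = run_bench(offset)
        if score <= best_score:  # ECC eating the gain: keep the previous step
            break
        best_offset, best_score = offset, score
    return best_offset, best_score

# Toy stand-in: score climbs 2 pts/MHz until +1500, then ECC costs 5 pts/MHz.
def fake_port_royal(offset):
    return 27000 + 2 * offset - 5 * max(0, offset - 1500)

best, score = find_mem_offset(fake_port_royal)
print(best, score)  # 1500 30000
```

In practice you would run each step two or three times, since run-to-run variance can mask a small ECC penalty.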


----------



## Panchovix

Krzych04650 said:


> Yeah, not much scaling past 3000.
> 
> Ryse
> 2770/21Gbps - 91,5
> 3000/23.5Gbps - 99,6
> 3105/23.5Gbps - 101,7
> 
> SOTTR
> 2770/21Gbps - 176
> 3000/23.5Gbps - 187
> 3105/23.5Gbps - 189
> 
> AC Origins
> 
> 2770/21Gbps - 115
> 3000/23.5Gbps - 125
> 3105/23.5Gbps - 125
> 3180/23.5GBps - 126
> 
> This is at 5120x2160 ultrawide, Ryse is at 3840x1600 2x2 so 7680x3200.
> View attachment 2581851


Pretty nice info! I will post a new post (lol) today or tomorrow, basically showing performance difference in 7 synthetic benchs and 5 games between stock, undervolt, overclock, both, etc 

It will have a lot of info, but _spoiler_, it is as you mention most of the time.


----------



## GRABibus

Azazil1190 said:


> Did you try superposition 8k?












In Superposition 8K, I have a benchable core offset that is +45MHz higher than in PR.

So, if you can bench +315MHz in Superposition 8K, you should be able to bench PR at a +285MHz offset.

Please report


----------



## GRABibus

neteng101 said:


> Nothing beats Heaven for simple GPU OC validation. The newer memory error correction can't be seen though, have to test and retest memory OC in increments with 3DMark instead.


Yes.
I start losing score in PR at +1650MHz.


----------



## Azazil1190

GRABibus said:


> View attachment 2581861
> 
> 
> I have a benchable Core offset which is +45MHz higher than with PR.
> 
> So, if you can bench +315MHz in Superposition 8K, you should be able to bench PR at +285MHz offset.
> 
> Please report


2850, as I guessed, I think too, but we will see later 
Your SP 8K is close enough to my TUF's, but my TUF in PR isn't so good.


----------



## Pollomir

stahlhart said:


> I'm not even sure what "socery" is.


You mean "fotball" 😏?


----------



## GRABibus

Azazil1190 said:


> As i guess 2850 i think too but we will see later
> Your sp8k its close enough with my tuf.But my tuf at pr isnt so good.
> View attachment 2581863


What were your OC settings ?
I didn’t reboot before the test and didn’t close any background tasks.
So I can easily go beyond 15000 with +285MHz on core on my Gaming OC.


----------



## Azazil1190

GRABibus said:


> What were your OC settings ?
> I didn’t reboot before the test and didn’t close any background tasks.
> So I can easily go beyond 15000 with +285MHz on core on my Gaming OC.


On my TUF I'm on the Strix OC BIOS.
+225 core, +1400 memory; I can't go higher on memory without losing performance. Win 11, and I only close the Armoury and Corsair tasks.
High performance in the NV panel.


----------



## newls1

Panchovix said:


> IMO SpeedWay (if you bought it) it's pretty good to test mem only, since it's very memory intensive benchmark (or you can also try on interactive mode)
> 
> If not, Port Royal seems to stress more the VRAM than TimeSpy (on my case at least)
> 
> The method that you say though is good, it's the way to test the VRAM imo


Yes, I bought Speed Way the day I sold my 3090 Ti a few weeks back. I'll use it, thank you!


----------



## TSportM

Benni231990 said:


> You have 600 watt instead of 520 like all others like Strix or TUF etc etc


Yes, I'm aware, but I mean are there overclocking or performance improvements?

Cheers


----------



## Benni231990

No, only the higher power limit; the rest is the same as the 520W BIOS.


----------



## arvinz

Having a hard time deciding which one to keep. The FE below just arrived; I managed to snag one from a quick Best Buy drop. One of these will be replacing my Strix 3090 + Optimus block. The TUF or FE will be going under water as soon as Optimus releases their block and I get my hands on it. Which one would you say is worth keeping?


----------



## Nizzen

arvinz said:


> Having a hard time deciding which one to keep. The FE below just arrived, managed to snag one from a quick Best Buy drop. I will be replacing one of these with my Strix 3090 + Optimus block. The TUF or FE will be going under water as as soon Optimus releases their block and I get my hands on it. Which one would you say is worth keeping?
> 
> View attachment 2581869


Tuf life 

Keep the best binned one


----------



## motivman

arvinz said:


> Having a hard time deciding which one to keep. The FE below just arrived, managed to snag one from a quick Best Buy drop. I will be replacing one of these with my Strix 3090 + Optimus block. The TUF or FE will be going under water as as soon Optimus releases their block and I get my hands on it. Which one would you say is worth keeping?
> 
> View attachment 2581869


Yup, that would be my recommendation too: keep the one that overclocks the best, return the other. I would say the FE looks like a premium product compared to the MSI Trio I had prior.


----------



## Tideman

Noticed my Gaming OC rattles when fan speed is at 90% and above..

It is not the fans because when I gently squeeze the card, the rattle stops.

I'm not using the included support bracket because it doesn't fit my motherboard, so not sure if that has something to do with it. I am using my mb's gpu support wedge, which just props it up from underneath.

Any other Gaming OC owners experiencing this? Really disappointed with this.


----------



## arvinz

motivman said:


> Yup, that would be my recommendation too. keep the one that overclocks the best, return the other one. I would say the FE looks like a premium product compared to my MSI trio I had prior.


I'm running a full custom watercooling rig, so unfortunately I can't just pop these in and try them out to see which one overclocks best. I suppose that's one of the downsides of a custom loop! I could have set up my loop with quick disconnects, but that's just fugly.

So without being able to test them, but knowing they are going to be torn down and put under water... I suppose it's a gamble. The TUF has the extra HDMI port; that's the only thing that really separates them at this point.


----------



## doom3crazy

I've got a legit question. Outside of leaderboard-type stuff, is there really any reason for us to be pulling more than 450W from these cards? Like, what's the real-world gain in general gaming, 3-5 more fps? I ask because I'm sitting here debating actually undervolting / power-reducing my card for cooler temps and less power draw. With that said, if someone has a valid argument against this, please let me know; I'm genuinely curious about this.


----------



## neteng101

doom3crazy said:


> Ive got a legit question. Outside of leaderboard type stuff, is there really any reason for us to be pulling more than 450w out of these cards?


There are gains until around 3GHz... I don't see a reason to bump voltage to 1.1V except for benching. You never want to hit a power limit, for better frame-to-frame consistency in game, but you can manage that with an undervolt point too. Overall though, let the card figure it out; games don't pull nearly as much power or generate as much heat.


----------



## Azazil1190

Tideman said:


> Noticed my Gaming OC rattles when fan speed is at 90% and above..
> 
> It is not the fans because when I gently squeeze the card, the rattle stops.
> 
> I'm not using the included support bracket because it doesn't fit my motherboard, so not sure if that has something to do with it. I am using my mb's gpu support wedge, which just props it up from underneath.
> 
> Any other Gaming OC owners experiencing this? Really disappointed with this.


Mine has this too.


----------



## motivman

arvinz said:


> I'm running a full custom watercooling rig so unfortunately I can't just pop these in and try them out to see which one overclocks the best. I suppose that's one of the downsides of a custom loop! I suppose I could've set up my loop with quick disconnects but that's just fugly.
> 
> So without being able to test them, but knowing that they are going to be teared down and put under water...I suppose it's a gamble. The TUF has the extra hdmi slot, that's the only thing that really separates them at this point.


Do you think my loop is fugly? I have it set up with Alphacool quick disconnects, so I can swap my CPU, as well as switch from an air-cooled to a watercooled GPU, in less than 10 minutes. I have enough space to fit the longest air-cooled card with no problems. My original plan was to get a 4090 Strix, so I redid my loop so I can fit an air-cooled Strix whenever I get one, then switch to a GPU water block once I'm 100% sure which card I'll be keeping. Patiently waiting for my CableMod GPU cable; I hate this NVIDIA adapter.


----------



## motivman

doom3crazy said:


> Ive got a legit question. Outside of leaderboard type stuff, is there really any reason for us to be pulling more than 450w out of these cards? Like whats the real world gain in general gaming applications? 3-5 more fps? I ask cause I am sitting here debating on actually under volting / power reducing my card for cooler temps and less power draw. With that said, if someone has a valid argument against this, please let me know. I am generally curious about this.


NO, lol.


----------



## arvinz

motivman said:


> Do you think my loop is fugly? I have it setup with alphacool quick disconnect, so I can swap my CPU, as well as switch from Air cooled to watercooled GPU in less than 10 minutes. I have enough space to fit the longest aircooled card with no problems. My original plan was to get a 4090 strix, so I redid my loop so i can fit an air cooled strix whenever I get one, then switch to a gpu water block once i am 100% sure what card I would be keeping. Patiently waiting for my cablemod gpu cable, hate this nvidia adapter.


Here's mine


----------



## J7SC

...guess what the cat dragged in just now (a day early) - this is the custom one for the Seasonic Prime Platinum PX1300 (another one from Seasonic is on the way as part of a free upgrade, but that one is black. Oddly enough, the cat is black and white...).


----------



## Tideman

Azazil1190 said:


> Mine too have this


That's reassuring; at least it's not just mine then.

I guess I can live with it, seeing as it's only at 90%+ fan speed. The fans themselves are very quiet. I feel like that plastic shroud was a bad move by Gigabyte, especially if they're going to use higher-speed fans than other models...


----------



## yzonker

2nd time recently I've heard the rumor that 3DMark may require ECC to fix the inflated scores. Hope that's coming.



Spoiler: Jay2Cents vid


----------



## Azazil1190

GRABibus said:


> View attachment 2581861
> 
> 
> I have a benchable Core offset which is +45MHz higher than with PR.
> 
> So, if you can bench +315MHz in Superposition 8K, you should be able to bench PR at +285MHz offset.
> 
> Please report


Just now I made a run of PR (the 3DMark servers are having issues); not even close to your score.
+255 / +1600 memory on the 10900K gave me 28093.
What frequency do you get on the voltage curve at 1.1V when you set +255 in AB? Mine is 3090. Probably I don't have a golden sample, or the 10900K doesn't help anymore in synthetics.


----------



## motivman

arvinz said:


> Here's mine
> View attachment 2581883


well in that case, keep the TUF. Your PC is a work of art


----------



## GRABibus

Azazil1190 said:


> Just now i made a run of pr(3d mark servers have issues )not even close to your score .
> +255 +1600 memory on 10900. Gave me 28093
> What frequency you have to voltage curve at 1.1v when you set +255 to ab? Mine is 3090. probably i dont have the gold sample or the 10900k doesn't help any more on synthesis












Maybe I am stupid, but I can't read any frequency at 1.1V here.
The scale of the curve doesn't help.


----------



## alasdairvfr

Azazil1190 said:


> Just now i made a run of pr(3d mark servers have issues )not even close to your score .
> +255 +1600 memory on 10900. Gave me 28093
> What frequency you have to voltage curve at 1.1v when you set +255 to ab? Mine is 3090. probably i dont have the gold sample or the 10900k doesn't help any more on synthesis


I'm pretty sure it's the CPU. In the 3DMark result browser, look up your CPU/GPU combo and see what your position is compared to equal hardware rather than overall; you will find you are in the top 10-20 of the pile.


----------



## GRABibus

At stock, 2895MHz.










So +255MHz => 3150MHz ?


----------



## Azazil1190

GRABibus said:


> View attachment 2581896
> 
> 
> Maybe I am stupid but I can't read any frequency at 1.1V here.
> The scale of the curve doesn' help


You need to extend it.


----------



## Azazil1190

GRABibus said:


> At stock, [email protected]
> 
> View attachment 2581897
> 
> 
> So +255MHz => 3150MHz ?


Mine is 2850.
So you are at 3135 with +255.
So you have a better sample.


----------



## GRABibus

Azazil1190 said:


> You need to extended


Sorry, how do you extend the scale?


----------



## GRABibus

Azazil1190 said:


> Mine is 2850
> So you are 3135 with +255
> So you have better sample


Did you force ReBAR with Inspector?


----------



## Azazil1190

GRABibus said:


> Did you force Rebar with inspector ?


No, how do we do that?


----------



## mattskiiau

GRABibus said:


> sorry, how do you extend scale ?.....


Go to MSIAfterburner.cfg and adjust both VFCurveEditorMaxFrequency and VFCurveEditorMaxVoltage.
The .cfg can be found in the main directory where the .exe is.
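For anyone else hunting for the lines, they look something like this. The key names are from this post; the 4000 value is the thread's "3000 → 4000" suggestion, and the voltage number is just a guess for illustration. Close Afterburner before editing so it doesn't overwrite the file on exit:

```ini
; MSIAfterburner.cfg, in the same folder as MSIAfterburner.exe
VFCurveEditorMaxFrequency = 4000   ; thread reports the stock value is 3000
VFCurveEditorMaxVoltage   = 1250   ; raise likewise if the 1.1V point is clipped
```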


----------



## GRABibus

You have to use NVIDIA Inspector v2.3.0.13.

Then choose "3DMark Port Royal DLSS" and follow this procedure :









Here's How You Can Enable Resizable BAR Support in Any Game via NVIDIA Inspector


While only sixteen games have been officially whitelisted for Resizable BAR support by NVIDIA, there's a procedure to manually enable any.




wccftech.com





It will force ReBAR in Port Royal. Of course, ReBAR must also be enabled in your BIOS.


----------



## Azazil1190

GRABibus said:


> sorry, how do you extend scale ?.....


Open Notepad as administrator, then open the MSIAfterburner cfg file from Notepad and find this line.
It's 3000; make it 4000, save, and you are ready.


----------



## Azazil1190

GRABibus said:


> You have to use nvidia inspector v2.3.0.13.
> 
> Then choose "3DMark Port Royal DLSS" and follow this procedure :
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here's How You Can Enable Resizable BAR Support in Any Game via NVIDIA Inspector
> 
> 
> While only sixteen games have been officially whitelisted for Resizable BAR support by NVIDIA, there's a procedure to manually enable any.
> 
> 
> 
> 
> wccftech.com
> 
> 
> 
> 
> 
> It will force Rebar in Port Royal. Of course, Rebar must also be enabled in your Bios.


Thanks, I'm gonna try it tomorrow 😉


----------



## Azazil1190

Another thing I noticed today: if I OC my CPU too high, then I can't OC my GPU as high.
This card must be named Christine 😂


----------



## GraphicsWhore

Is the stock situation not as good as we expected, or am I being unrealistic? Most of the models, according to nowinstock.net, have not been available since basically launch. No urgency to upgrade, so I'll wait however long I need, but I figured it'd be a bit better.


----------



## J7SC

GraphicsWhore said:


> Is the stock situation not as great as we expected or am I being unrealistic? Most of the models according to nowinstock.net have not been available since basically launch. No urgency to upgrade so I'll wait however long I need but figured it'd be a bit better.


...checked the chain where I got mine, as well as two national chains (in Canada), including 'sold by' Newegg.ca - all sold out other than 1x MSI Supreme Liquid X (in Saskatoon, Saskatchewan)


----------



## KingEngineRevUp

Got my card today, dropped on the 5800X3D also!

Stock vs. +1700 on memory









Result







www.3dmark.com





Applied +1700 on the memory, then +1750 and then instant crash at +1800.

Seems a little bit above average.

Will test the core and do more thorough testing later. This card is a beauty! I don't even know if I want to WB it, but it would be a waste of all my water components.

Edit: Looks like +195 Core and +1500 Mhz is the best this card can do









I scored 27 926 in Port Royal


AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





Edit: I scored 27 072 in Port Royal.

+1500MHz is the real deal; the memory error correction at +1700MHz is obvious.









Result







www.3dmark.com





This gen, it's all about OCing the memory. I see what you guys are saying now.


----------



## TrxR

I don't know if this is the appropriate place to ask since I am new here, and it's urgent, but here goes:
I am going to buy a 4090 in a few hours and I can choose between the Inno3D X3 OC, Gainward Phantom, and Inno3D X3 iChill. They all cost the same here and are the only ones available.

I was planning on buying the Phantom, but people complain about weird fan noise. What would you recommend?


----------



## motivman

TrxR said:


> I don't know if this is appropriate place to ask since I am new here and it's urgent, but here:
> I am going to buy 4090 gpu in few hours and I can choose between Inno3d X3 OC, Gainward Phantom and Inno3d X3 iChill. They all cost the same here and only ones that are available.
> 
> I was planning on buying phantom but people complain about weird fan noise. What would you recommend me?


Get the Inno3D X3 iChill and flash the 600W BIOS to it.


----------



## changboy

Got my block today; will install it Friday, I think.


----------



## QSS-5

I was only able to get +185 (2920MHz) on the core and +900 on the memory at 1.05V (stock); if I push further it starts artifacting. Got a 4090 Trio (non-X).

Could a BIOS flash help here and increase the voltage, or am I doomed?


----------



## N19htmare666

GraphicsWhore said:


> Is the stock situation not as great as we expected or am I being unrealistic? Most of the models according to nowinstock.net have not been available since basically launch. No urgency to upgrade so I'll wait however long I need but figured it'd be a bit better.


It's worse in UK.


----------



## Panchovix

New post comparing various presets (stock, UV, UV + OC, OC, PL Limit) on the 4090.
If you like tables and percentages, there is a ton inside lol









RTX 4090 Undervolt, UV + OC, OC and PL Limit on 7...


Hi there guys, as always I do with my newer GPUs, like of the 3060 Ti, here, which was a 3060Ti Gaming OC Pro with 270W max TDP, or the 3080 here, which was shunt modded (480W). Now I did the same, but with the RTX 4090 ASUS TUF non-OC, which by default comes with 450W Power limit and 600W max...




www.overclock.net





With a below-average chip (+1000 VRAM, 3030 core), I get ~7% more than stock from an overclock in games; good news if you have an above-average chip, as you will be in the range of 10% or more improvement.


----------



## N19htmare666

QSS-5 said:


> I was only able to get +185 (2920Mhz) on core and +900 on memory, 1.05v (stock) if I push further it starts artifacting . Got a 4090 Trio (non-x).
> 
> Could a BIOS flash help here and increase voltage or am I doomed?


That's lower than average. If it were me, I would send it back if it can't reach +1500 on memory and 3000+ MHz on the core. Up your voltage and see if you get over 3000 MHz on the core.


----------



## yzonker

bmagnien said:


> I'd propose running DLAA as the AA solution, as it takes advantage of the NV architecture and is arguably better visually than MSAA or TAA. That being said, all other settings are the same as yours, no DLSS, all max, 4k. I'd be interested in others positing their screenshots if they have the game (with update), especially to see how it performs on some other CPUs:
> View attachment 2581748


Odd result. Why is your GPU-limited percentage so low? Are you that CPU-limited?


----------



## BigMack70

Man this card is just freaking amazing... just fired up Doom Eternal, and this thing can actually play it maxed out with ray tracing at 240fps on my neo G9 5120x1440... absolutely incredible experience


----------



## KingEngineRevUp

Undervolt (0.950V @ 2745 MHz) beating stock: running lower on the core, but +1500 MHz on the memory alone beats stock.









Result

www.3dmark.com





Drops temperatures by 6°C as well. Max power during the run was 366W.


----------



## yzonker

This is what I got anyway, but I'm confused as to why the GPU-limited % is so different yet the score is about the same as @bmagnien's. Feels like settings must be a little different or something.


----------



## N19htmare666

cheddardonkey said:


> Forcing the fans up will drop temps by roughly 4°C with the Suprim BIOS. With the Aorus BIOS, the fans stay running but at low speed for most of the test duration.
> 
> After some in-game testing, I am also seeing the Aorus BIOS perform better in temps (similar to Superposition) and slightly better in FPS (+5-7) while holding a slightly lower clock. I'd like to put together a better analysis than I've managed so far, but it seems a mod to the Aorus BIOS allowing more power/voltage control would be the place to start.


Please can you post the original Waterforce BIOS here? It would be interesting to see if this improvement can be replicated on the MSI Liquid cards.

Also, curious how it went in the end. Any updates?


----------



## EarlZ

How does the Zotac AMP Extreme AIRO 4090 stack up against other brands like the Suprim X, Strix, or Aorus Master? Do the Zotac cards boost just as much as the other brands?


----------



## Benni231990

GRABibus said:


> It will force Rebar in Port Royal. Of course, Rebar must also be enabled in your Bios.



I have enabled ReBAR in the global setting in NVIDIA Inspector for all games. Is this better or not?


----------



## bmagnien

yzonker said:


> This is what I got anyway, but confused as to why the GPU limited % is so much different but about the same score as @bmagnien . Feels like settings must be a little different or something.
> 
> View attachment 2581910


Yeah, I didn't even notice the low GPU-limited %... that's very odd. But our settings and scores are identical. I'll run it again and see if it was just an anomaly.


----------



## yzonker

bmagnien said:


> Yah I didn’t even notice the low gpu limited %…that’s very odd. But our settings and scores are identical. I’ll run it again and see if that was just an anomaly.


There are some settings in the video menu that don't show up on that benchmark screen. All the CAM FOVs are at default; didn't screenshot those. Here are mine:


----------



## bmagnien

yzonker said:


> There are some settings in the video menu that don't show up in that benchmark screen. All the CAM FOVs are default. Didn't screenshot those. Here are mine,
> 
> View attachment 2581917
> 
> 
> View attachment 2581918


Just reset everything to default and manually re-entered all settings to max, leaving camera angles/FOVs at default. GPU-limited % is even lower. I have no idea, but it's basically the same score. I'm running the Xbox Game Pass PC version, if that matters.


----------



## bmagnien

@yzonker I think it's actually the CPU render that's doing it. Mine is just higher than my avg FPS, while yours is significantly higher with the 7950, so I am getting CPU-bound much more than you, making my GPU-bound % lower. Look at your max CPU render compared to mine; it's definitely that.


----------



## bmagnien

x


----------



## yzonker

bmagnien said:


> @yzonker i think its actually the CPU render that's doing it. Mine is just higher than my avg fps, while yours is signficantly higher with the 7950, so I am getting cpu bound much more than you, making my gpu bound % lower. looks at your max cpu render compared to mine - it's definitely that.


Yeah, actually maybe we're both just GPU-bound in reality. That's the only thing that makes sense given the score is the same. I've got a 13900K BTW, not a 7950X.


----------



## long2905

Krzych04650 said:


> Flashed Inno3D X3 OC 450W to Gigabyte Gaming OC 600W. It was semi-successful. The card still works properly, but the actual power limit seems to be 550W, not 600, it starts throttling at 550.
> 
> Too bad, the core on this card looks great. It can do almost 3200 at 1100mV if not power limited and 3000 is stable at 1025mV. Memory rolls over past 23.5 though.
> 
> 3120 at 1075mV seems to be optimal for 550W. Not much of a gain from 3000 though. Stabilizing 3000 and overclocking the memory brings around 8-9% vs stock ~2770, but 3100 is only 1% on top of 3000.


Did you move the power slider to max? Can you add effective clock monitoring as well? That way you can see the actual clock being used instead of the set clock. Be careful with the adapter if you are pushing 600W. What are the temps like with the stock cooler?


----------



## KingEngineRevUp

arvinz said:


> Having a hard time deciding which one to keep. The FE below just arrived, managed to snag one from a quick Best Buy drop. I will be replacing one of these with my Strix 3090 + Optimus block. The TUF or FE will be going under water as soon Optimus releases their block and I get my hands on it. Which one would you say is worth keeping?
> 
> View attachment 2581869


Well... a decent portion of the cards that have melted are TUF cards, and not one FE card has been reported. If it's a board design flaw, the FE doesn't seem to have it.


https://www.reddit.com/r/nvidia/comments/ydh1mh


----------



## Merkor

Do you guys think an MSI Trio 4090 will fit in a case with 165mm cooler clearance with the NVIDIA adapter?

Lian Li published this:





lian-li.com





According to this, an O11D with 158mm will not fit, but the EVO version with 167mm should work. So mine, with 165mm, is not far off that…


----------



## Glottis

KingEngineRevUp said:


> Well... A decent portion of cards that melted are TUF cards and not one FE card has been reported. If it's a board design flaw, the FE doesn't seem to have it.
> 
> 
> __
> https://www.reddit.com/r/nvidia/comments/ydh1mh


It's just a popularity contest. The best-selling cards and brands dominate the list simply because the most people have them. As for the FE, it's only sold in a handful of countries and isn't even available in most parts of the world. Way too early to claim that the FE is immune (at this point it looks more like wishful thinking).


----------



## KingEngineRevUp

Glottis said:


> It's just a popularity contest. Most selling cards and brands are dominating the list simply because most people have them. As for FE, it's only sold in a handful of countries and not even available in most parts of the world. Way too early to claim that FE is immune (at this points looks more like wishful thinking).


Even so, at least one FE should have burned up by now, statistically speaking.

I know PNY and Zotac have sold plenty of cards as well. If you want to argue they don't sell as much as the other brands listed, then lump all three together: at least one of them should have failed by now, statistically speaking.
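
The "at least one should have failed by now" argument can be put in rough numbers. A minimal sketch, assuming a per-card failure rate and install base that are purely illustrative (neither number comes from real sales or RMA data):

```python
# Probability of seeing ZERO melted connectors among n_cards if each
# card independently fails with probability p_fail.
# Both inputs are illustrative assumptions, not real figures.
p_fail = 0.001   # assumed per-card failure rate (0.1%)
n_cards = 2000   # assumed number of FE (or PNY+Zotac+FE) cards in the wild

p_zero_failures = (1 - p_fail) ** n_cards
print(round(p_zero_failures, 3))  # 0.135
```

So under these made-up inputs, zero reported failures would only happen about 14% of the time, which is why the silence looks meaningful; but with a smaller install base or lower failure rate, zero reports stops being surprising.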


----------



## TSportM

Hello,

is there a solution for the Suprim X Liquid 600W BIOS fan curves?

With this BIOS the fans only go to 2500 RPM (the max on my Suprim X is 3500 RPM). Any hack to fix this?

Cheers


----------



## Benni231990

No, you can't fix this; the Liquid card has the 240mm radiator, so its BIOS fan curve is built around that.


----------



## dk_mic

TSportM said:


> Hello
> 
> is there a solution for the suprimx liquide 600w bios fan curves ?
> 
> with this bios it only goes To 2500rpm (max on my surprumx is 3500rpm) any hack to fix this ?
> 
> cheers


waterblock 👍

Well, I am also messing with that BIOS on a Trio, and there is no way to get the original fan speed back. I have tried FanControl and Afterburner, no luck. One of the two fan controllers pushes max RPM, the other only 2500.

Considering performance per watt, the gains are a joke anyway, except for leaderboard benching.


----------



## doom3crazy

NOOB QUESTION: It's been many years since I've tinkered with BIOS flashing and such. Can the 4090 cards all be cross-flashed, or is that not a thing? Like, can an MSI card be flashed with the ASUS Strix BIOS?


----------



## N19htmare666

What about switching the fan header to a motherboard header?


----------



## Benni231990

Holy ****, forcing ReBAR is an absolute game changer.

No forced ReBAR




with forced ReBAR


----------



## N19htmare666

Benni231990 said:


> Holy **** Force ReBar is a absolut game changer
> 
> No Force reBar
> View attachment 2581943
> 
> 
> with Fore reBar
> View attachment 2581944


Yep, 1% lows are much more important than averages.


----------



## dk_mic

N19htmare666 said:


> What about switching the fan header to a motherboard header?


you'd need an adapter cable like this








4 Pin PWM Fan Connector Female to 4 Pin Mini GPU Fan Connector Male


Buy 4 Pin PWM Fan Connector Female to 4 Pin Mini GPU Fan Connector Male for $9.99 with Free Shipping Worldwide (In Stock)




www.moddiy.com




And I don't see the point; 2500 RPM is already way too noisy in my opinion.
If you want to run that BIOS on air, I'd replace the GPU fans with some proper fans like T30s. If you're really chasing 3DMark scores, you'd watercool anyway.



Benni231990 said:


> Holy **** Force ReBar is a absolut game changer


It depends on the game and resolution; try searching for some Ampere benchmarks...








NVIDIA Resizable BAR Performance Revisited


NVIDIA Resizable BAR Performance Revisited – 9 supported games benchmarked, using an RTX 3080, a Z690 motherboard, and an i9-12900K CPU.




babeltechreviews.com


----------



## Benni231990

But I don't understand: when I force ReBAR in the global settings I get higher FPS, but not as high as when I force ReBAR for a specific game.

Why?


----------



## Nd4spdvn

I hear (not from reliable sources) that there might be an updated 600W vBIOS for the MSI Suprim X (non-Liquid). Does anyone here know or has anyone heard anything about this? And most importantly, how and where would one get it?


----------



## N19htmare666

dk_mic said:


> you'd need an adapter cable like this
> 
> 
> 
> 
> 
> 
> 
> 
> 4 Pin PWM Fan Connector Female to 4 Pin Mini GPU Fan Connector Male
> 
> 
> Buy 4 Pin PWM Fan Connector Female to 4 Pin Mini GPU Fan Connector Male for $9.99 with Free Shipping Worldwide (In Stock)
> 
> 
> 
> 
> www.moddiy.com
> 
> 
> 
> 
> And i don't see the point, 2500 RPM is already way too noisy in my opinion.
> If you want to run that bios on air, I'd replace the GPU fans with some proper fans like T30s. If you're really after chasing 3DMark, then you'd watercool anyways.
> 
> 
> 
> It depends on the game and resolution, try to search for some ampere benchmarks...
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA Resizable BAR Performance Revisited
> 
> 
> NVIDIA Resizable BAR Performance Revisited – 9 supported games benchmarked, using an RTX 3080, a Z690 motherboard, and an i9-12900K CPU.
> 
> 
> 
> 
> babeltechreviews.com


Is it the AIO fans or the one on the GPU that has the slow-fan issue?


----------



## Benni231990

Nd4spdvn said:


> I hear (not from reliable sources) that there might be some updated 600W vBios for MSI Suprim X (non-liquid). Does anyone here know/heard anything about this? And most importantly how and where one would get them?


UHHHH, this would be huge if MSI updated the BIOS to 600W.


----------



## dk_mic

N19htmare666 said:


> Is it the AIO fans or the one on the GPU that has the slow-fan issue?


I don't know; I suppose it's the fan on the GPU, which spins a bit slower than the rad fans.


----------



## Panchovix

doom3crazy said:


> NOOB QUESTION: It's been many years since ive tinkered with bios and such. Can the 4090 cards all be cross flashed? Or is that not a thing? Like can a MSI card be flashed with the asus strix bios?


Yes, you can. I have a Gigabyte Gaming OC VBIOS on the quiet VBIOS slot of my TUF lol.

No issues so far, though the TUF is 2x HDMI/3x DP; if you do the reverse (the X Trio is 1x HDMI/3x DP) you may lose some ports, but besides that it will work.

Important edit: the 4090 FE is the exception. You cannot flash another VBIOS onto it, or flash a 4090 FE with a VBIOS that isn't the FE one.
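
For reference, cross-flashing is usually done with NVIDIA's nvflash tool. This is a rough sketch of the common procedure, not the poster's exact steps; the .rom file names are placeholders, and you should verify the switches against your nvflash build before flashing anything:

```shell
# Back up the card's current VBIOS before anything else
nvflash64 --save original_backup.rom

# Flash a VBIOS from a different board vendor; -6 overrides the
# PCI subsystem ID mismatch prompt (file name is a placeholder).
# Some cards/versions may additionally require disabling write
# protection first.
nvflash64 -6 gigabyte_gaming_oc_600w.rom
```

As noted above, the FE board is the exception and will not accept third-party images, and a mismatched display-output layout can leave some ports dead.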


----------



## Krzych04650

long2905 said:


> did you move the power slider to max? can you add effective clock for monitoring as well? you can see the actual clock being utilized instead of the set clock. be careful with the adapter if you are pushing 600W? whats the temp like with the stock cooler?


Yeah, of course I did. Tested it more thoroughly in Furmark now: the actual power limit is 560W, and the slider only reacts up to 126% PL; going to 133% should give 600W but does nothing. Maybe I will try a different BIOS.

I did add effective clock monitoring. Will post some more screenshots later. But increasing from 3000 set to 3180 set only brings around an 80 MHz improvement in effective clock.

I have a dedicated Corsair 12VHPWR cable.

Temps are very good at stock, mid-50s to low 60s with very reasonable fan speeds, but pushing the max OC with 1100mV in demanding scenarios reaches around 72°C core with an 89°C hotspot over time, and that's at max fan speed.

I want to go with the Alphacool ES block, but it is out of stock for now.


----------



## bmagnien

Krzych04650 said:


> Yea of course I did. Tested it now more thoroughly in Furmark, actual power limit is 560W and it only reacts up to 126% PL, going to 133% should go 600 but it does nothing. Maybe I will try different BIOS.
> 
> I did add effective clock monitoring. Will post some more screenshots later.
> 
> I have dedicated Corsair 12VHPWR cable.
> 
> Temps are very good at stock, mid 50s to low 60s with very reasonable fan speeds, but pushing max OC with 1100mV in demanding scenarios is going to reach like 72C core with 89C hotspot over time and that's on max fan speed. I want to go with Alphacool ES block but it is out of stock for now.


I don't think it's the BIOS. I've never seen the full 133% PL on my Gigabyte Gaming OC with the stock 600W BIOS, even in Furmark or any other high-wattage app. Haven't seen anyone else hit it either.


----------



## Krzych04650

bmagnien said:


> I don’t think it’s the bios, I’ve never seen the full 133% pl on my gigaoc with stock 600w bios, even in furmark or any other high wattage app. Haven’t seen anyone else hit it either


Interesting. It looks like the actual power limit is 560W, not 600W, then; the card throttles at that point, so it's not that it cannot or doesn't want to pull more. In Furmark I am at 560W with only 980mV or so and very low clocks, around 2600 MHz, so the card would definitely use more if it were allowed. I will try a different BIOS later.
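
One plausible explanation for the ~560W ceiling, assuming (my assumption, not stated in the thread) that the slider percentage is applied to a 450W default board power rather than to the BIOS's 600W cap:

```python
# Quick sanity check on the observed throttle point.
# Assumption: slider % applies to a 450 W default board power.
default_power_w = 450
observed_max_pct = 126    # slider stops responding past ~126%
advertised_max_pct = 133  # 133% is what a 600 W cap would need

print(default_power_w * observed_max_pct / 100)    # 567.0 W, near the ~560 W throttle
print(default_power_w * advertised_max_pct / 100)  # 598.5 W, i.e. the advertised ~600 W
```

126% of 450W lands almost exactly on the observed throttle point, which would fit a BIOS whose last power-limit steps simply aren't honored on this board.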


----------



## KingEngineRevUp

Well, I was on the fence about the EKWB block, but it shipped today, so I'm going to stick with it.


----------



## bmagnien

If anyone has Stray, there's a fully ray-traced mode if you add -dx12 as a launch parameter. It wouldn't go over 25 FPS on my 3090; around 45-60 FPS on the 4090:


----------



## Sheyster

My Cablemod 12VHPWR cable is out for delivery. Almost time to unleash the beast.


----------



## Krzych04650

Neptune 630W BIOS not available yet?


----------



## newls1

Getting this error all of a sudden in the middle of tuning my OC. I did six runs back to back, just increasing memory, and all was going great. Now every time I run a 3DMark benchmark I get the following error... so crazy. What is this and how do I fix it? Even with NO OC, 3DMark now throws this error.


----------



## Sheyster

newls1 said:


> getting this error all of a sudden in the middle of tuning my OC. Did 6 runs back to back just increasing mem and all was going great. now everytime i run and 3dmark benchmark i get the following error... So crazy. What is this and how to i fix this? even with NO OC now, 3dmark throws this error
> 
> View attachment 2581995


I've seen it before, check this:









EnumDisplayDevicesA() failed


An "EnumDisplayDevicesA() failed" error when running 3DMark suggests that Windows 10 cannot properly identify your monitors or that a third-party application has modified how they are set up. To fix this issue, open Windows Device Manager. E...




support.benchmarks.ul.com


----------



## LunaP

Sheyster said:


> My Cablemod 12VHPWR cable is out for delivery. Almost time to unleash the beast.


Did yours also ship from china?


----------



## newls1

Sheyster said:


> I've seen it before, check this:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EnumDisplayDevicesA() failed
> 
> 
> An "EnumDisplayDevicesA() failed" error when running 3DMark suggests that Windows 10 cannot properly identify your monitors or that a third-party application has modified how they are set up. To fix this issue, open Windows Device Manager. E...
> 
> 
> 
> 
> support.benchmarks.ul.com


I read your link, then installed the specific monitor drivers and rebooted: same thing. Uninstalled 3DMark, reinstalled it, and still the same thing. I'm puzzled. Games play great, so there are no issues with the OC or the card, but this is so weird. 3DMark has never done this before.


----------



## Sheyster

LunaP said:


> Did yours also ship from china?


Yes it did!

Apparently there is a new driver release today:

GeForce Game Ready Driver | 526.86 | Windows 10 64-bit, Windows 11 | NVIDIA


----------



## Sheyster

newls1 said:


> I read your link, then installed specific monitor drivers, rebooted.. same thing. unistalled 3dmark, reinstalled it and still same thing. im puzzled. Games play great so no issues with OC or card, but this is so weird. Never had 3dmark do this


Sorry to hear that. I suggest also checking reddit and search here as well, for possible reasons.


----------



## newls1

Sheyster said:


> Sorry to hear that. I suggest also checking reddit and search here as well, for possible reasons.


This is so weird!


----------



## Krzych04650

Some 2080 Ti vs 4090 and 2080 Ti SLI vs 4090 benchmarks. Both cards are overclocked: the 2080 Ti at 2160/8000 and 380W as a single card, 2025/8000 with two cards, and the 4090 at 3000/11750 and 600W.

All games at max settings, RT games with max RT and DLSS Quality, older games with MSAAx4+SGSSAAx4. 5120x2160 21:9.









Essentially, it's very hard to go below 100 FPS even with balls-to-the-wall settings at a damn 5120x2160 resolution. I can still find some cases that drop into the 70s, so I think one more generation is needed for absolutely unbeatable performance in all scenarios, but these numbers are just insane.


----------



## LunaP

Krzych04650 said:


> Some 2080 Ti vs 4090 and 2080 Ti SLI vs 4090 benchmarks. Both cards overclocked, 2080 Ti is 2160/8000 380W for single card, 2025/8000 for two cards, and 4090 3000/11750 600W.
> 
> All games at max settings, RT games with max RT and DLSS Quality, older games with MSAAx4+SGSSAAx4. 5120x2160 21:9.
> View attachment 2582008
> 
> 
> Essentially, very hard to go below 100FPS even with balls to the wall settings at damn 5120x2160 resolution. I can still find some cases that drop into 70s, so I think one more generation is needed for absolute unbeatable performance in all scenarios, but those numbers are just absolutely insane.


Coming from OC'd 2080 Tis in SLI as well, I can vouch for this. I skipped Ampere for this reason lol, and I'm glad I waited. From here on I only have to upgrade one card instead of two (even if the price is about the same lmao).


----------



## Vasoka

Wrong thread.


----------



## jootn2kx

Just my 2 cents: I received my Gainward 4090 (non-GS version), which I bought at MSRP.
I did some tests for a couple of hours; mine does 2970 MHz with the default BIOS and the power limit at 100%.
I get a stable +250 core and +1300 RAM easily, and the card pulls around 400W.

I feel like there is more room to overclock with higher power limits (flashing another BIOS), but I can't be bothered to do that right now.
Side note: there is slight coil whine, but it doesn't bother me at all.


----------



## Panchovix

Krzych04650 said:


> Some 2080 Ti vs 4090 and 2080 Ti SLI vs 4090 benchmarks. Both cards overclocked, 2080 Ti is 2160/8000 380W for single card, 2025/8000 for two cards, and 4090 3000/11750 600W.
> 
> All games at max settings, RT games with max RT and DLSS Quality, older games with MSAAx4+SGSSAAx4. 5120x2160 21:9.
> View attachment 2582008
> 
> 
> Essentially, very hard to go below 100FPS even with balls to the wall settings at damn 5120x2160 resolution. I can still find some cases that drop into 70s, so I think one more generation is needed for absolute unbeatable performance in all scenarios, but those numbers are just absolutely insane.


Pretty nice overclockers you had there in the 2080 Tis! By the way, about the percentages: when you say 153% (on LOTRO), the 4090 is actually 2.53x faster than the 2080 Ti, right? Since 119*100/47 = 253.2%.


----------



## Krzych04650

Panchovix said:


> Pretty nice overclockers you had there on the 2080Tis! By the way, on the percentages, when you mean 153% (on LOTTRO), it is actually the 4090 being 2.53x times faster than the 2080Ti right? Since 119*100/47 = 253.2%


Yes, +153% means 253%, and so on. So from this testing the 4090 is 2.76x the 2080 Ti on average. That is OC vs OC, where overclocking brings around 9% extra performance on the 4090 vs the stock 450W, and around 15% on the 2080 Ti vs the stock 250W, so the margins will be a bit smaller than in stock-vs-stock tests.
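
The notation can be double-checked with the LOTRO numbers from the quoted exchange (119 FPS on the 4090 vs 47 FPS on the 2080 Ti):

```python
# Converting the thread's "+153%" gain notation into a speedup multiple.
fps_4090 = 119
fps_2080ti = 47

gain_pct = (fps_4090 - fps_2080ti) / fps_2080ti * 100  # "+153%" style
multiple = fps_4090 / fps_2080ti                       # "2.53x" style

print(round(gain_pct))     # 153
print(round(multiple, 2))  # 2.53
```

So a "+X%" gain always corresponds to an (X + 100)/100 multiple, which is why +153% and 2.53x describe the same result.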


----------



## Xavier233

Does anyone know the difference between the CableMod *Sleeved 12VHPWR* that has a 3x8-pin output (to the PSU) and the ones that have a 4x8-pin output?

Not sure why they do 3x or 4x when Corsair's has a 2x8-pin output to the PSU.


----------



## newls1

Xavier233 said:


> Anyone knows the difference between the CableMod *Sleeved 12VHPWR *which has 3x8-pin output (to the PSU) versus the ones that have 4x8-pin output?
> 
> Not sure why they have to do 3x or 4x when Corsair has a 2x8-pin output to the PSU?


I've wondered the same thing too...


----------



## Krzych04650

The 4090 can still be murdered with SGSSAA. LOTRO, 3840x1600, 8xMSAA+8xSGSSAA: 50 FPS and massive power throttling, from the 3000 MHz target down into the 2700s. It would probably pull something like 800W here if allowed. It is not modern RT games that make for the most difficult test.


----------



## LuckyImperial

Xavier233 said:


> Anyone knows the difference between the CableMod *Sleeved 12VHPWR *which has 3x8-pin output (to the PSU) versus the ones that have 4x8-pin output?
> 
> Not sure why they have to do 3x or 4x when Corsair has a 2x8-pin output to the PSU?


I assume it's just to spread the load across more conductors. Also, maybe some PSUs can't do 450W on a single rail (I can't imagine there are many), so the 3rd and 4th 8-pins help accomplish that.


----------



## J7SC

LunaP said:


> coming from 2080ti's OC in SLI as well I can vouch for this, I skipped ampere cuz of this reason lol, and glad I waited. From here on I only have to upgrade 1 card anyways vs 2 (even if the price is about the same lmao )


...true, having to buy only one expensive card instead of two semi-expensive ones does have some advantages, though NVLink SLI (CFR, AFR) with my 2x 2080 Ti setup (now a 'retired work+play' unit serving in the master bedroom to power a 4K60 IPS HDR display) works surprisingly well; there's something about the symmetry of an HEDT platform with dual GPUs that I miss in my later work+play builds.

...now I have to figure out how to shuffle 5 GPUs across 3 motherboards for the best work+play backup combos. I read that Ubuntu recognizes NVLink SLI but doesn't do much with it...


----------



## Xavier233

LuckyImperial said:


> I assume it's just to spread the load out across more conductors. Also, maybe some PSU's can't do 450W on a single rail ( I can't image there's many), so the 3rd and 4th 8pins help accomplish that.


What I would actually like to know is whether the 3x8-pin and 4x8-pin versions can both handle a max of 600 watts. If yes, there is no reason to get the 4x8-pin and make an even bulkier mess than the 3x8-pin already does.


----------



## doubledoubt

Has anyone had any luck flashing another BIOS onto the Suprim X (non-Liquid) to get closer to 600W?


----------



## newls1

newls1 said:


> getting this error all of a sudden in the middle of tuning my OC. Did 6 runs back to back just increasing mem and all was going great. now everytime i run and 3dmark benchmark i get the following error... So crazy. What is this and how to i fix this? even with NO OC now, 3dmark throws this error
> 
> View attachment 2581995





Sheyster said:


> I've seen it before, check this:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EnumDisplayDevicesA() failed
> 
> 
> An "EnumDisplayDevicesA() failed" error when running 3DMark suggests that Windows 10 cannot properly identify your monitors or that a third-party application has modified how they are set up. To fix this issue, open Windows Device Manager. E...
> 
> 
> 
> 
> support.benchmarks.ul.com


Fixed it!!! Google searches led me to a post saying to download the 3DMark "System Info" application, let it remove the SystemInfo files, and then reinstall them with this file. Worked like a charm. Now I'm at +220 core / +1300 memory; does this Port Royal score look right?


----------



## GRABibus

29k passed in PR



















I scored 29 026 in Port Royal


AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## GRABibus

newls1 said:


> Fixed it!!! Google searches led me to a post saying download the 3dmark "System info" application and let it remove the sys info files, and then reinstall them with this file.. worked like a charm. Now im @ +220c 1300M does this score in port royal look right??
> 
> View attachment 2582038


You can't push your memory ?


----------



## Panchovix

newls1 said:


> Fixed it!!! Google searches led me to a post saying download the 3dmark "System info" application and let it remove the sys info files, and then reinstall them with this file.. worked like a charm. Now im @ +220c 1300M does this score in port royal look right??
> 
> View attachment 2582038


Do you have a link to the run?

Also damn, the competition with the 13900K is brutal. With my 5800X, if you get ~27800 or more you get Legendary, and ~27000 is Excellent; seeing 27600 rated just "good" hurts lol.

The best PR run on a 5800X/4090 is ~28400.


----------



## mirkendargen

Xavier233 said:


> What I would actually would like to know is, whether the 3x8-pin or 4x8pin both can handle a max of 600 watts or not. If yes, there is no reason to get the 4x8pin and make a bulky mess more than the 3x8-pin already does


I think it's literally just CableMod upselling more connectors at a higher price because they know people will fall for it based on misconceptions/fear-mongering. 1:1 wires are fine; it all ends up at the same 6 +12V and 6 ground wires in the end, whether there are 2 connectors or 10 on the PSU side.
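
The "same six +12V wires" point can be checked with quick arithmetic: at the full 600W the connector is rated for, the per-conductor current stays under the commonly cited ~9.5A per-pin rating of the 12VHPWR connector:

```python
# Per-conductor current at the full 600 W 12VHPWR load.
power_w = 600
voltage_v = 12.0
wires = 6  # six +12 V conductors in the 12VHPWR connector

total_current_a = power_w / voltage_v     # total current on the +12 V side
per_wire_a = total_current_a / wires      # current each conductor carries

print(total_current_a)       # 50.0
print(round(per_wire_a, 2))  # 8.33
```

Since the current is fixed by the connector's six conductors, splitting the PSU side across 2, 3, or 4 8-pin plugs doesn't change what each wire carries.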


----------



## newls1

GRABibus said:


> You can't push your memory ?


Oh, I'm just starting this OC and working my way up... I just keep seeing all these MUCH HIGHER scores with very similar hardware and can't understand why I'm 1000+ points behind...


----------



## Sheyster

Xavier233 said:


> Anyone knows the difference between the CableMod *Sleeved 12VHPWR *which has 3x8-pin output (to the PSU) versus the ones that have 4x8-pin output?
> 
> Not sure why they have to do 3x or 4x when Corsair has a 2x8-pin output to the PSU?





newls1 said:


> ive wondered the same thing too....





LuckyImperial said:


> I assume it's just to spread the load out across more conductors. Also, maybe some PSU's can't do 450W on a single rail ( I can't image there's many), so the 3rd and 4th 8pins help accomplish that.


Are they advertising the 3x8-pin cable as a 600W cable? If not, the sense pins might be configured for 450W only. FWIW, I ordered the 4x8-pin cable; it's arriving sometime today.


----------



## Panchovix

newls1 said:


> oh im just starting this OC, im working my way up... I just keep seeing all these MUCH HIGHER scores with very similiar hardware and cant understand why im 1000+ points behind...


I get a little more than you with lower clocks (3030 core, +1457 mem): I scored 27 894 in Port Royal, though my CPU is holding me back for sure.

Did you do the usual optimizations (forced ReBAR, clean OS, NVIDIA Control Panel tweaks, etc.)?

If you didn't, you're losing a good number of points versus the higher places on the leaderboard.


----------



## Benni231990

The new driver and the MSI 600W Liquid BIOS are not bad: over 350 points more.


----------



## Panchovix

Benni231990 said:


> The new Driver and the MSI 600watt Liquid Bios is not bad over 350 points more
> 
> View attachment 2582048


Well, you just made me want to go home ASAP to install it and see if it improves my scores lol.

Have you noticed any issues with this latest driver?


----------



## Xavier233

Sheyster said:


> Are they advertising the 3 x 8-pin cable as a 600W cable? If not, the sense pins might be configured for 450W only. FWIW, I ordered the 4 x 8-pin cable, it's arriving sometime today.


That's the thing: CableMod fails to explain (as far as I can see) what the actual difference between the 3x8-pin and 4x8-pin is.

I just noticed that the 4x8-pin is $5 more. Sounds like a money grab to me. My PSU is 1500W, so the GPU will have plenty of room to pull 600W either way.


----------



## Benni231990

Panchovix said:


> Have you noticed any issue in this latest driver?


No issues, nothing; the driver is not bad.


----------



## neteng101

Benni231990 said:


> The new Driver and the MSI 600watt Liquid Bios is not bad over 350 points more


Which card are you on? I think I degraded my GPU with 1.1V bench runs; stability is down 2 clock bins, nowhere near your +220. I'm down to +135 stable from +165 stable. +1500 memory is the max for me too; anything more and scores dip.

Edit: I think I found my problem: my CPU is overheating. I'll have to repaste and remount it and see what's up. The GPU is likely fine after the OC attempts; the CPU was killing stability.


----------



## J7SC

Xavier233 said:


> Thats the thing: CableMode failed to explain (as far as I can see) what is actually the difference between 3x8-pin or 4x8-pin
> 
> I just noticed that the 4x8-pin is $5 more. Sounds like a money-grab to me. My PSU is 1500w, so the GPU will have plenty of room to grab 600w anyway.


In addition to the Cablemod cable that arrived yesterday (going to be a busy weekend, also re. the water-block install etc.), I still have a free 12VHPWR cable coming from Seasonic - but Seasonic had also linked the exact same model at Cablemod for my Seasonic PX1300... Cablemod's configurator (by PSU type) is helpful for picking the right one.


----------



## Benni231990

neteng101 said:


> Which card are you on? I think I degraded my GPU from 1.1V bench runs, stability is down 2 clock bins, nowhere near your +220. I'm down to +135 stable from +165 stable. +1500 memory is the max for me too, anything more and scores will dip.


I have a Suprim non-Liquid card, and these are my 24/7 gaming settings, with no errors or anything.

I could go to +2000 MHz on the RAM, but there I get slight artifacts, so I go down to +1700 for stability and no ECC correction.


----------



## neteng101

Benni231990 said:


> I have a Suprim non Liquid card and this settings are my 24/7 gaming settings with no errors or something


That's the card I wanted, ended up with a Gaming Trio instead.

Edit - just tested the new driver - my PR scores are the same on both versions. The last run I did was yesterday night. Just got home and tried out the latest drivers.


----------



## doubledoubt

GRABibus said:


> 29k past at PR
> 
> View attachment 2582043
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 026 in Port Royal
> 
> 
> AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


That's impressive. This is the Gaming OC, I take it?


----------



## J7SC

...FYI, just removed my 4-into-1 dongle connector for the first time since I got the card almost three weeks ago. I had done a slew of early benchies (up to 575 W per HWiNFO) at 25 °C+ ambient....

It looks pristine to me, though I had made sure to a) avoid any sharp twisting/bending and b) fully insert and 'seat' the connector during the original install.


----------



## GRABibus

doubledoubt said:


> This is Gaming OC ?


Yes


----------



## doubledoubt

Benni231990 said:


> The new Driver and the MSI 600watt Liquid Bios is not bad over 350 points more
> 
> View attachment 2582048


Where did you get the new MSI 600 W Liquid BIOS, if you don't mind me asking? All four of the MSI BIOSes (including the two Liquid ones) available on TechPowerUp are limited to 530 W.


----------



## newls1

Panchovix said:


> I get a little more than you with lower clocks (3030 core, 1457 mem) I scored 27 894 in Port Royal, though my CPU is holding me for sure
> 
> Did you made the usual optimizations? (Like forced ReBar, clean OS, NVIDIA Control panel tweaks, etc)
> 
> If you didn't, you're losing a good amount of points vs the higher places on the benchmark


I have not forced ReBAR… I saw that thread and forgot to do it. I'll work on it tonight once the GF leaves.


----------



## Benni231990

Holy **** the 600 W MSI BIOS is a beast. My card now boosts to 3090 MHz and my RAM is now stable at +1900 MHz, and that's all AIR-COOLED!!!! I need my waterblock for this card; with it, I think 28600+ should be possible.


----------



## GRABibus

Benni231990 said:


> Holy **** the 600watt MSI Bios is a beast my card boost now to 3090MHZ and my ram can now Stable +1900Mhz
> 
> View attachment 2582093


OK... but where is this BIOS?


----------



## newls1

Benni231990 said:


> Holy **** the 600watt MSI Bios is a beast my card boost now to 3090MHZ and my ram can now Stable +1900Mhz and thats all on AIRCOOLED!!!! i need my waterblock for this card with this i think 28600+ should possible
> 
> View attachment 2582093


Can we see a GPU-Z screenshot of the BIOS details tab, please?


----------



## KingEngineRevUp

Benni231990 said:


> Holy **** the 600watt MSI Bios is a beast my card boost now to 3090MHZ and my ram can now Stable +1900Mhz
> 
> View attachment 2582093


Are you sure your memory isn't error-correcting? What if you kept that core and did +1500 on the memory? What's the score then?
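(The score-dip check being suggested here can be sketched in a few lines: GDDR6X error correction silently retries bad transfers, so past some offset the benchmark score falls even though the clock keeps climbing. Sweep offsets, record scores, and keep the peak. The `runs` numbers below are hypothetical illustrations, not results from this thread.)

```python
# Sketch: find the memory offset where the benchmark score peaks.
# A falling score at a *higher* offset is the usual sign that GDDR6X
# error correction has kicked in, even though no artifacts are visible.

def best_memory_offset(runs):
    """runs: list of (offset_mhz, score) pairs from repeated benchmark passes.
    Returns the offset that produced the highest score."""
    return max(runs, key=lambda r: r[1])[0]

# Hypothetical Port Royal scores for illustration only:
runs = [(0, 27400), (500, 27800), (1000, 28100), (1500, 28300), (2000, 28150)]
print(best_memory_offset(runs))  # prints 1500: the +2000 run scored lower, so back off
```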


----------



## Benni231990

doubledoubt said:


> Where did you get the new MSI 600W Liquid BIOS if you don't mind me asking? All the 4 various MSI BIOS (including the 2 Liquid ones) available on TechPowerOp are limited to 530.


Here in the forum; I can't remember on what page, but an owner of the Liquid card uploaded it here.

_Update_: I found the link again: SuprimX4090G.rom


----------



## GRABibus

Benni231990 said:


> Here in the forum i cant remember what page a owner from his liqued card upload it here


Can you post it here in .txt format ?


----------



## newls1

Benni231990 said:


> Here in the forum i cant remember what page a owner from his liqued card upload it here
> View attachment 2582095


Please click the down arrow on the Advanced tab, switch to BIOS, and post that page, please.


----------



## doubledoubt

Found it!









[Official] NVIDIA RTX 4090 Owner's Club


What is your score in 4k optimized? I think somethings up with my Gigabyte Gaming OC? I currently have +120 on the core and +1200 on the memory but my score is waaay lower? DDR5 at 6600mhz and 12900KF at 5.2Ghz Not sure why its scoring low, also noticed that Dying light 2 and Cyberpunk was...




www.overclock.net


----------



## GRABibus

Benni231990 said:


> Here in the forum i cant remember what page a owner from his liqued card upload it here
> 
> _Update_ ich found the link again : SuprimX4090G.rom
> View attachment 2582095
> 
> 
> View attachment 2582097


With which version of nvflash did you flash it?


----------



## GRABibus

GRABibus said:


> With which version of nvflash did you flash it ?


This version?








NVIDIA NVFlash (5.792.0) Download


NVIDIA NVFlash is used to flash the graphics card BIOS on Ampere, Turing, Pascal and all older NVIDIA cards. NVFlash supports BIOS flashing on NVID




www.techpowerup.com


----------



## Benni231990

I used the latest version of nvflash.

SO NOW, FOR EVERYONE, HERE IS THE 600 W MSI BIOS: SuprimX4090G.rom!!! (But with this BIOS your fan curve is **** as ****, because the Liquid has an AIO; so if you want a silent PC, only use it with a waterblock, or live with the noise.)

And here is the result with +1500 MHz on the RAM:

So I think I have no error correction.
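(For anyone following the cross-flash discussion: the procedure being described boils down to a few nvflash commands. This is a sketch, not a recommendation; it assumes `nvflash64` on Windows and the ROM filename posted above, and `-6` is the flag that overrides the subsystem-ID mismatch when flashing a BIOS from a different board. Back up your original BIOS first.)

```shell
# Run from an elevated command prompt in the folder containing nvflash64.exe
# and the downloaded ROM (filename matches the one posted in-thread).
nvflash64 --save backup_original.rom   # dump the card's current BIOS first
nvflash64 --protectoff                 # disable EEPROM write protection
nvflash64 -6 SuprimX4090G.rom          # -6 overrides the PCI subsystem ID check,
                                       # needed when cross-flashing another model's BIOS
# Reboot, then verify the new power limit in GPU-Z before raising the slider.
```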


----------



## KingEngineRevUp

Benni231990 said:


> i used the last version from NVflash
> 
> SO NOW FOR ALL HERE THE 600WATT MSI BIOS : !!!!!! SuprimX4090G.rom!!!!!!!!!!!!!!!!! (But with this Bios your fancurve is *** as *** because the liquid has a AIO so if you want a silence PC only use it with waterblock or live with the Noise )
> 
> and here the result with +1500MHZ Ram
> 
> 
> 
> So i Think i have no Error Corretion


Very nice. It seems there are diminishing returns at some point: going from +0 to +1000 is a nice performance boost; +1000 to +1500 is still nice but starts to diminish.

I'm guessing the memory is a little under-specced to keep up with the core out of the box. Luckily, we get to OC it.


----------



## energie80

Edit


----------



## KingEngineRevUp

energie80 said:


> I will offer free service.


Offer what free service?


----------



## energie80

Edit


----------



## Nizzen

energie80 said:


> Hello everyone, I’m looking for someone for a test. 1440p res monitor needed. I do professional optimizations' service for modern warfare 2 and Warzone. If someone have some spare time for a test I will offer free service. Have a lot of feedback for reference. Pm if interested


Show us your minimum fps, and we will try to beat it.
This forum is free service, 24/7.


----------






## energie80

Edit


----------



## Zero989

Benni231990 said:


> Here in the forum i cant remember what page a owner from his liqued card upload it here
> 
> _Update_ ich found the link again : SuprimX4090G.rom
> View attachment 2582095
> 
> 
> View attachment 2582097


This BIOS isn't optimal for MSI cards IMO, also your PR score should be higher than 28.58k


----------



## Ozz387

Nd4spdvn said:


> I hear (not from reliable sources) that there might be some updated 600W vBios for MSI Suprim X (non-liquid). Does anyone here know/heard anything about this? And most importantly how and where one would get them?


There is a 600 W BIOS on this thread a few pages back; I'm not sure how to search for it. I think it's an experimental BIOS that was accidentally shipped with some of the early Suprim Xs. I too was sceptical, but I flashed it today on my Suprim X and I can confirm it does work and has a 600 W limit. It doesn't like Kombustor, though; it makes the card read a really low boost in the 300-400 MHz range, but it only seems to be a misread, as under load it shows the correct boost I set.


----------



## GRABibus

deleted


----------



## GRABibus

Nizzen said:


> Show us your minimum fps, and we wil try to beat it
> This forums is free service 24/7
> View attachment 2582123


Wrong thread


----------



## N19htmare666

Ozz387 said:


> There is a 600w bios on this thread a few pages back im not sure how to search for it. I think its an experimental bios that was accidently shipped with some of the early suprim x's. I too was sceptical but i flashed it today on my suprim x and i can confirm it does work and is a 600w limit. It doesnt like kombuster though, it makes the card read a really low boost in the 300-400mhz range, but itonly seems to be a misread as on load it shows the correct boost i set


What are the steps to replicate the issue? Can you add screenshots?


----------



## Ozz387

Zero989 said:


> This BIOS isn't optimal for MSI cards IMO, also your PR score should be higher than 28.58k


As far as I know, this BIOS is a stock 600 W BIOS shipped with some of the early Suprim X cards. I'm assuming it's an experimental BIOS, but it does work. I'm definitely no expert, but how can you tell it is not optimized?


----------



## Zero989

Ozz387 said:


> As far as i know this bios is a stock 600w bios shipped with some of the early suprim x cards. Im assuming its an experimental bios, but it does work. Im definately no expert but how can you tell if is it not optimized?


Check your card's max board power during FurMark. See if it even hits 600 W. With your clocks, you should be getting 29k easily in Port Royal.


----------



## DokoBG

I think I will be sticking with the standard Suprim X Gaming BIOS. I also got it updated to the latest version through the MSI Center app a few days ago. Works flawlessly on my Gaming Trio.


----------



## Ozz387

N19htmare666 said:


> What are the steps to replicate the the issue? Can you add screenshots.


It's when I run the Kombustor stress test at 4K. I leave it running, and when I come back to it, it has closed the window and doesn't read properly in Afterburner after that. If I do anything to put load on the card it shows the correct boost, and if I restart my PC it reads properly again, as long as I don't run Kombustor.


----------



## N19htmare666

energie80 said:


> View attachment 2582124


And without the upscaling... Tip: Warzone only cares about lower RAM latency.


----------



## energie80

FidelityFX CAS doesn't add any FPS; it's just for visibility.


----------



## Panchovix

doubledoubt said:


> Found it!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> [Official] NVIDIA RTX 4090 Owner's Club
> 
> 
> What is your score in 4k optimized? I think somethings up with my Gigabyte Gaming OC? I currently have +120 on the core and +1200 on the memory but my score is waaay lower? DDR5 at 6600mhz and 12900KF at 5.2Ghz Not sure why its scoring low, also noticed that Dying light 2 and Cyberpunk was...
> 
> 
> 
> 
> www.overclock.net


Gonna try this later on my ASUS, I'm willing to test any 600W on my TUF to _maybe _be able to increase the VRAM a bit, no hopes though (_sigh_)


----------



## N19htmare666

energie80 said:


> Fx cas doesn’t add any fps, it’s just for visibility






See the 1440p basic section. Just note his PC is not well OC'd for Warzone.


----------



## energie80

Saw that video already, need a test with my settings 😅 both video and game settings


----------



## N19htmare666

energie80 said:


> Saw that video already, need a test with my settings 😅 both video and game settings


Look up fr33thy. He's already done all the testing and shared a special config. Of course it's free and not for resale


----------



## energie80

Edit


----------



## mirkendargen

energie80 said:


> Hello everyone, I’m looking for someone for a test. 1440p res monitor needed. I do professional optimisation service for modern warfare 2 and Warzone. If someone have some spare time for a test I will offer free service. Have a lot of feedback for reference. Pm if interested





energie80 said:


> I’m not selling anything lol just looking for someone who can do some testing


----------



## energie80

I didn't advertise my page or anything like that; I was just looking for someone with a 4090, since I don't own one.
Never mind, I will just delete it. Thanks anyway.


----------



## mattskiiau

Noticed there is a new MSI Trio BIOS via MSI Center. "MH-V510 202"
Anyone have info about it?


----------



## Azazil1190

Result not found







www.3dmark.com




A bit better score.
Thanks to GRABibus!


----------



## neteng101

DokoBG said:


> I think i will be sticking to the standard Suprim X Gaming bios. I got it also updated to the latest version through the MSI Centre app few days ago. Works flawlessly on my Gaming Trio.


Going to try that out - probably better for daily than the 600W liquid bios, I want to see fan speed differences, etc.


----------



## bmagnien

The new driver helped gain a few points... still can't crack 29k in PR:









I scored 28 787 in Port Royal


AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com













I scored 11 078 in Speed Way


AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





GigaOC, Bykski block, ambient room temp 71 °F (21 °C).


----------



## Ozz387

Nd4spdvn said:


> Will the 600W Suprim Liquid X bios work fine on my Suprim X I wonder? Thinking of flashing it as the PCB is the same but I am worried a bit that it may not act right with the fans on the air-cooled Suprim x I have.


I can confirm it works on my Suprim X.


----------



## mattskiiau

DokoBG said:


> I think i will be sticking to the standard Suprim X Gaming bios. I got it also updated to the latest version through the MSI Centre app few days ago. Works flawlessly on my Gaming Trio.


Sorry, this might be a stupid question, but are you saying you flashed the Suprim BIOS onto your Trio, and the MSI app is now recognizing your card as a Suprim and updating it to the new Suprim BIOS?
Effectively, MSI Center "thinks" your GPU is a Suprim now?


----------



## Ozz387

Benni231990 said:


> i used the last version from NVflash
> 
> SO NOW FOR ALL HERE THE 600WATT MSI BIOS : !!!!!! SuprimX4090G.rom!!!!!!!!!!!!!!!!! (But with this Bios your fancurve is *** as *** because the liquid has a AIO so if you want a silence PC only use it with waterblock or live with the Noise )
> 
> and here the result with +1500MHZ Ram
> 
> View attachment 2582099
> 
> 
> So i Think i have no Error Corretion


That's exactly what I got with +170 core / +1665 mem on a 9900K @ 5.2 GHz.


----------



## Mad Pistol

I'm curious as to what the core reaches on most cards at stock settings.

According to Afterburner, my core tops out at 2730 MHz. It will not go any higher than that without an OC. What do y'all get?


----------



## Benni231990

So I flashed the Strix and Gigabyte BIOSes, but both are slower and give me 100-150 points less.

I flashed the 600 W MSI BIOS back, and my score is within +/- 10-15 points.

So for my card, the 600 W MSI BIOS is the best.


----------



## KingEngineRevUp

Zero989 said:


> This BIOS isn't optimal for MSI cards IMO, also your PR score should be higher than 28.58k


I thought so too. I thought he had memory error correction. The score, clocks, and memory seemed a little off to me.


----------



## KingEngineRevUp

Mad Pistol said:


> I'm curious as to what the core reaches on most cards at stock settings.
> 
> According to Afterburner, my core hits 2730 at top out. It will not go any higher than that without OC. What do y'all get?











I scored 24 749 in Port Royal


AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





Average of 2822 MHz. I did unlock the power and temperature limits, though. Did you?


----------



## N19htmare666

Benni231990 said:


> so i flashed the strix and Gigabyte bios But both are slower and give me 100-150 points less
> 
> i flashed the 600watt msi bios back and my Score +- 10-15points
> 
> so for My card the 600watt msi Bios is the best


How did you get on with the Neptune 630w?


----------



## KedarWolf

Benni231990 said:


> so i flashed the strix and Gigabyte bios But both are slower and give me 100-150 points less
> 
> i flashed the 600watt msi bios back and my Score +- 10-15points
> 
> so for My card the 600watt msi Bios is the best


Can you share the MSI 600 W BIOS? It's not on TechPowerUp.


----------



## Benni231990

@KedarWolf  ⬇


Benni231990 said:


> SO NOW FOR ALL HERE THE 600WATT MSI BIOS : !!!!!! SuprimX4090G.rom!!!!!!!!!!!!!!!!! (But with this Bios your fancurve is *** as *** because the liquid has a AIO so if you want a silence PC only use it with waterblock or live with the Noise )




So here is my first attempt at UV with my card.

I will continue when my waterblock arrives, but for air cooling I'm finished.


----------



## Benni231990

I think the MSI BIOS is not that bad, but it depends on the card and the silicon lottery.


----------



## ZealotKi11er

Mad Pistol said:


> I'm curious as to what the core reaches on most cards at stock settings.
> 
> According to Afterburner, my core hits 2730 at top out. It will not go any higher than that without OC. What do y'all get?


It was the same for both of my GigaOCs.


----------



## cheddardonkey

N19htmare666 said:


> Please can you post the the original Waterforce BIOS here? Would be interesting to see if this improvement can be replicated on the MSI liquid cards.
> 
> Also, curious to how it went in the end? Any updates.


Here is the Aorus BIOS link, extracted from my card. From the feedback of others I've spoken to, this particular card has a limited audience, so it's tough to compare results with anyone else willing to dive in.

Aorus Waterforce 4090 factory bios v2

In testing, I have done quite a lot. Here are my takeaways.
I have loaded several 4090 models' BIOSes downloaded from the official links page, as well as the 600 W MSI Suprim Liquid one not listed on that page (basically all of the ones that could potentially provide a power advantage over the stock Waterforce BIOS due to a higher TDP or overall power limit, including non-AIO models). All of them can have their fan curves adjusted and, as best I can tell, don't affect pump operation; fans were at 100% during the tests to eliminate the cooling variable.

Initially, I was quite disappointed to find that this card, expected to have a 600 W limit, only got an increase from 450 W to 500 W from Aorus. I felt cheated out of a presumed 100 W of performance, so to speak, as this is the top-end model offered by Gigabyte Aorus at a premium price. Hence my finding this page, in an attempt to maximize the potential of the AIO cooling on this card.

Tested:
Colorful, Gigabyte, Palit, MSI, Asus. All of them loaded fine except the Strix, which loaded OK but messed up the screen mapping (probably due to its 5th video port, which the other cards don't have).

I didn't try the Founders Edition BIOS, but after seeing the results from all the others I'm curious (because that seems to be the best-performing card overall in the "pro" reviews), though not compelled to. The differences are likely minor, and I also want to limit the number of BIOS writes in my attempts to make an awesome card better; the risk of causing an issue I can't remedy would kill me. With that said, on to the testing synopsis. (If anyone else is diving in, read through the entirety of the next paragraph.)

Each BIOS was loaded, rebooted, and re-tested for overclockability and performance gains or losses in a couple of benches and games. In every instance, I was able to measure and validate increased power consumption in FurMark and Superposition. I was also able to hold slightly higher clocks... and I mean slightly, like 30-50 MHz, which amounts to nothing but pulling much higher power at slightly reduced game and bench performance. In every instance some extra heat was generated, but it was easily dissipated by the AIO; no stress testing ever produced any temperature concern on the GPU, hotspot, or VRAM. Similarly, in every instance voltage is the limiting performance factor, EXCEPT on the Waterforce BIOS, which shows BOTH a voltage and a power limiting factor. However, in real game and bench testing, as I shared before, the Waterforce BIOS holds higher bench scores (+1000-2000) and higher 99th-percentile FPS at LOWER clocks and power consumption. So I've reverted to factory. I took a short stab at undervolting, but I didn't see any improvement, only less stability. Perhaps this design is just tuned as well as it can be already? I no longer feel "cheated" out of 100 W of performance and tend to see it the other way around now: I'm getting better miles per watt with a design that still maxes out the chip with less power.

Overall, my understanding is that real-world performance differences between 4090 models are relatively small and negligible outside of benchmark pissing contests; they all overclock past 3 GHz, which does bring some benefit. So I went with the one I thought would have an advantage over the others in being quieter, and so far I'm quite pleased with it. Everyone posts their HIGH clocks, not their EFFECTIVE clocks. I'd like to see more of the latter, and would love some expert feedback on my findings, maybe other things to test or comparative results to look at.


----------



## yzonker

Well at least the new driver got me past 29k. I'll take it for now at least.









I scored 29 011 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## DokoBG

mattskiiau said:


> Sorry this might be a stupid question but are you saying you flashed the Suprim BIOS onto your Trio and then MSI app is now recognizing your card as a Suprim and is updating to the new Suprim BIOS?
> Effectively Msi Center "thinks" your GPU is a Suprim now?


Yup, that is exactly what is happening. The MSI Centre is recognizing it as a Suprim X and it updated it to the latest Suprim X bios.


----------



## Benni231990

So with the update, how many watts do you get?

600 or 520?


----------



## Sheyster

J7SC said:


> ...fyi, just removed my 4-into-1 dongle connector for the first time since I got the card almost three weeks ago. I had done a slew of early benchies (up to 575 W per HWInfo) at 25 C+ ambient....
> 
> Looks pristine to me, though I had made sure a.) no sharp twisting / bending and b.) fully insert and 'seat' the connector during the original install.


Just installed my Cablemod cable too, the adapter and plug both look clean. I removed the plastic sense pin cover on the Cablemod cable like someone else mentioned was a good idea, to be able to better see the cable/plug fully seated. It just slides off easily, not a big deal.

In the future, I might go with a 2 x 8-pin cable. 🤷‍♂️


----------



## Zero989

Benni231990 said:


> So With the update how many watts you get?
> 
> 600 or 520?


The main Suprim Liquid X BIOS is 530 W.

Interdasting update










whack 530W


----------



## DokoBG

Benni231990 said:


> So With the update how many watts you get?
> 
> 600 or 520?


520, if you ask me. I am using the Suprim X air BIOS, not the Liquid one.


----------



## yzonker

KedarWolf said:


> Can you share the MSI 600w BIOS. It's not on TechPowerUp.


Isn't this it? I haven't tried it.









[Official] NVIDIA RTX 4090 Owner's Club


Here in the forum i cant remember what page a owner from his liqued card upload it here Can you post it here in .txt format ?




www.overclock.net


----------



## Panchovix

The new driver definitely helped in Port Royal and Speed Way; I haven't tested the rest.

Got 28k in PR with pretty low clocks; before, the same clocks gave 27.8k.









I scored 28 010 in Port Royal


AMD Ryzen 7 5800X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## bmagnien

yzonker said:


> Well at least the new driver got me past 29k. I'll take it for now at least.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 011 in Port Royal
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


31c avg? How’re you managing that? Your 13900k is surely doing work too, about 250pts higher than a run of mine with the exact same avg core and mem. Grats on 29k


----------



## SilenMar

cheddardonkey said:


> Here is the Aorus bios link extracted from my card. Feedback from others I've spoken to share this particular card has a limited audience so its tough to compare results with anyone else willing to dive in.
> 
> Aorus Waterforce 4090 factory bios v2
> 
> In testing, I have done quite alot. Here are my takeaways.
> I have loaded several 4090 model's bios downloaded from the official links page as well as the 600w MSI Suprim Liquid not listed on that page (basically all of the ones shown that could potentially provide an increased power advantage over the stock waterforce bios due to a higher TDP or overall power limit including non AIO models, all of which can have fan curves adjusted and dont affect pump operation to the best of what I can tell. fans to 100% during op tests to eliminate the cooling variable.
> 
> Initially, I was quite dissapointed to find this card was expected to have a 600w limit, only to find that Aorus limited the increase from 450w to 500w. I felt cheated out of a presumed 100w of performance so to speak as this is the top end model offered by Gigabyte Aorus at a premium price. Hence my finding of this page in attempts to maximize the potential of the aio cooling pm this card.
> 
> Tested
> Colorful, Gigabyte, Palit, MSI, Asus. All of which loaded fine outside of the Strix, that loaded ok but messed up the screen mapping (probably due to its 5th video port other cards don't have)
> 
> I didnt try the founders edition bios but after seeing results from all the others I'm curious (because that seems to be the best performing card overall from the "pro" reviews) but not compelled to. The differences are likely minor, I also want to limit the number of bios writes in attempts to make an awesome card better. The risk causing an issue I can't remedy. That would kill me.. With that said. Onto the testing synopsis. (If anyone else is diving in, read through the entirety of the next paragraph)
> 
> Each bios was loaded, rebooted, and re-tested for overclockability, performance gains or loss in couple of benches and games. In every instance, I was able to extend and validate increased power consumption, in furmark and superposition. I was also able to hold slightly higher clocks.. I and I mean slightly.. like 30-50mhz which amounts to nothing but pulling in much higher power at slighly reduced game and bench performance. In all instances', some increased heat was generated but easily dissipated by the aio. No stress testing ever amounted to any temperature concern in gpu, hotspot or vram. Similarly in every instance, voltage is the limiting performance factor OUTSIDE of the waterforce bios in which that bios shows BOTH a voltage and power limiting factor. However, in real game testing and bench testing as I shared before, the waterforce bios is holding higher bench scores +1000-2000 and higher 99% FPS at LOWER clocks and power consumption. So I've reverted to factory. I tried a short stab at undervolting but I didnt see any improvement, only less stability. Perhaps this design is just tuned to the best it can be already? I no longer feel "cheated" out of 100w performance and tend to see this the other way around now.. like I'm getting better miles per watt with this design which is still maxing out the chip output with less power.
> 
> Overall my understanding is that real world performance differences from 4090 models are relatively small and negligble outside of benchmark pissing contests, they all overclock past 3ghz which does bring some benefit. So I went with one I thought would have an advantage over the others in being more quiet. That so far I'm quite pleased with. Everyone posts their HIGH clocks.. not their EFFECTIVE clocks. I'd like to see more of that and would love some expert feedback on my findings, maybe other things to test or comparative results to look at.


You could try requesting that Gigabyte release a custom 600 W BIOS, instead of flashing BIOSes from other brands with different configurations.

Then compare the graphics scores between different cards at the same clock speed, such as a 3000 MHz Gigabyte vs. a 3000 MHz ASUS, etc.


----------



## bmagnien

I think I overdid it 😆


----------



## dante`afk

RES 8 maxed settings with RTX 5120x1440p, 60-70fps


----------



## bmagnien

dante`afk said:


> RES 8 maxed settings with RTX 5120x1440p, 60-70fps
> 
> View attachment 2582168


Yah, I have no idea what that is. I hit 699 W in rail power while playing Kerbal Space Program with a bunch of visual terrain mods, lol. GPU Power max is always below 570 W, though. Makes you wonder if the 630 W BIOS from that one AIB I can't remember has something to do with that. Seems a bit of a coincidence that the '600 W' BIOSes all cap at 570, and there just so happens to be a '630 W' BIOS from one AIB, which would bring it exactly to the 600 W spec. By the way, why haven't we seen that BIOS available for download yet? It's a retail model; you'd think people would've uploaded it by now.


----------



## mirkendargen

bmagnien said:


> Yah I have no idea what that is. I hit 699w in rail power while playing Kerbal Space Program with a bunch of visual terrain mods lol. GPU Power max always below 570 though. Makes you wonder if the 630w bios of that one AIB I can’t remember now has something to do with that. Seems a bit of a coincidence that ‘600w’ bios all cap at 570, and there just so happens to be a ‘630’ watt bios by one AIB which would bring it exactly to the 600w spec. By the way - why haven’t we seen that bios available for download yet? It’s a retail model you’d think people would’ve uploaded it by now.


I saw someone showing one off on reddit. Someone asked him to dump the bios but he didn't know how 🤦‍♂️


----------



## J7SC

dante`afk said:


> RES 8 maxed settings with RTX 5120x1440p, 60-70fps
> 
> View attachment 2582168





bmagnien said:


> Yah I have no idea what that is. I hit 699w in rail power while playing Kerbal Space Program with a bunch of visual terrain mods lol. GPU Power max always below 570 though. Makes you wonder if the 630w bios of that one AIB I can’t remember now has something to do with that. Seems a bit of a coincidence that ‘600w’ bios all cap at 570, and there just so happens to be a ‘630’ watt bios by one AIB which would bring it exactly to the 600w spec. By the way - why haven’t we seen that bios available for download yet? It’s a retail model you’d think people would’ve uploaded it by now.


...from what I recall reading, the GPU Rail Power _might_ refer to short spikes, but I can't put a source on that right now.


----------



## ArcticZero

J7SC said:


> ...from what I recall reading, the GPU Rail Power _might_ refer to short spikes, but I can't put a source on that right now.


This makes more sense than it being a specific BIOS related thing, since it happens on my shunt modded 3090 as well (appropriate "GPU Power" multiplier applied on Hwinfo). Internal power limits should be around 520w or so, and the main GPU Power reading never exceeds this. But I'm seeing figures as high as this on the rail powers as well simply playing games like Overwatch 2.
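Since a shunt-modded card under-reports power, the HWiNFO correction mentioned above is just a fixed multiplier. A minimal sketch of the arithmetic, with hypothetical shunt values (the function names and the 2 mOhm figures are illustrative, not measurements from any specific card):

```python
# Sketch: correcting sensor readings after a shunt mod (hypothetical values).
# Paralleling an extra shunt across the stock one lowers the sensed resistance,
# so the card under-reports current/power by a fixed ratio.

def shunt_multiplier(stock_mohm: float, added_mohm: float) -> float:
    """Ratio between real and reported power after paralleling a shunt."""
    parallel = (stock_mohm * added_mohm) / (stock_mohm + added_mohm)
    return stock_mohm / parallel  # reported values must be scaled up by this

def real_power(reported_w: float, stock_mohm: float, added_mohm: float) -> float:
    """Actual draw implied by a reported wattage on a shunt-modded card."""
    return reported_w * shunt_multiplier(stock_mohm, added_mohm)

# Stacking an equal-value shunt halves the resistance -> 2x multiplier:
print(shunt_multiplier(2.0, 2.0))   # 2.0
print(real_power(260.0, 2.0, 2.0))  # 520.0 (watts actually drawn)
```

This is why the "GPU Power" multiplier in HWiNFO matters: without it, every rail figure on a modded card is off by the same ratio.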


----------



## jootn2kx

Any downsides to flashing another brand's BIOS on your card? Like, for example, black screens or fan-curve issues?


----------



## th3illusiveman

Damn, really had to squish my 4090 FE into my Cooler Master CM690 Advance II case (12-year-old case lol). My old 3080 FTW3 fit with a mm or two to spare... I had to bring the hammer out (literally) to bend my fixed HDD tray and make the few mm of clearance required. It is SNUG lol; I don't even need a support bracket. Beautiful card and fast as FK. Quite happy... not so happy about the dent in my wallet... and case...

I was able to get one during a drop using the "Falcodrin Community Discord" for stock alerts.


----------



## Krzych04650

Krzych04650 said:


> Flashed Inno3D X3 OC 450W to Gigabyte Gaming OC 600W. It was semi-successful. The card still works properly, but the actual power limit seems to be 550W, not 600, it starts throttling at 550.


Reflashed with the Liquid X 600W BIOS, and still stuck at 560W max. I guess this is just how these cards work. This one allows much higher fan speeds than the Gigabyte BIOS, which capped the speed quite a lot, so it is the better one to use until I get a waterblock.


----------



## KingEngineRevUp

dante`afk said:


> RES 8 maxed settings with RTX 5120x1440p, 60-70fps
> 
> View attachment 2582168


Uh.... Yeah... As a daily driver, I'm going to just undervolt and OC the memory this generation. 

With a memory OC giving +5-10% and the core just giving 1-3% for +40-50% more power consumption... It's a no-brainer.
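
The perf-per-watt trade-off above can be sketched with the post's own rough numbers (the percentages are illustrative figures from the post, not benchmark data):

```python
# Rough perf-per-watt arithmetic behind the "undervolt + memory OC" choice
# (illustrative numbers, not measurements).

def perf_per_watt(rel_perf: float, rel_power: float) -> float:
    """Relative performance divided by relative power draw."""
    return rel_perf / rel_power

baseline = perf_per_watt(1.00, 1.00)
mem_oc   = perf_per_watt(1.075, 1.02)  # ~+7.5% perf for a small power bump
core_oc  = perf_per_watt(1.02, 1.45)   # ~+2% perf for ~+45% power

print(round(mem_oc / baseline, 3))   # memory OC slightly improves efficiency
print(round(core_oc / baseline, 3))  # maxed core OC tanks efficiency
```

On these assumed numbers the memory OC comes out ahead of baseline efficiency (~1.05x) while the maxed core OC drops to ~0.70x, which is the whole argument in two lines.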


----------



## KingEngineRevUp

th3illusiveman said:


> Damn, really had to squish my 4090 FE into my Cooler Master CM690 Advance II case (12yr old case lol). My old 3080FTW3 fit with a mm or two to spare... i had to bring the hammer out (literally) to bend my fixed HDD tray and make the few mm of clearance required - it is SNUG lol, don't even need a support bracket. Beautiful card and fast as FK - quite happy - not so happy about the dent in my wallet... and case...
> 
> I was able to get one during a drop using the "Falcodrin Community Discord" for stock alerts.


Share some pics


----------



## N19htmare666

cheddardonkey said:


> Here is the Aorus bios link extracted from my card.  Feedback from others I've spoken to share this particular card has a limited audience so its tough to compare results with anyone else willing to dive in.
> 
> Aorus Waterforce 4090 factory bios v2
> 
> In testing, I have done quite alot. Here are my takeaways.
> I have loaded several 4090 model's bios downloaded from the official links page as well as the 600w MSI Suprim Liquid not listed on that page (basically all of the ones shown that could potentially provide an increased power advantage over the stock waterforce bios due to a higher TDP or overall power limit including non AIO models, all of which can have fan curves adjusted and dont affect pump operation to the best of what I can tell. fans to 100% during op tests to eliminate the cooling variable.
> 
> Initially, I was quite dissapointed to find this card was expected to have a 600w limit, only to find that Aorus limited the increase from 450w to 500w. I felt cheated out of a presumed 100w of performance so to speak as this is the top end model offered by Gigabyte Aorus at a premium price. Hence my finding of this page in attempts to maximize the potential of the aio cooling pm this card.
> 
> Tested
> Colorful, Gigabyte, Palit, MSI, Asus. All of which loaded fine outside of the Strix, that loaded ok but messed up the screen mapping (probably due to its 5th video port other cards don't have)
> 
> I didnt try the founders edition bios but after seeing results from all the others I'm curious (because that seems to be the best performing card overall from the "pro" reviews) but not compelled to. The differences are likely minor, I also want to limit the number of bios writes in attempts to make an awesome card better. The risk causing an issue I can't remedy. That would kill me.. With that said. Onto the testing synopsis. (If anyone else is diving in, read through the entirety of the next paragraph)
> 
> Each bios was loaded, rebooted, and re-tested for overclockability, performance gains or loss in couple of benches and games. In every instance, I was able to extend and validate increased power consumption, in furmark and superposition. I was also able to hold slightly higher clocks.. I and I mean slightly.. like 30-50mhz which amounts to nothing but pulling in much higher power at slighly reduced game and bench performance. In all instances', some increased heat was generated but easily dissipated by the aio. No stress testing ever amounted to any temperature concern in gpu, hotspot or vram. Similarly in every instance, voltage is the limiting performance factor OUTSIDE of the waterforce bios in which that bios shows BOTH a voltage and power limiting factor. However, in real game testing and bench testing as I shared before, the waterforce bios is holding higher bench scores +1000-2000 and higher 99% FPS at LOWER clocks and power consumption. So I've reverted to factory. I tried a short stab at undervolting but I didnt see any improvement, only less stability. Perhaps this design is just tuned to the best it can be already? I no longer feel "cheated" out of 100w performance and tend to see this the other way around now.. like I'm getting better miles per watt with this design which is still maxing out the chip output with less power.
> 
> Overall my understanding is that real world performance differences from 4090 models are relatively small and negligble outside of benchmark pissing contests, they all overclock past 3ghz which does bring some benefit. So I went with one I thought would have an advantage over the others in being more quiet. That so far I'm quite pleased with. Everyone posts their HIGH clocks.. not their EFFECTIVE clocks. I'd like to see more of that and would love some expert feedback on my findings, maybe other things to test or comparative results to look at.


Thank you for sharing the Waterforce BIOS.

I totally agree with you regarding effective clocks. 

FE isn't cross-flashable, as far as I understand.

It's interesting to hear that the Waterforce BIOS is benching higher than the other brands even with the power restriction. I wonder what the cause is. Could it be dropping to a higher base clock when voltage/power limited than the other brands do?

If yes, then we'd expect higher bench results using the Waterforce BIOS on other cards. If not, then there's some other dark magic going on. I wonder what other folks think of this.


----------



## N19htmare666

mirkendargen said:


> I saw someone showing one off on reddit. Someone asked him to dump the bios but he didn't know how 🤦‍♂️


Good idea going to Reddit. I might also have a look and see if there are any that can be shared here.


----------



## newls1

Pretty happy... my CableMod 16x4 cable just shipped. Can't wait to get rid of this adapter!


----------



## alesk76

Hi all... I'm new here, I've read the whole thread, and I have a few questions for you.
I have a Palit 4090 GameRock OC, and it came with a 4x8-pin to 12VHPWR adapter. I also have a 1200W be quiet! power supply.
I updated the graphics BIOS from Palit's site to the latest one, 95.02.18.80.55. If I'm not wrong, this is the BIOS that goes to 500W max.
So can I use the ASUS 4090 Strix OC BIOS, which goes to 600W, on my Palit 4090 GameRock? I play racing sims on a Pimax 8K X and I need maximum FPS.
Another question: I saw here on the forum that in MSI Afterburner some of you managed to set the voltage slider to 100%. On my Afterburner it is locked; please tell me how to unlock it, and how to set it somewhere a little below the top so I get the most out of my card while it stays stable. It doesn't need to be maximally overclocked.
Also, could someone give a short step-by-step guide on how to flash, and where to get the flashing program? Is it possible to back up the existing BIOS and restore it if the new one gives me trouble?

Thanks for the help; I know I've asked a lot of questions.

Best regards to all, and good health.


----------



## yzonker

bmagnien said:


> 31c avg? How’re you managing that? Your 13900k is surely doing work too, about 250pts higher than a run of mine with the exact same avg core and mem. Grats on 29k


Chiller. I had it set to 10C.


----------



## doom3crazy

Zero989 said:


> This BIOS isn't optimal for MSI cards IMO, also your PR score should be higher than 28.58k


what makes you say that?


----------



## Nizzen

bmagnien said:


> 31c avg? How’re you managing that? Your 13900k is surely doing work too, about 250pts higher than a run of mine with the exact same avg core and mem. Grats on 29k


Many runs over 28,500 points are VRAM-bugged. Points explode when the VRAM is unstable.
I stopped looking at Port Royal scores after I got the #1 spot with a bugged run. My score is over 31k with a 4090 Strix on air lol. A normal score is about 28,300.


----------



## N19htmare666

Nizzen said:


> Many runs over 28500 points are vram bugged. Points are exploding when vram is unstable.
> I stopped looking at port royal scores, after I got #1 spot with a bugged run. My score is over 31k with 4090 strix on air lol. Normal score is about 28300
> View attachment 2582192


Is there a bench that's as reliable as PR was? I.e., next to no run-to-run variance. And free, of course, so there are plenty of results to compare against.


----------



## yzonker

Nizzen said:


> Many runs over 28500 points are vram bugged. Points are exploding when vram is unstable.
> I stopped looking at port royal scores, after I got #1 spot with a bugged run. My score is over 31k with 4090 strix on air lol. Normal score is about 28300
> View attachment 2582192


My run wasn't artifacted. I didn't see any artifacts in any of the runs I did last night.


----------



## Nizzen

yzonker said:


> My run wasn't artifacted. I didn't see any artifacts in any of the runs I did last night.


That doesn't mean it's not bugged 
Check scaling with the memory OC. Try +500, +750, +1000, +1250, +1500, +1700, etc.... See if the scaling is linear, or whether it skyrockets through the roof at around +1500MHz. 
When someone gets 500+ more points with the same core and memory clocks, something is wrong....

My run wasn't artifacted, because I turned off the monitor LOL 😂😝😜🤪🥳
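
The scaling check above can be automated: record score against memory offset and flag any step whose gain is far above the typical slope. A minimal sketch with made-up scores (the `tolerance` threshold and the sample data are assumptions, not real runs):

```python
# Sketch of the scaling sanity check: score should rise roughly linearly
# with the memory offset; a sudden jump suggests a bugged (error-inflated) run.

def flag_outliers(runs, tolerance=0.15):
    """runs: sorted list of (mem_offset_mhz, score) pairs. Returns the midpoint
    offsets of steps whose points-per-MHz slope exceeds the median slope by
    more than `tolerance` (fractional)."""
    slopes = []
    for (o1, s1), (o2, s2) in zip(runs, runs[1:]):
        slopes.append(((o1 + o2) / 2, (s2 - s1) / (o2 - o1)))
    median = sorted(m for _, m in slopes)[len(slopes) // 2]
    return [off for off, m in slopes if m > median * (1 + tolerance)]

runs = [(500, 27200), (750, 27350), (1000, 27500), (1250, 27650), (1500, 28900)]
print(flag_outliers(runs))  # flags the 1250->1500 step: far above normal slope
```

Here every 250 MHz step gains ~150 points until the last one suddenly gains 1,250, which is exactly the "skyrocket" pattern being described.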


----------



## Nizzen

N19htmare666 said:


> Is there a bench that's as reliable as PR was? I.e. next to no run to run variance. And free of course so there's plenty of results to compare against


Turning on ECC in the NVIDIA Control Panel makes the scores OK. The default on the 40 series is ECC off. That lets the memory run in an unstable state at much higher clocks without stopping or crashing.


----------



## a.senna

a_Criminai said:


> If anyone is using the $1599 Zotac 4090 trinity I can confirm the zotac 4090 extreme vbios works fine on it.
> 
> Also side note, apparently oc scanner is capped at +165 MHz. Nvidia should change that.



I have the same model as you. Do you have the 4x8-pin cable? Mine came with the 3x8-pin, and I think I'm not able to unlock the 450W power limit. Can you post some screenshots showing how many watts you're pulling now?

Thanks


----------



## N19htmare666

Nizzen said:


> Turning on ECC in the nvidia controlpanel makes the scores ok. Normal with 40 series is ECC=off. This makes the memory to run on a unstable state, without stopping or crashing much higher..


That's only good for comparing my runs against my previous runs. I'm asking more so I can compare against what others are getting, but reliably, without bugged runs.


----------



## Nizzen

N19htmare666 said:


> That's only good for comparing runs I did against my previous runs. I'm asking more so I can compare against what others are getting, but reliability without bugged runs.


I don't know if Time Spy Extreme is bugged or not. Try that and compare the graphics score to others.


----------



## N19htmare666

Nizzen said:


> I don't know if Timespy extreme is bugged or not. Try this and compare the graphics score to others


TS is not as reliable as PR was for getting matching scores across runs.


----------



## Nizzen

N19htmare666 said:


> TS is not very reliable to get matching scores across runs as PR was.


Maybe because of the CPU test in TS.


----------



## yzonker

Nizzen said:


> That doesn't mean it's not bugged
> Check scaling with memory OC. Try 500, 750,1000,1250,1500,1700 etc.... See if the scaling is perfect, or it skyrocket to the roof in around 1500mhz+
> When someone got 500+ more points with the same core and memoryclocks, something is wrong....
> 
> My run wasn't artifacted, because I turned off the monitor LOL 😂😝😜🤪🥳


Although I agree there will always be some ambiguity (which is very irritating), I have not seen any evidence that the scores are artificially inflated when no visible artifacting occurs. 

The test you're suggesting would most likely reveal nothing, as artifacted/bugged runs only occur in a very small range of mem speeds. Just below that point it runs normally; go much above it and it just crashes. 

But I am 100% in favor of requiring ECC as that would hopefully eliminate any uncertainty.


----------



## Zero989

doom3crazy said:


> what makes you say that?


Limited wattage compared to other BIOS.


----------



## yzonker

Nizzen said:


> I don't know if Timespy extreme is bugged or not. Try this and compare the graphics score to others


Pretty much all benchmarks are susceptible to artifacting. It's just that longer benchmarks tend to crash before completing I think. I've seen it in PR/Speedway/TS/TSE/Superposition.


----------



## dk_mic

A little observation about coil whine:

My Gaming X Trio has some notable coil whine, way worse than my old, waterblocked 2080 Ti Gaming X Trio. I'm quite sensitive to it, but it's no big issue for me.
I installed a new PSU. While doing that, I pulled the card and tightened all the screws on the backplate; some of them could be tightened further quite easily.

So either the new PSU or the tightening has considerably reduced the coil whine. Thought I'd share.


----------



## Zero989

dk_mic said:


> A little observation about coil whine:
> 
> My Gaming X Trio has some notable coil whine, way worse then my old, waterblocked 2080 Ti Gaming X Trio. I am quite sensitive about it, but it's no big issue for me.
> I have installed a new PSU. While doing that, I have pulled the card and tightened all screws on the backplate, some of them could be tightened further quite easily.
> 
> So either the new PSU or the tightening has considerably reduced the coil while. Thought Id share..


Yeah, it's just absorbing the vibrations better. My Liquid X sometimes squeals like a pig because there are no aluminum fins to absorb the vibrations.

Not ideal with an open bench lol


----------



## newls1

Is about a 300-point increase normal for what you get by enabling ReBAR for Port Royal?


----------



## yzonker

newls1 said:


> is about a 300 point increase about normal for what you get by enabling rebar for port royal?


Yes. 200-300.


----------



## Spiriva

newls1 said:


> pretty happy... my cablemod 16x4 cable just shipped . Cant wait to get rid of this adapter!


I ordered from CableMod on the 15th of October; they sent it on November 3rd. I got a tracking number from DHL saying they are waiting for the package in Holland.
The thing is that CableMod sends their cables on a boat from China to Europe, and that takes 3-4 weeks to reach Europe.

If you got a tracking number but nothing happens for weeks, then you know: your cable is on some boat towards America.


----------



## Nizzen

GRABibus said:


> Wrong thread


Yes, posting an AMD GPU result in this thread is an epic fail. Overclocked DDR5 is all about getting the 4090 to perform at 100%.


----------



## GRABibus

Nizzen said:


> Yes posting Amd gpu result in this thread is epic fail. Ddr5 overclocked is all about to get 4090 perform 100%


Yes, but all we see are the screenshots of your RAM settings. What's the point if we don't see GPU-related performance?


----------



## cheddardonkey

N19htmare666 said:


> Thank you for sharing the Waterforce BIOS.
> 
> I totally agree with you regarding effective clocks.
> 
> FE isn't cross flashable as far I understand.
> 
> It's interesting to hear that the Waterforce BIOS is benching higher than the other brands even with the power restriction. I wonder what the cause is. Could it be dropping to a higher base clock when voltage/power limited than other brands.
> 
> If yes then we'd expect higher bench results using the Waterforce BIOS on other cards. If not, then there some other dark magic going on. Wonder what other folks think of this.


A BIOS edit to allow 600W would be telling, I think, but on THIS BIOS. I don't think there's an editor out there that can help, though.


----------



## KingEngineRevUp

Nizzen said:


> That doesn't mean it's not bugged
> Check scaling with memory OC. Try 500, 750,1000,1250,1500,1700 etc.... See if the scaling is perfect, or it skyrocket to the roof in around 1500mhz+
> When someone got 500+ more points with the same core and memoryclocks, something is wrong....
> 
> My run wasn't artifacted, because I turned off the monitor LOL 😂😝😜🤪🥳


Simple rule perhaps. Users here should have to post 3 runs back to back with a very small standard deviation.
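
A minimal sketch of that rule: take the three back-to-back scores and reject the set if the relative spread is too large (the 0.5% cutoff here is an arbitrary assumption, not an established standard):

```python
# Sketch of the "three back-to-back runs" rule: accept a result only if the
# run-to-run spread is tiny. The threshold is an assumed value.

from statistics import mean, stdev

def consistent(scores, max_cv=0.005):
    """True if the coefficient of variation (stdev/mean) is under max_cv."""
    return stdev(scores) / mean(scores) <= max_cv

print(consistent([28810, 28840, 28795]))  # tight spread: looks legit
print(consistent([28800, 28850, 30400]))  # one run way off: likely bugged
```

The first set passes easily; in the second, the 30,400 outlier blows the spread past any reasonable cutoff, which is exactly the "glitchy run" signature.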


----------



## newls1

All these people posting 29000+ runs in PR have to be doing something fishy. I spent last night just starting my first OC attempts on my new 4090 that came in, increasing memory in +150 increments at a time. Many, many runs later, I gain about 200ish points per +150 I go up in memory. It's amazing how a system with the same specs as mine, same everything, BUT running +1700 on mem (or something crazy like that) gets a score of 30,000+. COME ON..... No way. As soon as I break a legit 28000 in PR, my OC attempts will be satisfied, as the card is almost TOO FAST! (never thought I'd ever say that)


----------



## GQNerd

Consistency is Key 😉


----------



## PhuCCo

I'm testing a Gigabyte Gaming OC with the Port Royal stress tests (20 loops) and I'm able to get +1900 on the memory so far with zero artifacts. And the highest Port Royal score I've been able to get is 27,771. My score seems pretty low compared to some that I've seen. I'm testing with a 10900KF system, so it's PCIe 3.0, and I don't have ReBAR on.

I scored 27 771 in Port Royal (Intel Core i9-10900KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com

I did get like 27,850 with the memory at +2000, but I got some pretty bad graphical glitches on the ship at the very end of the benchmark, so I deleted it. I haven't tested between +1900 and +2000 yet.

I'm noticing that my card is doing a bit of clock stretching, with the reported core frequency being 3030-3045 and my actual effective clock is near 3000. I have my card locked to 1075mV as I was testing the stable frequencies per voltage point. I know I can go up to 1.1V. Am I just starving my card of the extra 25mV and it's bringing the clocks down? The core on this one doesn't seem very good.
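
The clock stretching described above is easy to quantify from the reported vs. effective clock readouts; a one-line sketch (the example clocks mirror the numbers in the post):

```python
# Quantifying clock stretching: compare the requested/reported core clock
# against the effective clock a monitoring tool shows.

def stretch_pct(reported_mhz: float, effective_mhz: float) -> float:
    """Percent of the reported clock that is being 'stretched' away."""
    return (reported_mhz - effective_mhz) / reported_mhz * 100

print(round(stretch_pct(3045, 3000), 2))  # roughly 1.5% lost to stretching
```

A stretch of a percent or two, as here, usually means the card is voltage-starved at that point on the curve, which fits the question about the missing 25mV.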


----------



## J7SC

yzonker said:


> Although I agree there will always be some ambiguity (which is very irritating), I have not seen any evidence that the scores are artificially inflated when no visible artifacting occurs.
> 
> The test your suggesting most likely would reveal nothing as artifacted/bugged runs only occur in a very small range of mem speeds. Just below that point it runs normally, very much above and it just crashes.
> 
> But I am 100% in favor of requiring ECC as that would hopefully eliminate any uncertainty.


I've been running Port Royal ever since it came out, and I really like it, not least because it is somewhat less sensitive to the CPU (though it likes fast, tight system RAM), which suits my work+play combos that are often HEDT. Now, though, Port Royal seems a bit like 'damaged goods' because of the 'ambiguity' you and others referenced. 

At the end of the day, it generally reminds me of 'Cain and Abel' though... I am also against 3DM requiring a change to have ECC switched from 'default'(!) to 'on', or 3DM manually removing results that are outside some sort of bell curve... the whole point is to get as efficient as possible at the highest clocks and push the envelope. Eliminating by hand results that 3DM software already called 'valid' with 'SystemInfo on' not only affects some questionable subs, but also potentially runs the risk of killing off perfectly good ones by folks who worked hard and fair... and then there is the question of NVIDIA and AMD driver updates and how they shift what was already an arbitrary manual intervention.

*Instead, 3DM* should come up with a new version that detects artifacts - I already posted some ancient (ATI) GPU applet that still tests for and detects artifacts, and clearly 3DM could add something similar, perhaps even with an updated SystemInfo. Speaking of SystemInfo, it stays active even if you close 3DM altogether - not really acceptable in good-coding land... I usually close it by hand (Ctrl+Alt+Del / Task Manager).


----------



## Sheyster

Spiriva said:


> I ordered from cablemods on 15th of October, they sent it on November 3rd. I got a tracking number from DHL saying they and waiting for the package in Holland.
> The thing is that Cablemod send thier cables on a boat from China to Europe, that takes 3-4 weeks to reach Europe.
> 
> If you got a tracking number but nothing happens for weeks, then you know: your cable is on some boat towards America.


U.S. orders are shipped via FedEx Priority International. I got mine in just a few days after it was shipped. I didn't pay their $30 rush fee so had to wait for them to make the cable. It took 2 weeks from the order date to have the cable delivered to me without the rush fee.


----------



## KingEngineRevUp

Sheyster said:


> U.S. orders are shipped via FedEx Priority International. I got mine in just a few days after it was shipped. I didn't pay their $30 rush fee so had to wait for them to make the cable. It took 2 weeks from the order date to have the cable delivered to me without the rush fee.


I got lucky. I had one ordered, but when I looked on offerup, there was a guy that bought three of them but didn't know which color he wanted to stick with. He had the exact one I wanted! I ended up cancelling my cablemod order.


----------



## newls1

J7SC said:


> I've been running Port Royal ever since it came out, and I really like it - not least as it is somewhat less sensitive to CPU (though it likes fast+tight system RAM) which suits me with my work+play combos which are often HEDT. Now though, Port Royal seems a bit like 'damaged goods' because of the 'ambiguity' you and others referenced.
> 
> At the end of the day, it generally reminds me of 'Cane and Abel' though... I am also against 3DM requiring a change to have ECC switched from 'default'(!) to 'on', or 3DM manually removing results that are outside some sort of bell curve...the whole point is to get as efficient as possible at the highest clocks and push the envelope. Eliminating results that 3DM software already called 'valid' and with 'Systeminfo on' by hand not only affects some questionable subs, but also potentially runs the risk of killing off perfectly good ones by folks who worked hard and fair...then there is the question of NVidia and AMD driver updates and how that shifts what was already an arbitrary manual intervention.
> 
> *Instead, 3DM* should com up with a new version that detects artifacts - I already posted some ancient GPU software applet (ATI) that still tests for and detects artefacts, and clearly, 3DM could add something similar perhaps even with an updated Systeminfo. Speaking of Systeminfo, that stays active even if you close 3DM altogether - not really acceptable in good coding land...I usually close it by hand (ctr-alt-dlt / task manager).


that "system info" running in the background EVEN AFTER you close the app drives me crazy! Hope they fix that soon.


----------



## Zero989

J7SC said:


> I've been running Port Royal ever since it came out, and I really like it - not least as it is somewhat less sensitive to CPU (though it likes fast+tight system RAM) which suits me with my work+play combos which are often HEDT. Now though, Port Royal seems a bit like 'damaged goods' because of the 'ambiguity' you and others referenced.
> 
> At the end of the day, it generally reminds me of 'Cane and Abel' though... I am also against 3DM requiring a change to have ECC switched from 'default'(!) to 'on', or 3DM manually removing results that are outside some sort of bell curve...the whole point is to get as efficient as possible at the highest clocks and push the envelope. Eliminating results that 3DM software already called 'valid' and with 'Systeminfo on' by hand not only affects some questionable subs, but also potentially runs the risk of killing off perfectly good ones by folks who worked hard and fair...then there is the question of NVidia and AMD driver updates and how that shifts what was already an arbitrary manual intervention.
> 
> *Instead, 3DM* should com up with a new version that detects artifacts - I already posted some ancient GPU software applet (ATI) that still tests for and detects artefacts, and clearly, 3DM could add something similar perhaps even with an updated Systeminfo. Speaking of Systeminfo, that stays active even if you close 3DM altogether - not really acceptable in good coding land...I usually close it by hand (ctr-alt-dlt / task manager).


Why not just call it what it is: ATITool.


----------



## newls1

Random question about our cards..... Given that the same memory is used on the 3090 Ti and our 4090s.... why does the memory OC WAY WAY HIGHER on the 4090? My 3090 Ti absolutely can NOT go over +800 without instant game crashes... My co-worker's 3090 Ti can NOT go over +900 without instant game lockups and glitches.... 4090s, though... +1200 and higher is the norm... WHY and HOW? It is identical memory, and I understand it's the luck of the draw with OCing, but literally it seems everyone can clock their memory to the moon on these cards. Did NVIDIA raise the voltage allowed for the memory, maybe?


----------



## Panchovix

The newer drivers definitely helped with some scores (DX12), though I saw a small decrease in Fire Strike (DX11). Applied the common "optimizations" (forcing ReBAR, NVIDIA Control Panel settings set to performance, etc.). Also, I can't even trigger the VRAM bug, since mine is so bad that it crashes instead of artifacting at +1175 or more. 

These runs are 3090MHz on TS, 3105MHz on SWay and 3015MHz on PR (the former two have lower effective clocks; on PR that 3015MHz is effective) and +1130 VRAM.
SWay and PR were on W11, TS on W10









I scored 28 076 in Port Royal (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11) - www.3dmark.com

I scored 29 534 in Time Spy (40124 Graphics score)

I scored 10 752 in Speed Way (AMD Ryzen 7 5800X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11) - www.3dmark.com

Spoiler: Pics


----------



## Nizzen

Miguelios said:


> Consistency is Key 😉
> 
> View attachment 2582219


Consistently bugged 
Too bad it's almost impossible to know which score is correct. Disable the HOF for Port Royal...


----------



## Gking62

For Strix users: this update tool V2. Has anyone used it yet?


----------



## GQNerd

Nizzen said:


> Consistent bugged
> Too bad it's almost impossible to know what score that is correct. Disable HOF for port royal...


Ha..ha..

No artifacts here, but unfortunately there's no way to know if the memory is messing up the math in the background..

However, I've even gone as far as to start recording my runs: 




Best we can do for now... 

Also, my #'s are high Across ALL OF THE BENCHES not just PR, so...


----------



## newls1

Looking for a waterblock for my Gigabyte card... I see Phanteks has a block ready for purchase.. How are they compared to EK? Should I get the Phanteks or wait a month for the EK?


----------



## J7SC

Zero989 said:


> Why not just call what it is, atitool


...  already did > here in this thread, and various other threads, ie. 3090


----------



## mirkendargen

newls1 said:


> all these people posting 29000+ runs in PR have to be doing something fishy. I spent last night just starting my first OC attemps on my new 4090 that came in and started just increasing memory in +150 increments at a time. So many many runs later i gain about 200ish points per +150 I go up in memory. Its amazing how same specd system as me, same everything BUT they are running +1700 on mem (or something crazy like that) gets them a score of 30,000+ COME ON..... No way. As soon as I break a legit 28000 in PR, my OC attemps will be satisfied as card is almost TOO FAST! (never thought id ever say that)


With the new driver, I can consistently get 28950-28975 legit scores, and I can only clock my memory to ~+1400-1425. +1450-1500 = 30k+ artifact runs. There are people that can legit clock their memory to +1500-+1600, maybe even slightly higher, so I'd believe low 29k scores with the new driver on exceptionally good cards.


----------



## KingEngineRevUp

Miguelios said:


> Ha..ha..
> 
> No artifacts here, but unfortunately there's no way to know if Memory is messing up the math in the background..
> 
> However, I've even gone as far as to start recording my runs:
> 
> 
> 
> 
> Best we can do for now...
> 
> Also, my #'s are high Across ALL OF THE BENCHES not just PR, so...


Don't record your runs. Just do 3 runs in a row and show them. If the stdev is high, then you had a glitchy run.


----------



## GQNerd

mirkendargen said:


> With the new driver, I can consistently get 28950-28975 legit scores, and I can only clock my memory to ~+1400-1425. +1450-1500 = 30k+ artifact runs. There are people that can legit clock their memory to +1500-+1600, maybe even slightly higher, so I'd believe low 29k scores with the new driver on exceptionally good cards.


I've seen folks report better scores on the new driver, but mine stayed the same.. I even DDU'ed and reinstalled drivers, still same perf. 28.8 all-day-long, but would've been nice to get that extra ~200pts



KingEngineRevUp said:


> Don't record your runs. Just do 3 runs in a row and show them. If the STDV is high then you had a glitchy run.



Already done that.. those scores I posted are after multiple runs.

- First I found my memory sweet spot, which is +1600 for me.. I can bench at +1800, but it's inconsistent or crashes. (+1700 still scales, but I dialed it back for stability/daily use.)

- My card boosts to 2820MHz stock (no voltage or PL sliders, or offsets)... If I slide PL to max and voltage to 1100mV, it hits 2910MHz with no offsets..

- Next, I applied +15MHz to the core on each run until I reached instability, which for my card is 3090MHz. Then I backed off 15MHz at a time until it was stable and consistent, arriving at 3060MHz.

One thing to note is that my card's requested clocks and effective clocks stay VERY close, comparatively speaking. Some people are saying "Oh wow, my card runs at 3120-3150MHz," but I see their effective clocks are closer to 3000-3030MHz LMAO... Mine does 3060MHz, and the effective clock is 3030MHz. (The lower I go, the closer they are; the higher I go, the bigger the delta.)
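That walk-up-then-back-off procedure can be sketched as a small search routine. `is_stable` here is a hypothetical stand-in for actually benchmarking at a given offset and checking for crashes/artifacts:

```python
def find_stable_offset(is_stable, step=15, start=0, limit=600):
    """Walk the core offset up in `step` MHz increments until a run
    fails, then back off one step at a time until runs pass again."""
    offset = start
    # Climb while the next step up still passes.
    while offset < limit and is_stable(offset + step):
        offset += step
    # Back off if a re-test at the current offset fails
    # (handles a marginal run that passed once by luck).
    while offset > start and not is_stable(offset):
        offset -= step
    return offset

# Toy example: pretend the card is stable up to exactly +255MHz.
print(find_stable_offset(lambda mhz: mhz <= 255))  # 255
```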

I give up on the PR stuff until 3DMark figures it out.. But again, my numbers are high in whatever benchmark I throw at this card.
Not bad for a "crappy AIO" card and a single-rad custom-loop CPU without exotic cooling.


----------



## PhuCCo

newls1 said:


> Looking for a waterblock for my gigabyte card... I see phanteks has a block ready for purchase.. How are they compared to EK? Should I get the phanteks or wait 1month for the EK


Look at Bykski. It's half the price of EK and Phanteks. A few people in here have already put them on their Gigabyte cards and the temperatures are very good.


----------



## yzonker

J7SC said:


> I've been running Port Royal ever since it came out, and I really like it - not least as it is somewhat less sensitive to CPU (though it likes fast+tight system RAM), which suits me with my work+play combos which are often HEDT. Now though, Port Royal seems a bit like 'damaged goods' because of the 'ambiguity' you and others referenced.
> 
> At the end of the day, it generally reminds me of 'Cain and Abel' though... I am also against 3DM requiring a change to have ECC switched from 'default'(!) to 'on', or 3DM manually removing results that are outside some sort of bell curve...the whole point is to get as efficient as possible at the highest clocks and push the envelope. Eliminating results that 3DM software already called 'valid' and with 'Systeminfo on' by hand not only affects some questionable subs, but also potentially runs the risk of killing off perfectly good ones by folks who worked hard and fair...then there is the question of NVIDIA and AMD driver updates and how that shifts what was already an arbitrary manual intervention.
> 
> *Instead, 3DM* should come up with a new version that detects artifacts - I already posted some ancient GPU software applet (ATI) that still tests for and detects artifacts, and clearly, 3DM could add something similar, perhaps even with an updated Systeminfo. Speaking of Systeminfo, it stays active even if you close 3DM altogether - not really acceptable in good coding land... I usually close it by hand (ctrl-alt-del / task manager).


I agree a code change would be preferred. No doubt. The issue there might be it would incur a performance penalty. This would hit every card, not just the 4090 and cause anyone with older cards to not be able to match their old scores. I'm sure they consider that when making any coding changes. If it could be done with a minimal impact on scores, then it would definitely be the best solution. Otherwise doing something 40 series specific would make more sense probably, even if it kinda sucks for us.


----------



## Panchovix

Miguelios said:


> I've seen folks report better scores on the new driver, but mine stayed the same.. I even DDU'ed and reinstalled drivers, still same perf. 28.8 all-day-long, but would've been nice to get that extra ~200pts


I think it maybe does help when there's some CPU bottleneck? I mean, the 13900K is way ahead of my 5800X for example (and also my RAM is pretty mediocre, but don't judge me, I'm waiting for the 7800X3D lol), but maybe the new driver has less overhead and is thus helping slower CPUs like mine.

Though I'm not sure either, because I've seen reports of people with a 13900K getting better scores after the new driver, so welp, it's a mystery now it seems lol


----------



## yzonker

mirkendargen said:


> With the new driver, I can consistently get 28950-28975 legit scores, and I can only clock my memory to ~+1400-1425. +1450-1500 = 30k+ artifact runs. There are people that can legit clock their memory to +1500-+1600, maybe even slightly higher, so I'd believe low 29k scores with the new driver on exceptionally good cards.


That's my conclusion as well now that I've got things working a bit better. Comparing clocks, the other scores look in line with mine. I'm not doing anything special other than the usual tweaks; pretty much all of them are covered in that series of screenshots someone posted. And of course I'm using a chiller, which is not nearly as effective as it was with the 30 series, but still adds 200 pts or so. It would be more if the cold temps didn't somewhat nerf my mem OC. I had to _only_ use +1700 on the 29000pt run I posted (down from the 1800-1900 the card could do on the air cooler). LOL My core is very average, but I'm getting some help from the good mem.


----------



## yzonker

delete


----------



## KingEngineRevUp

Miguelios said:


> I've seen folks report better scores on the new driver, but mine stayed the same.. I even DDU'ed and reinstalled drivers, still same perf. 28.8 all-day-long, but would've been nice to get that extra ~200pts
> 
> 
> 
> 
> Already done that.. those scores I posted are after multiple runs.
> 
> - First I found my Mem sweetspot, which is +1600 for me.. I can bench at +1800, but it's inconsistent or crashes. (+1700 still scales, but I dialed-it back for stability/daily-use)
> 
> - My card boosts to 2820mhz stock (no voltage or PL sliders, or offsets)... I slide to max PL and voltage at 1100mv it hits 2910mhz with no offsets..
> 
> - Next, I applied +15mhz to core on each run until I reached instability, which for my card is 3090mhz. So I removed -15mhz until it was stable and consistent, I arrived at 3060mhz.
> 
> One thing to note, is my card's requested clocks and effective clocks stay VERY close comparatively speaking. Some people are saying "Oh wow, my card runs at 3120-3150mhz", but I see their effective clocks are closer to 3000-3030mhz LMAO... Mine does 3060mhz, and the Effective clock is 3030mhz. (lower I go, closer they are.. higher I go, bigger delta)
> 
> I give up on the PR stuff until 3dmark figures it out.. But again, my #'s are high in wtv Benchmark I throw at this card.
> Not bad for a "crappy AIO" card, and single rad custom loop CPU without exotic cooling".


Then your runs look legit. I don't think you need to record them.


----------



## newls1

PhuCCo said:


> Look at Bykski. It's half the price of EK and Phanteks. A few people in here have already put them on their Gigabyte cards and the temperatures are very good.


no thank you...had a few of their blocks and never again


----------



## WayWayUp

PhuCCo said:


> I'm testing a Gigabyte Gaming OC with the Port Royal stress tests (20 loops) and I'm able to get +1900 on the memory so far with zero artifacts. And the highest Port Royal score I've been able to get is 27,771. My score seems pretty low compared to some that I've seen. I'm testing with a 10900KF system so it's with pcie 3.0 and I don't have rebar on.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 27 771 in Port Royal
> 
> Intel Core i9-10900KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
> 
> www.3dmark.com
> I did get like 27,850 with the memory at +2000, but I got some pretty bad graphical glitches on the ship at the very end of the benchmark so I deleted it. I haven't tested between + 1900-2000 yet.
> 
> I'm noticing that my card is doing a bit of clock stretching, with the reported core frequency being 3030-3045 and my actual effective clock is near 3000. I have my card locked to 1075mV as I was testing the stable frequencies per voltage point. I know I can go up to 1.1V. Am I just starving my card of the extra 25mV and it's bringing the clocks down? The core on this one doesn't seem very good.
> 
> View attachment 2582220
> View attachment 2582221


I just watched der8auer's 4090 LN2 video and he's having an issue with PCIe bandwidth. His system is running at x8/x8 and that's causing him to score way lower in Port Royal.

With a clock frequency of 3,210 MHz he is in 40th place.....

Safe to say if you're using a 10900K you're severely bandwidth limited. And if you have an M.2 SSD installed it's even worse.


----------



## PhuCCo

newls1 said:


> no thank you...had a few of their blocks and never again


That sucks that you had a bad experience. These new ones at least come with usable thermal pads now, and they're all the same size.

Different PCB, but my FE EK block sucks. Another guy in here has the EK FE block with the active backplate and his temps aren't much better. The actual cooling performance for the GPU core doesn't seem very good. I did literally everything with mine (liquid metal, different pad thicknesses, etc.) and the temperature deltas still suck compared to other designs. They must have sacrificed cooling potential for lower flow restriction or something. The flow paths of the coldplate between the FE and Gigabyte are pretty much identical, so I don't see it doing much better on a Gigabyte card.


----------



## GQNerd

WayWayUp said:


> I just watched Der8auer's 4090 Ln2 video and he's having an issue with pcie bandwidth. his system is running at x8/x8 and thats causing him to score way lower in port royal.
> 
> with a clock frequency of 3,210 MHz he is in 40th place.....
> 
> safe to say if your using a 10900k your severally bandwidth limited. And if you have an m.2 ssd installed its even worse


Very Interesting.. On that note, have you all tried the 3dmark PCI Express Test?

For people with a similar setup (z690 + 13900k) what are you all seeing for bandwidth?

I'm only getting 24.2GB/s.. (card slotted and/or riser, doesn't make a difference... also tried moving my nvme drive, still nothing).

Is the only way to reach higher bandwidth by using a SATA drive?
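For context, the theoretical one-way ceilings fall out of the per-lane signaling rate and the 128b/130b line encoding, so ~24GB/s measured over a Gen4 x16 link is plausibly just test overhead rather than a misconfiguration:

```python
def pcie_bandwidth_gbs(gt_per_s, lanes):
    """Theoretical one-way PCIe bandwidth in GB/s.
    Gen3/4/5 use 128b/130b encoding, so usable payload is 128/130
    of the raw signaling rate (GT/s ~= Gbit/s per lane)."""
    return gt_per_s * lanes * (128 / 130) / 8

print(round(pcie_bandwidth_gbs(16, 16), 1))  # Gen4 x16: 31.5
print(round(pcie_bandwidth_gbs(8, 16), 1))   # Gen3 x16 (= Gen4 x8): 15.8
```

So a Gen4 x16 run reporting ~24GB/s is well below the ~31.5GB/s ceiling but comfortably above anything an x8 link could deliver.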


----------



## ZealotKi11er

newls1 said:


> Random question about our cards..... Being that the same memory was used on the 3090Ti and our 4090s.... why is it the memory OCs WAY WAY HIGHER on the 4090? My 3090Ti absolutely can NOT go over +800 or instant game crashes... My co-worker's 3090Ti can NOT go over +900 or instant game lockups and glitches.... 4090s though... +1200 and higher is the norm... WHY and HOW? It is identical memory, and I understand it's the luck of the draw with OC'ing, but literally it seems everyone can clock their memory to the moon on these cards. Did Nvidia raise the voltage allowed for the memory maybe?


The memory controller was probably designed for 24Gbps+.


----------



## alasdairvfr

dk_mic said:


> A little observation about coil whine:
> 
> My Gaming X Trio has some notable coil whine, way worse than my old, waterblocked 2080 Ti Gaming X Trio. I am quite sensitive about it, but it's no big issue for me.
> I have installed a new PSU. While doing that, I pulled the card and tightened all the screws on the backplate; some of them could be tightened further quite easily.
> 
> So either the new PSU or the tightening has considerably reduced the coil whine. Thought I'd share..


My money is on the PSU, though tightening the block maybe helped. Nothing like a few heat cycles to limber up the TIM and pads.
I was talking to another user a week or two ago and he had awful coil whine that was mostly resolved with a PSU upgrade. I wish more ATX 3.0 PSUs were available.


----------



## GRABibus

newls1 said:


> all these people posting 29000+ runs in PR have to be doing something fishy. I spent last night just starting my first OC attempts on my new 4090 that came in and started just increasing memory in +150 increments at a time. So many runs later, I gain about 200ish points per +150 I go up in memory. It's amazing how a same-spec'd system as mine, same everything, BUT they are running +1700 on mem (or something crazy like that) gets them a score of 30,000+ COME ON..... No way. As soon as I break a legit 28000 in PR, my OC attempts will be satisfied, as the card is almost TOO FAST! (never thought I'd ever say that)


Nothing special beyond 29k.









[Official] NVIDIA RTX 4090 Owner's Club

www.overclock.net

+240MHz on core
+1600MHz on memory (no artifacts, and as it doesn't scale anymore past +1700MHz, I remain at +1600MHz).
Driver set to prefer maximum performance.
ReBAR forced with NVIDIA Inspector for the 3DMark Port Royal DLSS exe.

silicon lottery…?


----------



## Gking62

Miguelios said:


> Very Interesting.. On that note, have you all tried the 3dmark PCI Express Test?
> 
> For people with a similar setup (z690 + 13900k) what are you all seeing for bandwidth?
> 
> I'm only getting 24.2GB/s.. (card slotted and/or riser, doesn't make a difference... also tried moving my nvme drive, still nothing).
> 
> Is the only way to reach higher bandwidth by using a SATA drive?


24.14 GB/s on a LINKUP Riser Extreme PCIe 4.0, no card overclocking yet, V2 BIOS, no SSD in m.2_1 atm, but I will have one, and from my experience the difference from having an SSD in the PCIe 5.0 slot is so negligible it's not worth talking about.

Of note, my 2 WD SN850s are in m.2_2 & 3.


----------



## mirkendargen

Miguelios said:


> Very Interesting.. On that note, have you all tried the 3dmark PCI Express Test?
> 
> For people with a similar setup (z690 + 13900k) what are you all seeing for bandwidth?
> 
> I'm only getting 24.2GB/s.. (card slotted and/or riser, doesn't make a difference... also tried moving my nvme drive, still nothing).
> 
> Is the only way to reach higher bandwidth by using a SATA drive?


I get ~27GB/s with a riser cable (also tested without previously and no difference) on X670E all lanes in use. I guess AMD might do PCIE bandwidth a bit better or something.


----------



## newls1

Just ordered this Phanteks waterblock and damn I'm stoked!! Finally a waterblock available without waiting 8 weeks for availability!! Never owned a Phanteks watercooling product, so I'm hoping its performance will be as good as EK's would have been if I'd decided to wait another month for that block. I really like the inlet/outlet port locations on this block, it's very different. And thankfully the included backplate uses thermal pads for heat dissipation. Was seriously considering going the "cheap" route and trying my luck with Bykski again, but I had the WORST TIME playing the thermal pad game on my 3090 block from them and didn't want to reassemble a loop 453 times and waste all that time. Well, any feedback on Phanteks you may have, please let me know, and here's a pic of the block









Phanteks Glacier G4090 GIGABYTE for GIGABYTE AORUS MASTER / GAMING RTX 4090


Phanteks’ Glacier G4090 GIGABYTE Water Block provides a high-performance water-cooling solution custom-designed for the latest GIGABYTE AORUS MASTER / GAMING RTX 4090 cards. Designed using premium materials, the Glacier G4090 GIGABYTE Water Block features an aluminum cover, nickel-plated copper...



www.phanteks.store


----------



## mirkendargen

newls1 said:


> Just ordered this phanteks waterblock and damn it im stoked!! Finally a waterblock available with out waiting for 8 weeks for availability!! Never owned a phanteks watercooling product, so im hoping its performance will be as good as EK would have been if i decided to wait another month for that block. I really like the inlet/outlet port locations on this block, its very different. And thankfully the backplate included uses thermal pads for heat disapation. Was seriously considering going the "cheap" route and ttry my luck with bykski again, but I had the WORST TIME playing the thermal pad game on my 3090 block i had with them and didnt want to reassemble a loop 453 times and waste all that time. well, any feedback on phanteks you may have, please let me know and heres a pic of the block
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Phanteks Glacier G4090 GIGABYTE for GIGABYTE AORUS MASTER / GAMING RTX 4090
> 
> 
> Phanteks’ Glacier G4090 GIGABYTE Water Block provides a high-performance water-cooling solution custom-designed for the latest GIGABYTE AORUS MASTER / GAMING RTX 4090 cards. Designed using premium materials, the Glacier G4090 GIGABYTE Water Block features an aluminum cover, nickel-plated copper...
> 
> 
> 
> www.phanteks.store


I had a Phanteks 1080ti block awhile back. It was fine, nothing bad, nothing amazing. And I mean....nothing is stopping you from putting thermal pads where ever you want, including the backplate. You could coat the back of the PCB with pads and as long as the backplate screws go through it's not like the backplate needs some special design to "work" with them.


----------



## KingEngineRevUp

PhuCCo said:


> That sucks that you had a bad experience. These new ones at least come with usable thermal pads now, and they're all the same size.
> 
> Different PCB, but my FE EK block sucks. Another guy in here has the EK FE block with the active backplate and his temps aren't much better. The actual cooling performance for the gpu core doesn't seem very good. I did literally everything with mine and the temperature deltas still suck compared to other designs (liquid metal, different pad thicknesses, etc) They must have sacrificed cooling potential for lower flow restriction or something. The flow paths of the coldplate between the FE and Gigabyte are pretty much identical, so I don't see it doing much better on a Gigabyte card.


That's disheartening to hear. I tried to cancel my EK order, but then they shipped it, so I'm stuck with it. Did you try putting a washer on the 4 GPU screws to increase mounting pressure?


----------



## PhuCCo

KingEngineRevUp said:


> That's disheartening to hear. I tried to go cancel my EK order, but then they shipped it. So I'm stuck with it. Did you try putting a washer on the 4 GPU screws to increase mounting pressure?



Yeah I confirmed that mounting pressure wasn't the issue, I completely bottomed out the pcb to the block standoffs. I get about a 9C delta between core and core hotspot at 400-450W, and I can see the entire die outline in the paste spread including the writing on the die. The block just won't dissipate the heat very well. That's all with over 1GPM of flow with a 1260mm radiator.


----------



## ttnuagmada

noticed EK is selling active backplates for these. Any good reason for that? Thought all of the ram was on the top-side with these.

Also, what's the consensus on the best FE block out?


----------



## PhuCCo

WayWayUp said:


> I just watched Der8auer's 4090 Ln2 video and he's having an issue with pcie bandwidth. his system is running at x8/x8 and thats causing him to score way lower in port royal.
> 
> with a clock frequency of 3,210 MHz he is in 40th place.....
> 
> safe to say if your using a 10900k your severally bandwidth limited. And if you have an m.2 ssd installed its even worse


Thanks for the info, I figured as much. I guess I am either limited by pcie bandwidth, or Port Royal cares more about cpu than I thought. I did confirm that I'm running at full pcie 3.0 x16, so I guess I am essentially pcie 4.0 x8.


----------



## mirkendargen

ttnuagmada said:


> noticed EK is selling active backplates for these. Any good reason for that? Thought all of the ram was on the top-side with these.


The RAM is all on the front, it's so they can upsell to make more money. You can dissipate some heat on the back through the back of the RAM/core/power components/power connector....but not enough to warrant active cooling. I have all that connected to my backplate with pads, and a slow fan blowing across it makes it barely even warm. No heatsink, just the backplate with a slow fan. We're talking like 5W....maybe...


----------



## PhuCCo

ttnuagmada said:


> noticed EK is selling active backplates for these. Any good reason for that? Thought all of the ram was on the top-side with these.
> 
> Also, what's the consensus on the best FE block out?


The backplate still gets warm, but active backplates are absolutely unnecessary this gen since the 4090 doesn't have memory on the backside like the 3090. 

The only FE blocks out right now are the EK ones: the standard and active-backplate versions. I will say that the blocks look very nice.

I am sending my standard backplate FE block back through RMA for a refund because I am very unhappy with the performance. Also, my backplate was sent to me damaged. They package them pretty much fully assembled with a cardboard cutout of a pcb in between the block and backplate. Mine had the backplate screws torqued down so hard that the backplate is completely warped even after install. 

EK messed up the current FE blocks and are shipping 0.8mm pads to fix their machining oversight, and they told me they don't know when the proper block will start shipping. I would have liked to test the proper block.

It's not like the EK block is unusable, but it was clearly rushed. Alphacool and a ton of others are going to have FE blocks so there will be more options in the near future.


----------



## ttnuagmada

PhuCCo said:


> The backplate still gets warm, but active backplates are absolutely unnecessary this gen since the 4090 doesn't have memory on the backside like the 3090.
> 
> The only FE block out right now are the EK ones. The standard and active backplate versions. I will say that the blocks look very nice.
> 
> I am sending my standard backplate FE block back through RMA for a refund because I am very unhappy with the performance. Also, my backplate was sent to me damaged. They package them pretty much fully assembled with a cardboard cutout of a pcb in between the block and backplate. Mine had the backplate screws torqued down so hard that the backplate is completely warped even after install.
> 
> EK messed up the current FE blocks and are shipping 0.8mm pads to fix their machining oversight, and they told me they don't know when the proper block will start shipping. I would have liked to test the proper block.
> 
> It's not like the EK block is unusable, but it was clearly rushed. Alphacool and a ton of others are going to have FE blocks so there will be more options in the near future.


Guess I'll try my luck with the EK block. I've always seen people complain about their QC, but I guess I've gotten lucky, because I've had several different models over the last few years and they've all been problem-free (980 Ti x2, 1080 Ti x2, 3080 and 3090 w/ABP).

What's your opinion on the EK thermal pads? I went with Thermalright pads on my 3080/3090 and they've worked really well for an extended period (guess I might as well get some more).


----------



## dr/owned

ttnuagmada said:


> noticed EK is selling active backplates for these. Any good reason for that? Thought all of the ram was on the top-side with these.


There's little benefit to cooling-through-pcb. Like yeah maybe if you're running things to the ragged edge but the stability there is going to be awful anyways.


----------



## dr/owned

When did Phanteks get a hard on that they're making blocks out of gold?


----------



## newls1

ttnuagmada said:


> Guess i'll try my luck with the EK block. I've always seen people complain about their QC, but I guess I've gotten lucky, because I've had several different models over the last few years and they've all been problem free (980 Tix2, 1080 Tix2, 3080 and 3090 w/abp)
> 
> What's your opinion on the EK thermal pads? I went with thermalrights on my 3080/3090 and they've worked really well for an extended period (guess i might as well get some more).


The cooling performance I've had on every single GPU block from EK has been nothing short of awesome... you won't have any issues.


----------



## newls1

dr/owned said:


> View attachment 2582251
> 
> 
> When did Phanteks get a hard on that they're making blocks out of gold?


just ordered that actually (but for my gigabyte) about 1 hour ago... about same price as EK is charging....


----------



## KingEngineRevUp

PhuCCo said:


> Yeah I confirmed that mounting pressure wasn't the issue, I completely bottomed out the pcb to the block standoffs. I get about a 9C delta between core and core hotspot at 400-450W, and I can see the entire die outline in the paste spread including the writing on the die. The block just won't dissipate the heat very well. That's all with over 1GPM of flow with a 1260mm radiator.


****, well I'll let you know how mine turns out ... How much did it cost for you to get a refund? Do they charge a restocking fee?


----------



## PhuCCo

ttnuagmada said:


> Guess i'll try my luck with the EK block. I've always seen people complain about their QC, but I guess I've gotten lucky, because I've had several different models over the last few years and they've all been problem free (980 Tix2, 1080 Tix2, 3080 and 3090 w/abp)
> 
> What's your opinion on the EK thermal pads? I went with thermalrights on my 3080/3090 and they've worked really well for an extended period (guess i might as well get some more).


I've had zero issues with EK until this, so I understand what you're saying.

So the FE block requires 0.8mm pads for the memory, and the rest of the card on the front can use 1mm. The included instructions in the box and on the website say to use 1mm on the memory and 1/1.5mm on everything else, but you need to ignore those instructions for these early blocks. The FE block should come with a sheet of paper with the updated instructions for the thinner pads. 

The stock EK pads really don't seem too bad. I did try 1mm Gelid Extremes on the memory thinking that they would compress to make up for the extra 0.2mm, but my core temps and hotspot were much worse. The memory temps did drop about 10C though. The tolerances are that tight. I couldn't believe it. I also tried 0.5mm pads on the memory and the core temps stayed the same, but my memory temps increased 20C. You're stuck needing 0.8mm on the memory.


----------



## doubledoubt

Panchovix said:


> Gonna try this later on my ASUS, I'm willing to test any 600W on my TUF to _maybe _be able to increase the VRAM a bit, no hopes though (_sigh_)


Let us know how it goes!


----------



## PhuCCo

KingEngineRevUp said:


> ****, well I'll let you know how mine turns out ... How much did it cost for you to get a refund? Do they charge a restocking fee?


They told me to arrange a day for a driver to come pick up the block and ship it back to EK, so I wonder if there will be any restocking fee at all. I'll update if they end up taking a percentage off of the refund or something. 
I did send pictures of my bent backplate straight out of the box, and I recommend taking a ton of pictures of yours before disassembling it and using it just in case. I do wonder if all of them are shipped like this.

And yeah definitely post your results with the EK block once you get it. Everyone on here can benefit from the information. I could just have a lemon.
Be careful picking out the 0.8mm pads from the rest of the 1mm pads for the memory because they're not labelled.


----------



## newls1

newls1 said:


> Just ordered this phanteks waterblock and damn it im stoked!! Finally a waterblock available with out waiting for 8 weeks for availability!! Never owned a phanteks watercooling product, so im hoping its performance will be as good as EK would have been if i decided to wait another month for that block. I really like the inlet/outlet port locations on this block, its very different. And thankfully the backplate included uses thermal pads for heat disapation. Was seriously considering going the "cheap" route and ttry my luck with bykski again, but I had the WORST TIME playing the thermal pad game on my 3090 block i had with them and didnt want to reassemble a loop 453 times and waste all that time. well, any feedback on phanteks you may have, please let me know and heres a pic of the block
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Phanteks Glacier G4090 GIGABYTE for GIGABYTE AORUS MASTER / GAMING RTX 4090
> 
> 
> Phanteks’ Glacier G4090 GIGABYTE Water Block provides a high-performance water-cooling solution custom-designed for the latest GIGABYTE AORUS MASTER / GAMING RTX 4090 cards. Designed using premium materials, the Glacier G4090 GIGABYTE Water Block features an aluminum cover, nickel-plated copper...
> 
> 
> 
> www.phanteks.store


And a quick update...... GREAT service so far.... I just ordered this block about 1 hour ago, and they already emailed me with shipping details and a tracking number.... WOW!


----------



## Spiriva

newls1 said:


> just ordered that actually (but for my gigabyte) about 1 hour ago... about same price as EK is charging....


If you could write a line or two when you get it, what you think and such, that would be nice. It looks pretty clean from the pictures I've seen of it!

I ordered the "EK-Quantum Vector² Strix RTX 4090 D-RGB - Nickel + Acetal" from EK's webshop today. It said it would be in stock in two weeks. Still waiting for my CableMod cable and Apex Z790, so I guess a few more weeks waiting for a waterblock makes no diff


----------



## newls1

Spiriva said:


> If you could write a line or two when you got it, what you think and such would be nice. It looks pretty clean from the pictures ive seen on it!
> 
> I ordered the "EK-Quantum Vector² Strix RTX 4090 D-RGB - Nickel + Acetal" from EK´s webshop today. Said it would be in stock in two weeks. Still waiting for my cablemod cable and Apex z790 so i guess a few more weeks waiting for a waterblock makes no diff


sure will...... You will love the EK block for sure. Their block for my 3090Ti is AMAZING


----------



## smushroomed

What kind of DDR5 RAM should I get? I'm planning on a 13900K + Z790 mobo

I already have a 4090 and I game at 4k120hz


----------



## newls1

smushroomed said:


> What kind of dd5 ram should I get? I'm planning on a 13900k + z790 mobo
> 
> I already have a 4090 and I game at 4k120hz


Kinda off subject for this thread, don't you think?! There is a DDR5 thread in the Intel section. I will just quickly say that if I was in the market for another RAM kit, I'd buy a Team Group 7200/7600 A-die kit.


----------



## KingEngineRevUp

PhuCCo said:


> They told me to arrange a day for a driver to come pick up the block and ship it back to EK, so I wonder if there will be any restocking fee at all. I'll update if they end up taking a percentage off of the refund or something.
> I did send pictures of my bent backplate straight out of the box, and I recommend taking a ton of pictures of yours before disassembling it and using it just in case. I do wonder if all of them are shipped like this.
> 
> And yeah definitely post your results with the EK block once you get it. Everyone on here can benefit from the information. I could just have a lemon.
> Be careful picking out the 0.8mm pads from the rest of the 1mm pads for the memory because they're not labelled.


Thanks for the advice. Be sure to measure the heatsinks with a caliper.

Edit: I meant thermal pads.


----------



## BigMack70

OK so the new driver 526.86 is also causing me to crash to black screen occasionally and ramps GPU fans to 100%, even at stock settings. It didn't throw any errors in the windows event logs this last time. 

Any suggestions to troubleshoot/fix? I'd keep the 525.25 driver except then Modern Warfare 2 won't let me play. 

I used DDU to clean install 526.86. Would a Windows reset help anything?


----------



## mirkendargen

BigMack70 said:


> OK so the new driver 526.86 is also causing me to crash to black screen occasionally and ramps GPU fans to 100%, even at stock settings. It didn't throw any errors in the windows event logs this last time.
> 
> Any suggestions to troubleshoot/fix? I'd keep the 525.25 driver except then Modern Warfare 2 won't let me play.
> 
> I used DDU to clean install 526.86. Would a Windows reset help anything?


It's still causing my second monitor to lose signal too, particularly when turning the screen on from standby. I would blame my cable or something, but the 4090 release driver has never done it once.


----------



## Panchovix

I was checking some 4090 + 5800X3D benchmarks (to compare to my 5800X; spoiler: the 5800X3D seems to help in some benches like Speed Way) and found this








I scored 29 528 in Port Royal

AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11

www.3dmark.com

Did he get that score because his 3075 MHz was an effective clock? Also, 1441 MHz works out to +1028 MHz on VRAM if I'm not wrong. Or is the 5800X3D really that good for Port Royal? (The best score with a 5800X is ~28400.)
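For anyone checking that math, a quick sketch. Both inputs are my assumptions, not gospel: stock 4090 GDDR6X runs 21 Gbps (so the reported base clock is 21000/16 = 1312.5 MHz), and the Afterburner offset counts half of the effective data-rate delta.

```python
# Sketch: Afterburner memory offset implied by a reported VRAM clock.
# Assumptions (mine): stock effective rate is 21000 MT/s, and the
# reported GDDR6X clock is effective_rate / 16.
STOCK_MTS = 21000   # stock effective data rate, MT/s
CLOCK_MULT = 16     # reported MHz -> MT/s for GDDR6X

def ab_offset(reported_mhz: float) -> float:
    """Afterburner memory offset implied by a reported memory clock."""
    return (reported_mhz * CLOCK_MULT - STOCK_MTS) / 2

print(ab_offset(1441))  # 1441 MHz reported -> +1028.0
```

Under those assumptions the +1028 figure checks out exactly.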


----------



## GRABibus

Spiriva said:


> I ordered from Cablemod on the 15th of October; they sent it on November 3rd. I got a tracking number from DHL saying they are waiting for the package in Holland.
> The thing is that Cablemod sends their cables on a boat from China to Europe, which takes 3-4 weeks to reach Europe.
> 
> If you got a tracking number but nothing happens for weeks, then you know: your cable is on some boat towards America.


I got my Cablemod 2 weeks after ordering.
I also received the Corsair one today.

But after comparing the mounting of both, the Cablemod one seems much better, and I prefer its type of cables, which allows more flexibility.


----------



## lawson67

I don't believe G-Sync is working with my LG CX and the 4090. I was hoping the new drivers would sort it out, but they have not. I am using the Nvidia control panel to limit my games to 118fps, but I am still getting tearing and some stutters. Has anyone else noticed this?


----------



## yzonker

lawson67 said:


> I don't belive G-Sync is working with my LG CX and the 4090 i was hoping the new drivers would sort it out but they have not, i am using the Nvida control panel to limit my games to 118fps but i am still getting tearing and some stutters has anyone else noticed this?


No I haven't. Try 117 or a little less if you haven't.


----------



## Krzych04650

4.00x DSR 7680x3200 with Ultra RT and still 80 FPS, damn.









Most games I've tried are in the 60-80 range at this resolution. Just generally trying to bring various games from various time periods to pristine quality with SSAA/SGSSAA/DLDSR/DSR ends up at around 60-70 FPS on a 4090, so it is not quite there yet. But one more generation and everything will be in the 100s in every game, save for some of the most recent ones, as always.


----------



## doom3crazy

Zero989 said:


> Limited wattage compared to other BIOS.


are you talking about the suprim liquid x bios on the gaming trio 4090? or are you talking about the stock bios on the gaming trio thats capped around 480w?


----------



## mirkendargen

lawson67 said:


> I don't belive G-Sync is working with my LG CX and the 4090 i was hoping the new drivers would sort it out but they have not, i am using the Nvida control panel to limit my games to 118fps but i am still getting tearing and some stutters has anyone else noticed this?


G-sync is working fine on my LG CX.


----------



## doom3crazy

I NEED SOME HELP FROM YOU WONDERFUL SMARTY PANTS! 

I am pretty smart generally when it comes to pc's tech etc but....... I am having the strangest thing going on with a game and my 4090 and what I can't figure out is if it's just the game or the 4090 or what but........Let me start by saying I have a gaming trio 4090, 5900x(all core at 4.6ghz) 32gb of g skill ram blah blah. And a Samsung 4k 60 panel(yes I know to unlock full power I should get a 144hz panel)

So I loaded up Crysis 1 remastered. First I tried running uncapped, maxed out everything, no DLSS, and the experimental ray trace feature on. Seems I was holding steady around 70-80ish fps. In my head I was like "okay, I should have no problem running 4k 60 locked." So I turned on G-Sync, locked it to my panel's refresh rate and loaded the game back up. Right away I noticed my core clocks were only boosting to like 1700MHz at best and there were points where I was dipping down into the 50's on my framerate. So then I set my settings in the control panel to the "fast" setting, and then used RivaTuner to lock it at 60. With this setting, I was locked at 60 and I could see the clock boosting to 2800MHz (since it was still pulling power for all those extra frames) and I was pegged at 60 and it was a generally smooth experience. What's going on here? Do I need to set Crysis as an application-specific thing and change power to "prefer maximum performance"? Is Crysis Remastered still a poorly optimized game?

I don't have any of these issues with Doom eternal, god of war, fortnite etc. Any of the other games I play. I can turn on g sync, sync'd up to my monitors refresh rate of 60 and my core clocks show anywhere from 2200-2800mhz at any given time and it stays locked and smooth at 60. It's not using 100% on the gpu power of course cause it's limited to 60fps but I can't understand why crysis is not giving me a smooth experience with 4k60 g sync locked and the core clocks being so low. Clearly it's capable as uncapped it was holding around 70-80fps at any given moment.


----------



## Zero989

doom3crazy said:


> are you talking about the suprim liquid x bios on the gaming trio 4090? or are you talking about the stock bios on the gaming trio thats capped around 480w?


Talking about the 600W Suprim Liquid X BIOS on a Suprim Liquid X, which is limited to 560W, whereas the Strix OC does 570W. I believe the Trio 4090 will show higher consumption due to a poorer VRM.


----------



## doom3crazy

Zero989 said:


> talking about the 600W suprim liquid x bios on a suprim liquid x, which is limited to 560w, whereas strixoc does 570w. I believe trio 4090 will show high consumption due to poorer vrm.


ohhh gotcha. would you still recommend the liquid x bios for the gaming trio card? or would the strix oc bios be better to flash it with?


----------



## Zero989

doom3crazy said:


> ohhh gotcha. would you still recommend the liquid x bios for the gaming trio card? or would the strix oc bios be better to flash it with?


Well, I would be speaking not from experience with a Trio, but I find the Strix OC BIOS better for Port Royal and the Liquid X BIOS for Speed Way.... -_-


----------



## lawson67

mirkendargen said:


> G-sync is working fine on my LG CX.


What frame limit are you setting for your games via the control panel for your CX? I've set 118; maybe I should try lower?


----------



## mirkendargen

lawson67 said:


> What frames are you limiting your games too via control panel for your CX?, ive set 118 maybe i should try lower?


Everything in NVCP:
G-sync on
Frame limiter 118fps
V-sync on.

You can also mash the green button on the remote 6x to see the current VRR status on the TV. Make sure instant game response is enabled on the TV.


----------



## lawson67

mirkendargen said:


> Everything in NVCP:
> G-sync on
> Frame limiter 118fps
> V-sync on.
> 
> You can also mash the green button on the remote 6x to see the current VRR status on the TV. Make sure instant game response is enabled on the TV.


I didn't think you needed V-sync on if you're using G-Sync.


----------



## doom3crazy

doom3crazy said:


> I NEED SOME HELP FROM YOU WONDERFUL SMARTY PANTS!
> 
> I am pretty smart generally when it comes to pc's tech etc but....... I am having the strangest thing going on with a game and my 4090 and what I can't figure out is if it's just the game or the 4090 or what but........Let me start by saying I have a gaming trio 4090, 5900x(all core at 4.6ghz) 32gb of g skill ram blah blah. And a Samsung 4k 60 panel(yes I know to unlock full power I should get a 144hz panel)
> 
> So I loaded up Crysis 1 remastered. First I tried running uncapped maxed out everything. No dlss and the experimental ray trace feature on. Seems I was holding steady around 70-80ish fps. In my head I was like "okay I should have no problem running 4k 60 locked. " So I turned on gsync, locked it to my panels refresh rate and loaded the game back up. Right away I noticed my core clocks and such were only boosting to like 1700mhz at best and there was points where I was dipping down into the 50's on my framerate. So then I set my settings in the control panel to the "fast" setting, and then used Rive tuner to lock it at 60. With this setting, I was locked at 60 and I could see the clock boosting to 2800mhz(since it was still pulling power and drawing for all those extra frames) and I was pegged at 60 and it was a generally smooth experience. Whats going on here? Do I need to set crysis as an application specific thing and change power to "prefer maximum performance"? Is crysis remastered still a poorly optimized game?
> 
> I don't have any of these issues with Doom eternal, god of war, fortnite etc. Any of the other games I play. I can turn on g sync, sync'd up to my monitors refresh rate of 60 and my core clocks show anywhere from 2200-2800mhz at any given time and it stays locked and smooth at 60. It's not using 100% on the gpu power of course cause it's limited to 60fps but I can't understand why crysis is not giving me a smooth experience with 4k60 g sync locked and the core clocks being so low. Clearly it's capable as uncapped it was holding around 70-80fps at any given moment.





Zero989 said:


> Well, I would be speaking not from experience of a trio, but I find the strix OC bios better for portal royal and the liquid x bios for speedway.... -_-


Haha okay noted. You seem pretty knowledgeable. You think I could trouble you with this above post I made? ^^ I can't seem to figure it out.


----------



## mirkendargen

lawson67 said:


> i didn't think you needed V-sync on if your using G-Sync there's no need for V-sync


I've always followed this ¯\_(ツ)_/¯








G-SYNC 101: Optimal G-SYNC Settings & Conclusion | Blur Busters

Many G-SYNC input lag tests & graphs -- on a 240Hz eSports monitor -- via 2 months of testing via high speed video!

blurbusters.com


----------



## Zero989

doom3crazy said:


> Haha okay noted. You seem pretty knowledgeable. You think I could trouble you with this above post I made? ^^ I can't seem to figure it out.


Ye I'll help u tmr


----------



## Sheyster

mirkendargen said:


> I've always followed this ¯\_(ツ)_/¯
> 
> 
> 
> 
> 
> 
> 
> 
> G-SYNC 101: Optimal G-SYNC Settings & Conclusion | Blur Busters
> 
> 
> Many G-SYNC input lag tests & graphs -- on a 240Hz eSports monitor -- via 2 months of testing via high speed video!
> 
> 
> 
> 
> blurbusters.com


Yep, this is G-Sync gospel, two things to note:

1. Always cap the FPS via the game engine if possible.
2. Use V-Sync = ON, not V-Sync = FAST in NVCP.


----------



## doom3crazy

Zero989 said:


> Ye I'll help u tmr


awesome thanks so much


----------



## Krzych04650

doom3crazy said:


> I NEED SOME HELP FROM YOU WONDERFUL SMARTY PANTS!
> 
> I am pretty smart generally when it comes to pc's tech etc but....... I am having the strangest thing going on with a game and my 4090 and what I can't figure out is if it's just the game or the 4090 or what but........Let me start by saying I have a gaming trio 4090, 5900x(all core at 4.6ghz) 32gb of g skill ram blah blah. And a Samsung 4k 60 panel(yes I know to unlock full power I should get a 144hz panel)
> 
> So I loaded up Crysis 1 remastered. First I tried running uncapped maxed out everything. No dlss and the experimental ray trace feature on. Seems I was holding steady around 70-80ish fps. In my head I was like "okay I should have no problem running 4k 60 locked. " So I turned on gsync, locked it to my panels refresh rate and loaded the game back up. Right away I noticed my core clocks and such were only boosting to like 1700mhz at best and there was points where I was dipping down into the 50's on my framerate. So then I set my settings in the control panel to the "fast" setting, and then used Rive tuner to lock it at 60. With this setting, I was locked at 60 and I could see the clock boosting to 2800mhz(since it was still pulling power and drawing for all those extra frames) and I was pegged at 60 and it was a generally smooth experience. Whats going on here? Do I need to set crysis as an application specific thing and change power to "prefer maximum performance"? Is crysis remastered still a poorly optimized game?
> 
> I don't have any of these issues with Doom eternal, god of war, fortnite etc. Any of the other games I play. I can turn on g sync, sync'd up to my monitors refresh rate of 60 and my core clocks show anywhere from 2200-2800mhz at any given time and it stays locked and smooth at 60. It's not using 100% on the gpu power of course cause it's limited to 60fps but I can't understand why crysis is not giving me a smooth experience with 4k60 g sync locked and the core clocks being so low. Clearly it's capable as uncapped it was holding around 70-80fps at any given moment.


1700MHz for a 4090 sounds absolutely bizarre. I remember having a similar problem on a 2080 Ti in AC: Origins: it would run full speed if unlocked, but once locked the card would downclock way too much and performance was all over the place. It turned out the GPU was downclocking with a power saving feature; using "Prefer maximum performance" fixed it. There were also cases of the GPU locking to the baseline 3D clock of 1350MHz if the load was low enough.

Some games can also have messed-up V-sync or frame limiters that undershoot a lot, so I generally use NVCP V-sync and an RTSS lock for 100% reliability. Although in your case that would not cause those bizarre low clocks; it does look like some kind of power saving.
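One way to confirm it's power management rather than a power or thermal limit (just a sketch; these are standard `nvidia-smi` query fields, but double-check them against your driver version) is to poll the P-state and clocks while the game is running:

```shell
# Poll performance state, clocks and power draw once per second.
# P0 = full 3D clocks; P5/P8 = power-saving states.
nvidia-smi --query-gpu=pstate,clocks.sm,clocks.mem,power.draw \
           --format=csv -l 1
```

If the card sits in a low P-state while the capped game runs, that points at power saving, not a limiter.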


----------



## doom3crazy

Krzych04650 said:


> 1700MHz for 4090 sounds absolutely bizarre. I remember having similar problem on 2080 Ti in AC:Origins, it would run full speed if unlocked but once locked the card would downclock way too much and performance was all over the place. Turned out that the GPU was downclocking with power saving feature, using 'Prefer maximum performance" fixed it. There were also cases of the GPU locking to baseline 3D clock of 1350MHz if the load was low enough.
> 
> Some games can also have mess up V-syncs or frame limiters that undershoot a lot, so I generally use NVCP V-sync and RTSS lock for 100% reliability, although in your case that would not cause those bizarre low clocks, it does look like some kind of power saving.


I just can't figure out why, so far at least, it's only Crysis that does it. I suppose I will try an application-specific setting for Crysis, set it to "prefer max performance" and see if it helps? I am certainly open to any suggestions. Currently I am on the 522.30 studio driver (I do a lot of Premiere Pro studio work), but I've also tried the latest game driver and the issue remains.


----------



## ZealotKi11er

GRABibus said:


> Nothing special beyond 29k.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> [Official] NVIDIA RTX 4090 Owner's Club
> www.overclock.net
> 
> +240MHz on core
> +1600MHz on memory (no artifacts and as I don’t scale anymore as of 1700MHz, I remain at +1600MHz).
> Drivers on high performances
> Rebar forced with nvidia inspector in 3DMark Port Royal DLSS exe.
> 
> silicon lottery…?


Can you show how to enabled Re-Bar? I tried following some older guide but score did not move at all so my guess is I did not enable it correctly.


----------



## MrTOOSHORT

ZealotKi11er said:


> Can you show how to enabled Re-Bar? I tried following some older guide but score did not move at all so my guess is I did not enable it correctly.


I used this video, it helped:


----------



## doom3crazy

Zero989 said:


> Ye I'll help u tmr





Krzych04650 said:


> 1700MHz for 4090 sounds absolutely bizarre. I remember having similar problem on 2080 Ti in AC:Origins, it would run full speed if unlocked but once locked the card would downclock way too much and performance was all over the place. Turned out that the GPU was downclocking with power saving feature, using 'Prefer maximum performance" fixed it. There were also cases of the GPU locking to baseline 3D clock of 1350MHz if the load was low enough.
> 
> Some games can also have mess up V-syncs or frame limiters that undershoot a lot, so I generally use NVCP V-sync and RTSS lock for 100% reliability, although in your case that would not cause those bizarre low clocks, it does look like some kind of power saving.


Just a little update. So.... I changed Crysis in the NVCP to an application-specific "prefer max performance" and that seemed to fix the issue. Should I just set the global setting in the NVCP to "prefer max performance"? On the 3080 Ti I had, when I did that, it would just go full bore: full clock speed while sitting in Windows doing nothing, basically generating unnecessary heat.

EDIT: Which it appears to be doing on the 40 series cards as well. That's so annoying. Is there any way to fix that? Like when I'm just browsing and reading internet articles and YouTube etc., I don't need my card running at 2600MHz lol. Do I just have to set prefer max performance individually for each game?


----------



## alasdairvfr

doom3crazy said:


> Just a little update. So.... I changed crysis in the NVCP to specific set for "prefer max performance" and that seemed to fix the issue. Should I just set the main setting in the nvcp to "prefer max performance"? On the 3080 ti I had, when I did that, it would just be going full bore, full clock speed just sitting in windows and just sitting doing nothing the card was running at full speed and basically generating unnecessary heat.
> 
> EDIT: Which it appears to be doing on the 40 series cards as well. Thats so annoying. Is there anyway to fix that? Like when im just browsing and reading internet articles and youtube etc I don't need my card running at 2600mhz lol. Do I just have to set the prefer max performance individually for each game?


I have set Max Performance globally since it resolved blanking issues on my 4k 144hz monitor back on my 3080. That card, plus a 3080 Ti and the 4090, would all downclock and run sub-100w idle on the desktop. I'm not sure why some people see full-tilt GPU power draw; maybe the Windows power plan is the issue? Even with a clean driver install plus that setting, it downclocks no problem.


----------



## Krzych04650

doom3crazy said:


> Just a little update. So.... I changed crysis in the NVCP to specific set for "prefer max performance" and that seemed to fix the issue. Should I just set the main setting in the nvcp to "prefer max performance"? On the 3080 ti I had, when I did that, it would just be going full bore, full clock speed just sitting in windows and just sitting doing nothing the card was running at full speed and basically generating unnecessary heat.
> 
> EDIT: Which it appears to be doing on the 40 series cards as well. Thats so annoying. Is there anyway to fix that? Like when im just browsing and reading internet articles and youtube etc I don't need my card running at 2600mhz lol. Do I just have to set the prefer max performance individually for each game?


Yeah, just as I thought. It does not happen often, but there are going to be games where power saving causes problems.

Setting max performance globally is going to lock the GPU into 3D clocks at all times; in practice this means it will idle at around 55-60W instead of 20-25W, as has been the case for many generations now. It's up to you to decide whether having to remember to set max performance for every game individually, to save ~30W of idle power consumption, is worth the trouble.

You can theoretically create an MSI Afterburner profile that forces those idle clocks anyway by locking the VF curve to the lowest point, setting that as the default 2D profile, and having Afterburner launch on startup. But this will really lock it and prevent the GPU from clocking up on demand, which could cause issues with some programs.
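To put that ~30W tradeoff in numbers, a rough sketch. The idle hours and electricity price below are my assumptions, not measurements:

```python
# Rough yearly cost of a ~30 W idle penalty from forcing max
# performance globally. Hours and price are assumptions, not data.
EXTRA_WATTS = 30
IDLE_HOURS_PER_DAY = 8      # assumed time spent at the desktop
PRICE_PER_KWH = 0.15        # assumed USD per kWh

def yearly_kwh(watts: float, hours_per_day: float) -> float:
    """Energy used per year by a constant extra draw."""
    return watts / 1000 * hours_per_day * 365

kwh = yearly_kwh(EXTRA_WATTS, IDLE_HOURS_PER_DAY)
print(f"{kwh:.1f} kWh/year, ~${kwh * PRICE_PER_KWH:.2f}/year")
```

So on those assumptions it's on the order of ten dollars a year; the per-game hassle is mostly about heat and noise, not money.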


----------



## motivman

Calling on all Gigabyte Gaming OC owners. Anyone receive a dud card yet? I have an opportunity to pick up a Gaming OC for MSRP. Hoping not to lose the silicon lottery for the 4th time....


----------



## EarlZ

motivman said:


> Calling on all Gigabyte Gaming OC owners. Anyone receive a dud card yet? I have an opportunity to pick up a gaming OC for MSRP. Hoping to not loose silicone lottery for the 4th time....


What do you mean dud?


----------



## Panchovix

motivman said:


> Calling on all Gigabyte Gaming OC owners. Anyone receive a dud card yet? I have an opportunity to pick up a gaming OC for MSRP. Hoping to not loose silicone lottery for the 4th time....


I did see one bad overclocker on reddit, but it seems pretty rare.


----------



## mirkendargen

motivman said:


> Calling on all Gigabyte Gaming OC owners. Anyone receive a dud card yet? I have an opportunity to pick up a gaming OC for MSRP. Hoping to not loose silicone lottery for the 4th time....


My friend who I grabbed a Gaming OC for in Canada along with mine reports he can only get +1100 memory. I got lucky there I guess, because I actually didn't open them both and bench them to decide which was mine.


----------



## chrismohsseni

I scored 30 830 in Port Royal

Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11

www.3dmark.com

4090FE on LN2. Can't push much further without more power. 
Does anybody know where we can get the 1000w 4090 Strix bios?


----------



## doom3crazy

alasdairvfr said:


> I have set globally Max Performance since it resolved blanking issues on my 4k 144hz monitor. On my 3080. That card plus 3080ti and 4090 would all downclock and run sub-100w idle when in desktop etc. Im not sure why some ppl have full tilt GPU power draw, maybe windows power plan is the issue? Even clean driver plus that setting, it downclocks no problem with that setting.


Ohhh, like my power plan in actual Windows. I didn't even think of that! Gonna try now.

EDIT: no change, still running at 2500MHz.


----------



## mirkendargen

chrismohsseni said:


> I scored 30 830 in Port Royal
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 4090FE on LN2. Can't push much further without more power.
> Does anybody know where we can get the 1000w 4090 Strix bios?


You can't flash an FE even if you had the BIOS.


----------



## chrismohsseni

mirkendargen said:


> You can't flash an FE even if you had the BIOS.


Yes, well aware. I have a strix as well that I would like to do some LN2 overclocking on.


----------



## Arizor

WELL BOYS. Enjoying my TUF performance but concerned by Hot Spot temps (20C higher than core once core hit 70).

so decided to take apart and repaste with Grizzly (plus the tip from lads on this forum about adding grizzly to the thermal pads too).

Core temps now 15C (!!!) lower, and hot spot within 10C.


----------



## Panchovix

Arizor said:


> WELL BOYS. Enjoying my TUF performance but concerned by Hot Spot temps (20C higher than core once core hit 70).
> 
> so decided to take apart and repaste with Grizzly (plus the tip from lads on this forum about adding grizzly to the thermal pads too).
> 
> Core temps now 15C (!!!) lower, and hot spot within 10C.


I have the same issue! But I can't open the card or I lose the warranty here in Chile. Nice to know it can be improved though!


----------



## jediblr

This is for MSI GeForce RTX 4090 Gaming Trio owners. I flashed the Suprim X silent BIOS over my silent BIOS, so I can swap between them.
For a stability test I run 20 loops of Port Royal + the UE 5.1 Matrix demo, everything maxed, in DLDSR 4k on my 2K monitor.
On the 520W PL Suprim X BIOS I can run +200 core / +1400 mem; on the 480W PL default gaming BIOS, +250 core / +1400 mem. After a day of testing I made 2 profiles for 24/7 use:
1. default BIOS, +250 core, +1300 mem, 1.05v, 480W PL // 2. UV+OC default BIOS, +250 core, +1300 mem, 0.975v, 480W PL
The 520W PL Suprim X BIOS is only for benches; we don't need it for daily gaming.
P.S. The new 526.86 driver is good (only the DWM-HAGS bug sucks), and for G-Sync and fps limit questions look at this


----------



## motivman

mirkendargen said:


> My friend who I grabbed a Gaming OC for in Canada along with mine reports he can only get +1100 memory. I got lucky there I guess, because I actually didn't open them both and bench them to decide which was mine.


Should I play the silicon lottery again, or keep this FE? The core is not even stable at 3000MHz. It can bench up to 3030MHz, but the stable core clock is only around 2940MHz. Memory is a good overclocker though; it does +1650 max in Afterburner. I want a card that can do at least 3050MHz on the core, stable.









I scored 11 034 in Speed Way

Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11

www.3dmark.com


----------



## yzonker

chrismohsseni said:


> Yes, well aware. I have a strix as well that I would like to do some LN2 overclocking on.


If you get a copy let us know. I don't think it's really in the wild yet. Just a select few with it. It would help a lot if TPU would get GPUZ working so people could just upload to their database.


----------



## Arizor

Panchovix said:


> I have the same issue! But can't open the card or I lose the warranty here on Chile  nice to know it can be improved though!


I also noticed, when putting it back together, that I could tighten the screws in all areas much more than they were previously.

So maybe just take a screwdriver and double check the tightness? My bracket especially was a bit loose.


----------



## KingEngineRevUp

motivman said:


> Should I play silicone lottery again, or keep this FE? The core is not even stable at 3000mhz. It can bench up to 3030mhz, but stable core clock is only around 2940mhz. Memory is a good overclocker though, does +1650 max in afterburner. I want a card that can do at least 3050mhz on the core stable..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 034 in Speed Way
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


3050 vs 2940 is a 3.7% clock difference; in terms of performance that's around 1.5%, like 1.5 frames for every 100 frames.

Your memory OC is pretty good; it actually gives you more performance than the core does. Getting another card, you risk getting a worse memory OCer.

So: is it worth playing the lottery for the core but risking bad memory?
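The arithmetic above, sketched out. The ~0.4x clock-to-fps scaling factor is just the ratio implied by this post (~3.7% clock -> ~1.5% fps), not a measured constant:

```python
# Sketch: clock headroom vs. estimated fps gain on a 4090.
# SCALING is an assumption implied by the post, not benchmark data.
SCALING = 0.4

def clock_gain(target_mhz: float, current_mhz: float) -> float:
    """Fractional clock increase from current to target."""
    return target_mhz / current_mhz - 1

gain = clock_gain(3050, 2940)
print(f"clock: +{gain:.1%}, est. fps: +{gain * SCALING:.1%}")
```

Run for 3050 vs 2940 it lands on roughly +3.7% clock and +1.5% estimated fps, which is why chasing core bins rarely pays off.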


----------



## th3illusiveman

KingEngineRevUp said:


> Share some pics


sure, she aint pretty, but she works lol - 




That HDD tray is non-removable, but with some.... motivation... I was able to get the card in. Checked the connector after a hard day of 4k gaming and it looks good; will make one final check next week (unless it fails sooner).

Card is also quiet and runs pretty cool - I limited it to 350w and it seems to hover in the low to mid 60s.


----------



## Travis Scott

Arizor said:


> WELL BOYS. Enjoying my TUF performance but concerned by Hot Spot temps (20C higher than core once core hit 70).
> 
> so decided to take apart and repaste with Grizzly (plus the tip from lads on this forum about adding grizzly to the thermal pads too).
> 
> Core temps now 15C (!!!) lower, and hot spot within 10C.


I am having an issue with this too (huge waterblock/res); going to copper mod the entire GPU I guess. I wish AMD and Nvidia never added hotspot cuz it makes me so paranoid lolll


----------



## KingEngineRevUp

th3illusiveman said:


> sure, she aint pretty, but she works lol -
> 
> 
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2582339
> 
> 
> View attachment 2582340
> 
> 
> 
> That HDD tray is non removable, but with some.... motivation... i was able to get the card in. Checked the connector after a hard day of 4k gaming and it looks good, will make one final check next week (unless it fails sooner).
> 
> Card is also quiet and runs pretty cool - i limited it to 350w and it seems to hover in the low to mid 60s.


Looks like my old case for my Friday build


----------



## Travis Scott

also might as well use a thermal camera to see where the hotspot is too, its confusing af


----------



## Morteen199

Hello, I see that the Gigabyte Gaming OC uses sleeve bearing fans... does that mean I can't mount it horizontally, as that's bad for sleeve bearing fans? :S Gigabyte cheaping out...


----------



## Nizzen

chrismohsseni said:


> I scored 30 830 in Port Royal
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 4090FE on LN2. Can't push much further without more power.
> Does anybody know where we can get the 1000w 4090 Strix bios?


Is it possible to flash FE cards?
Nice result!


----------



## Sheyster

motivman said:


> Should I play silicone lottery again, or keep this FE? The core is not even stable at 3000mhz. It can bench up to 3030mhz, but stable core clock is only around 2940mhz. Memory is a good overclocker though, does +1650 max in afterburner. I want a card that can do at least 3050mhz on the core stable..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 034 in Speed Way
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


Well.. do you feel lucky today? 

FWIW, my Giga-G-OC can do 3105 stable @ 1.1v, BUT +1400 memory is a fairly quick crash. I'm not a bencher so it's not a big deal to me. My gaming OC will likely be 3000 and +1200 at 1.05v.


----------



## mirkendargen

motivman said:


> Should I play silicone lottery again, or keep this FE? The core is not even stable at 3000mhz. It can bench up to 3030mhz, but stable core clock is only around 2940mhz. Memory is a good overclocker though, does +1650 max in afterburner. I want a card that can do at least 3050mhz on the core stable..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 034 in Speed Way
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


Things aren't entirely what they seem when it comes to core clocks. For example I can finish a PR run at 1.1v with my core at 3105mhz....but the effective core is at like 3015mhz and I get 28975. Or I can finish a PR run at 1.05v with my core at 3015 and the effective clock at like 2985-3000mhz....and get 28875. Effective clock is a weird mystery no one really understands.


----------



## Travis Scott

3400mhz after all possible non ln2/chiller/ac cooling mods? ultrainstinct.mp4


----------



## KingEngineRevUp

mirkendargen said:


> Things aren't entirely what they seem when it comes to core clocks. For example I can finish a PR run at 1.1v with my core at 3105mhz....but the effective core is at like 3015mhz and I get 28975. Or I can finish a PR run at 1.05v with my core at 3015 and the effective clock at like 2985-3000mhz....and get 28875. Effective clock is a weird mystery no one really understands.


Hm, perhaps boost behavior is seen more in effective clocks.

You're drawing more power; your temperatures might be going over a threshold where it's downclocking?

Normally we see boost affect our target clocks directly.


----------



## mirkendargen

KingEngineRevUp said:


> Hm, perhaps boost behavior is seen more in effective clocks.
> 
> You're drawing more power, your temperatures might be going over a threshold where it's down clocking?
> 
> Normally we see boost affect our target clocks directly.


I'm on water, my temps are <50C all the time. I lose at most 1 bin at 46C; it's something else going on.


----------



## N19htmare666

doom3crazy said:


> I just can't figure out why, at least so far, it's only Crysis that does it. I suppose I will try doing an application-specific setting for Crysis and set it to "prefer max performance" and see if it helps? I am certainly open to any suggestions. Currently I am on the 522.30 studio driver (I do a lot of Premiere Pro studio work) but I've also tried the latest game driver and the issue remains.


Check if you're voltage/power/thermal limited. 



chrismohsseni said:


> I scored 30 830 in Port Royal
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 4090FE on LN2. Can't push much further without more power.
> Does anybody know where we can get the 1000w 4090 Strix bios?


I think the 1000W BIOS doesn't have any thermal checks, so it's really easy to fry your card if not used correctly


----------



## KingEngineRevUp

mirkendargen said:


> I'm on water, my temps are <50C all the time. I lose at most 1 bin at 46C; it's something else going on.


What I mean is observing effective clocks and temperature, not target clocks.

If you windowed and looped port royal on the background, could you manipulate your pump speed to go slower and slower and watch temperatures climb to observe and see if effective clocks go down?
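If anyone wants to try that experiment, a quick Python sketch like this could log reported clock vs. temp once a second while you step the pump down (hypothetical helper, just parses `nvidia-smi` CSV output — and note nvidia-smi only reports the requested clock, for effective clock you'd still need HWiNFO):

```python
import subprocess
import time

def parse_smi(line):
    """Parse one 'clocks.gr, temperature.gpu' CSV line into (MHz, C) ints."""
    clock, temp = (int(x.strip()) for x in line.split(","))
    return clock, temp

def log_loop(samples=60, interval_s=1.0):
    """Poll nvidia-smi and print reported core clock vs. GPU temp."""
    query = ["nvidia-smi",
             "--query-gpu=clocks.gr,temperature.gpu",
             "--format=csv,noheader,nounits"]
    for _ in range(samples):
        clock, temp = parse_smi(subprocess.check_output(query, text=True))
        print(f"{time.strftime('%H:%M:%S')}  {clock} MHz @ {temp} C")
        time.sleep(interval_s)

# e.g. log_loop(samples=300) while slowly lowering the pump duty cycle
```

Graph the two columns afterwards and any temperature-driven downclocking should show up as steps.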


----------



## Travis Scott

1000w bios lol better have active cooling on that goofy ahh cable


----------



## GRABibus

ZealotKi11er said:


> Can you show how to enable Re-BAR? I tried following an older guide but the score did not move at all, so my guess is I did not enable it correctly.


@MrTOOSHORT was fast 😊

Here is another link:








Here's How You Can Enable Resizable BAR Support in Any Game via NVIDIA Inspector


While only sixteen games have been officially whitelisted for Resizable BAR support by NVIDIA, there's a procedure to manually enable any.




wccftech.com


----------



## GRABibus

Sheyster said:


> Well.. do you feel lucky today?
> 
> FWIW, my Giga-G-OC can do 3105 stable @ 1.1v, BUT +1400 memory is a fairly quick crash. I'm not a bencher so it's not a big deal to me. My gaming OC will likely be 3000 and +1200 at 1.05v.


Just for information, what max offset on the core can you set to run a stable PR session?


----------



## GRABibus

Travis Scott said:


> 1000w bios lol better have active cooling on that goofy ahh cable


Currently I don’t see how this 1000W BIOS can raise the power the cards actually draw.
With a 600W BIOS, most of the cards pull a maximum of ~570W in PR @ 1.1V.
Should this 1000W BIOS unlock voltage?


----------



## Nizzen

Travis Scott said:


> 1000w bios lol better have active cooling on that goofy ahh cable


Don't worry if you aren't using Elmor tool to crank up the "Voltage" over 1.1v


----------



## kx11

My best so far










I scored 17 169 in Time Spy Extreme


Intel Core i9-12900KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com














I scored 9 999 in Speed Way


Intel Core i9-12900KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com






PORT ROYAL won't finish for some reason


----------



## jootn2kx

Update for my Gainward 4090 non-GS:
Stable core 3000MHz (+230 core) in all games with the stock 450W BIOS, and +1500 memory.


Can even do +2000 memory without artifacts or crashes, which is insane lol, but I'm not comfortable with that setting.


----------



## motivman

kx11 said:


> My best so far
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 17 169 in Time Spy Extreme
> 
> 
> Intel Core i9-12900KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 9 999 in Speed Way
> 
> 
> Intel Core i9-12900KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> PORT ROYAL won't finish for some reason


your scores are too low, at least for your clocks... something ain't right.


----------



## Azazil1190

Some info.
I tested UV+OC on both of my systems:
10900K + Giga OC: 2910MHz at 0.995V and +1200 mem.
5950X + TUF OC: 2880MHz at 0.995V and +1200 mem.

The runs were in Port Royal to see the PCIe 3.0 vs PCIe 4.0 difference, i.e. how many points you lose to PCIe bandwidth.

The TUF, at the lower clock, scores about 300-350 points more than the Giga on the 10900K.
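Back-of-the-envelope, that delta works out to roughly a percent of the total score:

```python
def pct_loss(score_gen4, points_lost):
    """PCIe-related score loss as a percentage of the Gen4 score."""
    return 100.0 * points_lost / score_gen4

# a 4090 at these clocks lands somewhere in the 26-29k Port Royal range
# (assumed ballpark, not the exact runs above):
print(f"{pct_loss(29000, 300):.1f}% to {pct_loss(26000, 350):.1f}%")
```

So call it ~1-1.3% for Gen3 x16 vs Gen4 x16 in PR, before you even account for the different CPUs and clocks.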


----------



## AKBrian

motivman said:


> your scores are too low, at least for your clocks... something ain't right.


Yeah, those numbers are quite low.

Could be unstable RAM (or running in Gear 4), since they're on a kit of 7200MT sticks. Would also explain why PR won't finish for them.


----------



## Benni231990

I found a little secret.

I UV'd my 4090 with the same core and memory clocks, but at 1.085V instead of 1.1V,

and the average clock goes down 30-40MHz and I lost 130-150 points in Port Royal, so UV is dead on the 4090


----------



## Nd4spdvn

Panchovix said:


> I was checking some 4090 + 5800X3D benchmarks (to compare to my 5800X, spoiler: the 5800X3D seems to help on some benchs like SpeedWay) and found this


That is my score, which at the time was good for no. 2 or 3 in the HoF, can't remember. It is most likely a (memory) bugged score, although at the time I made the run I could not spot obvious artifacts and did not know that pushing memory hard could lead to bugged results. My card, which I've had since 2 days after the 12th Oct launch, is a bad memory overclocker but with a slightly higher than average core perhaps. It boosted to 2970MHz with 1.05V and power maxed out, which is 520W on this Suprim X BIOS. The memory, though, is good for only +1000 in Afterburner, which is (very) disappointing, but then I have no idea what timings MSI uses in their BIOS vs other vendors. The performance of the card is crazy good, as are the temps on an open bench setup. All games are clocking to 3045MHz steady with effective clocks of 3015MHz (at 1.1V), which is pretty good on an air-cooled card with fans at about 65%.


----------



## kx11

motivman said:


> your scores are too low, at least for your clocks... something ain't right.


It was simple, a few clicks here and there to get everything up and running


----------



## long2905

those of you with bykski water block, do you use the stock pad or swap in gelid or other vendor pad? what size did you use? the stock one is 1.8mm and i heard others used 2mm. but from my experience with 3090 the pad got compressed quite a bit, in which case, would 1.5mm pad suffice? or even thin copper shim?


----------



## Sheyster

GRABibus said:


> Just for information, which offset max on the core can you set to run a stable PR session ?


3105 core was at +315. It was also stable in MW2 for over 2 hours. I haven't tried higher.
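For anyone comparing offsets between cards: these chips step the core in 15MHz bins, so an offset only means anything relative to your card's own stock boost. A toy sketch (the 2790MHz "stock boost" is just what +315 → 3105 implies for my card; yours will differ):

```python
BIN_MHZ = 15  # Ada, like Ampere, steps the core clock in 15MHz bins

def boost_clock(stock_boost_mhz, offset_mhz):
    """Final boost for a given offset, snapped down to a 15MHz bin."""
    return (stock_boost_mhz + offset_mhz) // BIN_MHZ * BIN_MHZ

# +315 on a card whose V/F curve tops out at 2790 stock -> 3105:
assert boost_clock(2790, 315) == 3105
# "losing 1 bin" to temps means dropping exactly 15MHz, e.g. 3105 -> 3090
```

Which is why "+315" on one card and "+240" on another can still land within a bin or two of each other.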


----------



## Sheyster

Benni231990 said:


> I found a little secret.
> 
> I UV'd my 4090 with the same core and memory clocks, but at 1.085V instead of 1.1V,
> 
> and the average clock goes down 30-40MHz and I lost 130-150 points in Port Royal, so UV is dead on the 4090


1.085 isn't really an undervolt, it's just a little lower than max voltage (1.1v). I think most people here would consider 1v and lower to be undervolting. Just saying..
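Agreed. As a rough first-order rule of thumb (dynamic power ~ V² × f — a CMOS simplification that ignores leakage entirely), 1.085V at the same clock only shaves a few percent of core power, which is why it barely registers:

```python
def rel_power(v, f_mhz, v_ref=1.100, f_ref_mhz=3000.0):
    """First-order dynamic power ratio vs. a 1.100V / 3000MHz reference.
    P ~ V^2 * f is a CMOS rule of thumb; leakage is ignored."""
    return (v / v_ref) ** 2 * (f_mhz / f_ref_mhz)

# 1.085V at the same 3000MHz: only ~2.7% less core power.
print(f"{(1 - rel_power(1.085, 3000.0)):.1%}")
# A "real" undervolt, 1.000V at 2900MHz: ~20% less.
print(f"{(1 - rel_power(1.000, 2900.0)):.1%}")
```

The reference point (1.100V / 3000MHz) is just an assumed round number for the comparison, not anyone's measured card.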


----------



## sew333

Hi. I have 12900K stock
32 GB 2x16 GB DDR4 3600mhz
Palit Gamerock Pro OC rtx 4090
1300W Seasonic Platinum Prime
Aorus Elite DDR4 Z690
Windows 11 22H2. Updated to the newest NVIDIA drivers.



I used 1440p, RT Ultra, rest ultra for testing fps.

And you know what? All 3-4 modes of DLSS are not working.

DLSS is not working. I am getting the same scores and fps on all 3 modes of DLSS. Any ideas? My drivers are up to date. DLSS Quality, Performance, Ultra Performance: no fps gain in the benchmark or in the game.

I compared benchmark scores with a YouTube user and he got 160fps at those settings in the benchmark, while I got 115fps because DLSS is screwed. Any ideas?

Here's someone with an RTX 4090 whose DLSS Auto is "working" in the benchmark:









He gets 165 fps at the start. I have 115fps.
I tried 3 different drivers, nothing helps.


----------



## J7SC

long2905 said:


> those of you with bykski water block, do you use the stock pad or swap in gelid or other vendor pad? what size did you use? the stock one is 1.8mm and i heard others used 2mm. but from my experience with 3090 the pad got compressed quite a bit, in which case, would 1.5mm pad suffice? or even thin copper shim?


FYI, I measured the Bykski pads with a micrometer and double-checked - the included Bykski pads are only 1.5 mm (instead of the 1.8 mm claimed at Formulamod's / Bykski's online 'schematics').
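Rule of thumb I go by (my own habit, not anything Bykski publishes): a pad should be a bit thicker than the gap so it ends up compressed maybe 10-30%. A quick sanity check with a hypothetical 1.3mm gap:

```python
def compression_pct(pad_mm, gap_mm):
    """How much a pad must squish (as % of its thickness) to fit a gap."""
    return 100.0 * (pad_mm - gap_mm) / pad_mm

# Hypothetical 1.3mm VRAM-to-coldplate gap (NOT a measured Bykski number):
for pad in (1.5, 1.8, 2.0):
    print(f"{pad}mm pad -> {compression_pct(pad, 1.3):.0f}% compression")
```

So if the real gap is around that figure, a 1.5mm pad squishes a comfortable ~13% while a 2.0mm pad has to give up ~35%, which is where mounting pressure problems start.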


----------



## long2905

J7SC said:


> FYI, I measured the Bykski pads with a micrometer and double-checked - the included Bykski pads are only 1.5 mm (instead of the 1.8 mm claimed at Formulamod's / Bykski's online 'schematics').


that makes so much sense. i dont have a caliper to measure it myself, but based on how much the stock and gelid 2mm pad got squished, 1.5mm should be far better. now to consider whether i should experiment with 1mm copper shim on it


----------



## Panchovix

Nd4spdvn said:


> That is my score, which at the time was good for no. 2 or 3 in the HoF, can't remember. It is most likely a (memory) bugged score, although at the time I made the run I could not spot obvious artifacts and did not know that pushing memory hard could lead to bugged results. My card, which I've had since 2 days after the 12th Oct launch, is a bad memory overclocker but with a slightly higher than average core perhaps. It boosted to 2970MHz with 1.05V and power maxed out, which is 520W on this Suprim X BIOS. The memory, though, is good for only +1000 in Afterburner, which is (very) disappointing, but then I have no idea what timings MSI uses in their BIOS vs other vendors. The performance of the card is crazy good, as are the temps on an open bench setup. All games are clocking to 3045MHz steady with effective clocks of 3015MHz (at 1.1V), which is pretty good on an air-cooled card with fans at about 65%.


So VRAM timings depend on the AIB's VBIOS? Didn't know that. That basically means you may get higher benchmark scores at the same VRAM speed depending on the VBIOS, hmm


----------



## chrismohsseni

Nizzen said:


> Is it possible to flash FE cards?
> Nice result!


No, I have a strix that I would like to overclock but it is hitting a powerwall.


----------



## KingEngineRevUp

long2905 said:


> those of you with bykski water block, do you use the stock pad or swap in gelid or other vendor pad? what size did you use? the stock one is 1.8mm and i heard others used 2mm. but from my experience with 3090 the pad got compressed quite a bit, in which case, would 1.5mm pad suffice? or even thin copper shim?


If the stock pads work, why not use them? What's the point of lowering memory temperature another 5-10C if it can actually hurt your performance?

The working theory is that the memory needs to not be too hot but not be too cold to get its best performance.

So just use the stock pads, save money.


----------



## J7SC

long2905 said:


> that makes so much sense. i dont have a caliper to measure it myself, but based on how much the stock and gelid 2mm pad got squished, 1.5mm should be far better. now to consider whether i should experiment with 1mm copper shim on it


I used thermal putty for the VRAM and some other areas on the backside / below the included backplate, but for some of the VRM segments, I used the stock-included pads - with a bit of thermal paste on top - per red arrow, there definitely is good contact...


----------



## jootn2kx

sew333 said:


> Hi. I have 12900K stock
> 32 GB 2x16 GB DDR4 3600mhz
> Palit Gamerock Pro OC rtx 4090
> 1300W Seasonic Platinum Prime
> Aorus Elite DDR4 Z690
> Windows 11 22H2. Updated to the newest NVIDIA drivers.
> 
> 
> 
> I used 1440p, RT Ultra, rest ultra for testing fps.
> 
> And you know what? All 3-4 modes of DLSS are not working.
> 
> DLSS is not working. I am getting the same scores and fps on all 3 modes of DLSS. Any ideas? My drivers are up to date. DLSS Quality, Performance, Ultra Performance: no fps gain in the benchmark or in the game.
> 
> I compared benchmark scores with a YouTube user and he got 160fps at those settings in the benchmark, while I got 115fps because DLSS is screwed. Any ideas?
> 
> Here's someone with an RTX 4090 whose DLSS Auto is "working" in the benchmark:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> He gets 165 fps at the start. I have 115fps.
> I tried 3 different drivers, nothing helps.


No CPU can handle the 4090 @ 1440p right now in this game. 
Even at 4K I'm seeing huge bottlenecks with my 5800X3D CPU.
There is nothing you can do but wait for DLSS 3.0 (frame generation) or buy a more powerful CPU in the future
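The simple way to think about it: delivered fps is min(CPU-side fps, GPU-side fps), and DLSS 2.x upscaling only raises the GPU side. A toy model with made-up numbers shows why every DLSS mode reads exactly the same once you hit the CPU cap:

```python
def fps(cpu_fps, gpu_fps_native, dlss_scale):
    """Delivered fps = min(CPU-side fps, GPU-side fps). DLSS 2.x upscaling
    only multiplies the GPU side; frame generation (DLSS 3) lifts the cap."""
    return min(cpu_fps, gpu_fps_native * dlss_scale)

# CPU-bound case (made-up numbers): CPU caps at 115fps, GPU already does
# 120fps native, so every DLSS mode reads exactly the same 115fps.
assert all(fps(115, 120, s) == 115 for s in (1.0, 1.5, 2.0))

# GPU-bound case: with a faster CPU the same scaling is suddenly visible.
assert [fps(200, 80, s) for s in (1.0, 1.5, 2.0)] == [80, 120.0, 160.0]
```

That's also why GPU usage drops as you go to more aggressive DLSS modes even though fps doesn't move.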


----------



## dante`afk

chrismohsseni said:


> I scored 30 830 in Port Royal
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 4090FE on LN2. Can't push much further without more power.
> Does anybody know where we can get the 1000w 4090 Strix bios?


3330mhz just with 1.1v ? wow


----------



## GRABibus

Sheyster said:


> 3105 core was at +315. It was also stable in MW2 for over 2 hours. I haven't tried higher.


+315MHz in PR is incredible.
My max is +240MHz.

with 240MHz, Boost at 3120MHz in PR and 3098MHz average :









I scored 29 077 in Port Royal


AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com




And you ?


----------



## sew333

jootn2kx said:


> No CPU can handle the 4090 @ 1440p right now in this game.
> Even at 4K I'm seeing huge bottlenecks with my 5800X3D CPU.
> There is nothing you can do but wait for DLSS 3.0 (frame generation) or buy a more powerful CPU in the future


But someone from YouTube was testing all DLSS modes at 1440p and his fps was higher. His CPU is a 13900K.
I am testing at 1440p, RT Ultra, all DLSS modes: Performance, Balanced, Quality, Ultra Performance, and the fps is not changing. Only GPU usage is lower.


----------



## GRABibus

jootn2kx said:


> No CPU can handle the 4090 @ 1440p right now in this game.
> Even at 4K I'm seeing huge bottlenecks with my 5800X3D CPU.
> There is nothing you can do but wait for DLSS 3.0 (frame generation) or buy a more powerful CPU in the future


All CPUs bottleneck the 4090 in all games at 1440p.

Even at 4K you can see a bottleneck in some games and scenarios, even with a 13900K.

Waiting for Ryzen 7000 X3D.


----------



## Zero989

doom3crazy said:


> Just a little update. So.... I changed crysis in the NVCP to specific set for "prefer max performance" and that seemed to fix the issue. Should I just set the main setting in the nvcp to "prefer max performance"? On the 3080 ti I had, when I did that, it would just be going full bore, full clock speed just sitting in windows and just sitting doing nothing the card was running at full speed and basically generating unnecessary heat.
> 
> EDIT: Which it appears to be doing on the 40 series cards as well. Thats so annoying. Is there anyway to fix that? Like when im just browsing and reading internet articles and youtube etc I don't need my card running at 2600mhz lol. Do I just have to set the prefer max performance individually for each game?


See if eVGA Precision X can load your 3D clocks to full turbo without using max perf.


----------



## J7SC

GRABibus said:


> All CPUs bottleneck the 4090 in all games at 1440p.
> 
> Even at 4K you can see a bottleneck in some games and scenarios, even with a 13900K.
> 
> Waiting for Ryzen 7000 7950X3D.


...FTFY


----------



## yzonker

J7SC said:


> FYI, I measured the Bykski pads with a micrometer and double-checked - the included Bykski pads are only 1.5 mm (instead of the 1.8 mm claimed at Formulamod's / Bykski's online 'schematics').


Mine are definitely thicker than 1.5mm. Although we're only talking 0.012" difference.


----------



## PhuCCo

I'm measuring 1.8


----------



## Larkonian

sew333 said:


> But someone from YouTube was testing all DLSS modes at 1440p and his fps was higher. His CPU is a 13900K.
> I am testing at 1440p, RT Ultra, all DLSS modes: Performance, Balanced, Quality, Ultra Performance, and the fps is not changing. Only GPU usage is lower.


It might be your memory speed. The guy in the video is using fast DDR5.

I am getting 142 FPS with the same settings as the video.

12700K with DDR4-4000 14-15-15-35


----------



## KingEngineRevUp

dante`afk said:


> 3330mhz just with 1.1v ? wow


Makes you wonder what the boost vs. temperature graphs look like for this generation.

TechPowerUp didn't do one this generation, probably because the coolers were so good and they didn't have a water block to test with.

It looks like the 40 series is highly dependent on lower temperatures for stability, and this proves it: -35C and 3330MHz. It's no wonder NVIDIA had all the AIBs target 60-70C, with 85-90% of the board power draw going into the die as heat.


----------



## yzonker

KingEngineRevUp said:


> Makes you wonder what the boost vs. temperature graphs look like for this generation.
> 
> TechPowerUp didn't do one this generation, probably because the coolers were so good and they didn't have a water block to test with.
> 
> It looks like the 40 series is highly dependent on lower temperatures for stability, and this proves it: -35C and 3330MHz. It's no wonder NVIDIA had all the AIBs target 60-70C, with 85-90% of the board power draw going into the die as heat.


Actually, I'm finding the core clock gains from lowering temps to be no greater than on the 30 series. That LN2 run was most likely colder than -35C, as I think that's the lowest the card reads.

But speaking of core power draw, I should have taken screenshots, but I fired up my 3080 Ti machine to see how they really compare.

At a 450W PL on both cards:

3080 Ti core: 280W
4090 core: 380W

No wonder we're struggling with block deltas. You'd have to pull 600W on the 3080 Ti to get the same core power!!

I'm also noticing the VRAM doesn't pull nearly as much power on the 4090. Must be the new 2GB chips. I think this is why it's running so cool with a water block, and it's also contributing to the OC falloff as temps go down.
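Napkin math on that, using approximate public die sizes (GA102 ~628mm², AD102 ~608mm² — ballpark figures, not datasheet-exact) and the core power numbers above:

```python
def heat_flux(core_w, die_mm2):
    """Core power spread over die area, in W/mm^2."""
    return core_w / die_mm2

# Die sizes are approximate public figures: GA102 ~628mm^2, AD102 ~608mm^2.
ga102 = heat_flux(280, 628)   # 3080 Ti core at a 450W board limit
ad102 = heat_flux(380, 608)   # 4090 core at the same board limit
print(f"GA102: {ga102:.2f} W/mm^2, AD102: {ad102:.2f} W/mm^2 "
      f"({ad102 / ga102:.2f}x the flux)")
```

So at the same board power limit the 4090 die is pushing roughly 40% more heat through each mm² — which lines up with the block deltas people are seeing.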


----------



## lawson67

I changed my Ryzen 7 5800X for a Ryzen 7 5800X3D today. I thought I wouldn't notice much difference at 4K, but I have gone from 183 FPS to 202 FPS in Tomb Raider. I think that's a great uplift; I am very happy indeed!


----------



## J7SC

yzonker said:


> Mine are definitely thicker than 1.5mm. Although we're only talking 0.012" difference.
> 
> View attachment 2582425


....I actually had one of the strips come in at 1.4 mm (measured several times), the others at 1.5 mm - anyway, they are designed to compress a bit and besides, for the VRAM I used thermal putty, and the earlier pic showed that there was decent contact on the VRM bits.


----------



## yzonker

Larkonian said:


> It might be your memory speed. The guy in the video is using fast DDR5.
> 
> I am getting 142 FPS with the same settings as the video.
> 
> 12700K with DDR4-4000 14-15-15-35


This is what I got. 13900k, DDR4 4200CL15. But that's a high bar against @sugi0lover. Dude probably has binned fans. lol


----------



## changboy

I got my 2.0mm pads (Thermalright) to pair with my Bykski block and Gigabyte card; we'll see what happens, maybe I will do it tomorrow.


----------



## mirkendargen

yzonker said:


> Mine are definitely thicker than 1.5mm. Although we're only talking 0.012" difference.
> 
> View attachment 2582425


I didn't measure mine, but eyeballing them compared to the 1.5mm Arctic pads I used the Bykski ones were *slightly* thicker. Now I don't know if the Arctic ones were exactly 1.5mm either, but they definitely made contact fine when I used them.


----------



## Baasha

About damn time!



















Now to wait for the Strix 4090 OC!


----------



## sew333

yzonker said:


> This is what I got. 13900k, DDR4 4200CL15. But that's a high bar against @sugi0lover. Dude probably has binned fans. lol
> 
> View attachment 2582445


Very grateful for checking the benchmark. I will run a test and check if I have a similar score. Thanks again for finding the time for me.


----------



## neteng101

yzonker said:


> This is what I got. 13900k, DDR4 4200CL15. But that's a high bar against @sugi0lover. Dude probably has binned fans. lol


Guess my terrible memory isn't all that bad, split the difference... 13700k, DDR4 3866CL19-23-23-46. I take it CP2077 is CPU/system bound at 1440p to an extent?


----------



## yzonker

neteng101 said:


> Guess my terrible memory isn't all that bad, split the difference... 13700k, DDR4 3866CL19-23-23-46. I take it CP2077 is CPU/system bound at 1440p to an extent?
> 
> View attachment 2582467


Apparently it is. Biggest difference I see is your min FPS is pretty low. Could have just glitched at some point though to cause that.


----------



## neteng101

yzonker said:


> Apparently it is. Biggest difference I see is your min FPS is pretty low. Could have just glitched at some point though to cause that.


I had another run before that I couldn't screenshot - min FPS was actually above yours, but the average was around that 148 mark. Too much variance in min FPS it looks like. 1% low might be better averaged if they reported that.


----------



## yzonker

neteng101 said:


> I had another run before that I couldn't screenshot - min FPS was actually above yours, but the average was around that 148 mark. Too much variance in min FPS it looks like. 1% low might be better averaged if they reported that.


I think that's at the beginning or end. I never see it go below 100fps during the run. Probably bogus.


----------



## alasdairvfr

I did some benching post-driver update on 526.86 and it's bumped my scores a bit. I got Legendaries in Speed Way and Fire Strike Ultra for my HW config, but in PR I'm somewhat lower, and in Time Spy not even top 100 for my HW config. I'm wondering if it's my not-so-brilliant system memory that lets a top-10 Fire Strike system miss the top 100 in Time Spy (?)


I scored 10 982 in Speed Way

I scored 30 933 in Time Spy

I scored 18 210 in Time Spy Extreme

I scored 28 335 in Port Royal

I scored 53 573 in Fire Strike

I scored 42 248 in Fire Strike Extreme

I scored 26 833 in Fire Strike Ultra


----------



## sugi0lover

yzonker said:


> This is what I got. 13900k, DDR4 4200CL15. But that's a high bar against @sugi0lover. Dude probably has binned fans. lol
> 
> View attachment 2582445


As you already know, that video was mine. I lost some performance because my PC was recording the video at the same time as benching the game.
I wonder if disabling the E-cores can also help with higher frames.


----------



## J7SC

@Arizor @yzonker ...time for the 'pucker moment': first boot-up after water-cooling with various custom applications, a new 12VHPWR cable, surgery on an 'open loop' after taking the previous water-blocked occupant out (awaiting transfer to another machine), and installing the water-blocked 4090... 

I have done this sort of thing for over a decade, but first boot-up never gets old....glad to say everything works like a charm, including PCIe4.0 16x, temps etc 

...just doing some very light 3D testing now to seat and cure the various bits... delta GPU > hotspot less than 8 C at fairly easy 3D loads...

Still needs some cable and tube management, a bit more air-bubble-bleeding etc.


----------



## BigMack70

OK so now I'm getting quite vexed. I used my 4090 for a few weeks with no issues. Crashed to black screen once or twice which threw errors in Windows logs. Lowered memory OC and then was stable.

Then, I updated my graphics driver from 522.25, and started experiencing regular crashes to black screen with GPU fans spinning immediately to 100%. Used DDU and re-installed 522.25, and was again fine.

However, yesterday I updated to 526.86 because Modern Warfare 2 requires it. Cue more black screen crashes with 100% fan speed. However, I today was unable to fix the problem by simply rolling back to 522.25.

So, I reset Windows, and now have reinstalled 526.86, and I'm getting black screen / 100% fan speed crashes randomly at desktop every few minutes. No errors in windows logs beyond "the system shutdown was unexpected".

I'm going to try 522.25 again but I'm really at a loss. It doesn't seem like a hardware issue - the system worked fine for two weeks until I started updating the Nvidia driver. Or is it likely to be a hardware problem? Could any motherboard BIOS settings be problematic for the OS to lose communication with the GPU?


-EDIT-

RIP me. Now I can't even complete a port royal run at stock clocks without a black screen / 100% fan crash. No errors or visible problems. Do I have any options besides trying to RMA the card?


----------



## J7SC

...just tried 1.1V and ~ 400 W and mild oc...long way to go before full PL (in a day or two), but temps are great so far.


----------



## KingEngineRevUp

yzonker said:


> Actually, I'm finding the core clock gains from lowering temps to be no greater than on the 30 series. That LN2 run was most likely colder than -35C, as I think that's the lowest the card reads.
> 
> But speaking of core power draw, I should have taken screenshots, but I fired up my 3080 Ti machine to see how they really compare.
> 
> At a 450W PL on both cards:
> 
> 3080 Ti core: 280W
> 4090 core: 380W
> 
> No wonder we're struggling with block deltas. You'd have to pull 600W on the 3080 Ti to get the same core power!!
> 
> I'm also noticing the VRAM doesn't pull nearly as much power on the 4090. Must be the new 2GB chips. I think this is why it's running so cool with a water block, and it's also contributing to the OC falloff as temps go down.


Nice observations there. I don't have my 3080 TI anymore to test, but we have plenty of GPU Z screenshots of all of our cards. It makes sense. There's much more heat density


----------



## BigMack70

Is this an Nvidia driver error in windows event logs?



> The cplspcon service terminated with the following error: Unspecified error


That's the only thing I see that might mean anything when I'm black screen crashing.


----------



## lawson67

BigMack70 said:


> Is this an Nvidia driver error in windows event logs?
> 
> 
> 
> That's the only thing I see that might mean anything when I'm black screen crashing.


The cplspcon (IntelCpHDCPSvc.exe) process is part of the Intel HD Graphics driver installation on your PC


----------



## BigMack70

lawson67 said:


> The cplspcon (IntelCpHDCPSvc.exe) process is part of the Intel HD Graphics driver installation on your PC


Am I possibly having some kind of driver conflict between the discrete and integrated graphics?


----------



## lawson67

BigMack70 said:


> Am I possibly having some kind of driver conflict between the discrete and integrated graphics?


I've just googled your problem; 2 different people fixed "cplspcon service terminated with the following error: Unspecified error", same as you, with shutdowns and black screens.

Fix 1: Seems it was just the GPU which wasn't fitted properly lol.. I reseated it and no issues since then.

Fix 2: I fixed this problem by replacing my Power Supply Unit (PSU). Why? I figured it could be a voltage spike from my PSU or another component that would activate some sort of safety mechanism on my PC to turn off. I replaced a 1200W Asus Thor Platinum with an 850W EVGA Gold and it's been working great since then.

Now, your problem could also be the PSU since you have a high-TDP CPU, but it could also be another component being triggered by a voltage spike; at least that's my hypothesis.


----------



## BigMack70

lawson67 said:


> I've just googled your problem; 2 different people fixed "cplspcon service terminated with the following error: Unspecified error", same as you, with shutdowns and black screens.
> 
> Fix 1: Seems it was just the GPU which wasn't fitted properly lol.. I reseated it and no issues since then.
> 
> Fix 2: I fixed this problem by replacing my Power Supply Unit (PSU). Why? I figured it could be a voltage spike from my PSU or another component that would activate some sort of safety mechanism on my PC to turn off. I replaced a 1200W Asus Thor Platinum with an 850W EVGA Gold and it's been working great since then.
> 
> Now, your problem could also be the PSU since you have a high-TDP CPU, but it could also be another component being triggered by a voltage spike; at least that's my hypothesis.


Interesting. I'll try resetting the GPU if it happens again. I'd be surprised if it's a PSU issue, I've got a Corsair AX1000i Titanium that's only 3 years old and all my testing this afternoon after resetting windows has been with the whole system at stock settings. 

After reading your comment I immediately disabled my iGPU and started re-testing... No black screens yet. Hoping it was some kind of problem related to that.


----------



## alasdairvfr

sew333 said:


> And you know what? All 3-4 modes of DLSS are not working.
> 
> DLSS is not working. I am getting the same scores and fps on all 3 modes of DLSS. Any ideas? My drivers are up to date. DLSS Quality, Performance, Ultra Performance: no fps gain in the benchmark or in the game.
> 
> I compared benchmark scores with a YouTube user and he got 160fps at those settings in the benchmark, while I got 115fps because DLSS is screwed. Any ideas?
> 
> Here's someone with an RTX 4090 whose DLSS Auto is "working" in the benchmark:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> He gets 165 fps at the start. I have 115fps.
> I tried 3 different drivers, nothing helps.


Maybe not news, but CP2077 notoriously doesn't reliably apply settings without relaunching the game. It's possible the changed settings simply did not apply.


----------



## yzonker

J7SC said:


> ...just tried 1.1V and ~ 400 W and mild oc...long way to go before full PL (in a day or two), but temps are great so far.
> View attachment 2582490


Slightly better hotspot delta.


----------



## Travis Scott

anyone test 4090 in skyrim fallout 4 or fallout 76 ? working really good for me in 4k, vram maxing out 💰


----------



## J7SC

Arizor said:


> I also noticed, when putting it back together, that I could tighten the screws in all areas much more than they were previously.
> 
> So maybe just take a screwdriver and double-check the tightening? My bracket especially was a bit loose.


Repasting and retightening the stock cooler is a valuable step (if done carefully of course) because while the components such as GPU die and VRAM have strong quality control, sometimes the screwing together process leaves a lot to be desired...below in pic 1 is just how my card came apart - the upper left half shows some thermal strips (VRM area) not making good contact, and I don't think the TIM application is the greatest I've ever seen, either... This card could always boost up really high, but then drop something like 4 or more clock bins during a bench run. I've seen the hotspot hit over 94 C with a delta in the mid-20s. That problem is solved now, and then some - fyi, I used the same steps and materials I used with my last two hi-po cards which have exactly the same delta they had 18 months ago. 

Finally, this dual mobo single case work+play build simply doesn't have the room to fit the 4090 with a stock cooler (pic 2, the 4090 temporarily sits outside and in front of the build). With an already-built and proven loop, putting a water-block on the Gigabyte G-OC 4090 was a no-brainer. Over here, water-cooling will not automatically affect warranties (...anymore) but I took lots of pictures plus temperature screenies with the hotspot going bananas anyway if there ever is a warranty issue.


----------



## neteng101

Finally got to comparing MSI bios options - Gaming Trio vs. Suprim X (Air) vs. 600w Suprim X Liquid... at 480w power limit. The Liquid gave the highest score, but the X (Air) was just a few points behind, so it's within margin of error; they're basically the same. Because the Gaming Trio is not factory overclocked, I had to tweak the core and couldn't replicate the same OC as on the Suprim bioses. Hardware Unboxed tested 2 of the cards and found the Gaming Trio's bios is set to run fans much slower - I can confirm: the Gaming bios runs fans around 1500rpm, and I saw 2000rpm on the X Air... the 600w Liquid bios doesn't run both fans at the same speed, with fan 1 going to 2200rpm and fan 2 a bit less.

Overall, unless you really want to push it, the 520w Suprim X (Air) bios seems to be the best option for a Gaming Trio - exceeding the Trio's own bios. Fan speeds on the Trio bios are way too slow. Unless you're going to tinker with a voltage overclock @1.1V, you won't really need the 600w power limit either.


----------



## Rcmorr09

BigMack70 said:


> OK so now I'm getting quite vexed. I used my 4090 for a few weeks with no issues. Crashed to black screen once or twice which threw errors in Windows logs. Lowered memory OC and then was stable.
> 
> Then, I updated my graphics driver from 522.25, and started experiencing regular crashes to black screen with GPU fans spinning immediately to 100%. Used DDU and re-installed 522.25, and was again fine.
> 
> However, yesterday I updated to 526.86 because Modern Warfare 2 requires it. Cue more black screen crashes with 100% fan speed. However, I today was unable to fix the problem by simply rolling back to 522.25.
> 
> So, I reset Windows, and now have reinstalled 526.86, and I'm getting black screen / 100% fan speed crashes randomly at desktop every few minutes. No errors in windows logs beyond "the system shutdown was unexpected".
> 
> I'm going to try 522.25 again but I'm really at a loss. It doesn't seem like a hardware issue - the system worked fine for two weeks until I started updating the Nvidia driver. Or is it likely to be a hardware problem? Could any motherboard BIOS settings be problematic for the OS to lose communication with the GPU?
> 
> 
> -EDIT-
> 
> RIP me. Now I can't even complete a port royal run at stock clocks without a black screen / 100% fan crash. No errors or visible problems. Do I have any options besides trying to RMA the card?


There was a person on Reddit who was black screening; he didn't mention errors in Windows, but it turned out to be his connector going bad (melting).


----------



## BigMack70

Rcmorr09 said:


> There was person on reddit who was black screening, didn't mention errors in windows but it was his connector going bad(melting).


I've been keeping an eye on the adapter connector - I've opened my front panel and touched it on most of the crashes, and cannot feel any meaningful heat on the connector. I've sometimes just checked it on long intensive gaming sessions like in Cyberpunk and it's never felt more than slightly above the case ambient temp. I've not yet unplugged/replugged it to inspect, but maybe I should.

Also had the thought tonight that I should probably open up the back of my case and inspect the PSU side of the connectors, which I have not done.

-edit- Inspected the adapter. It looks good.


----------



## doubledoubt

Long-time (kinda) lurker, occasional (sorta) poster. Sad to have to report that after wasting an entire day trying to troubleshoot why my brand spanking new Suprim X (non-liquid) is not running at x16 in the same slot where his comrades before him ran flawlessly at x16, I have concluded that an RMA is needed.

My motherboard (an ASUS ROG Maximus Z690 Extreme) BIOS is up to date. The PCIe slot shares lanes with M2_1, which is empty and always has been. I have seated and reseated numerous times, and tried it with my LINKUP vertical GPU riser that does support PCIe 4.0 x16 too, with zero luck. GPU-Z shows it at x8, and the 3DMark PCI Express feature test shows 10-12 GB/s. I take it out, put my FE in (glad I still have it!) and voila: x16 all the way... On this same motherboard, my former EVGA 3090 KP, 4090 Gaming OC, and 4090 FE *ALL* ran at x16.

Is it possible I'm missing something else?!

Edit: Oh yeah I did triple, quadruple check the PSU connectors too. Everything is very tight and secure.
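As a sanity check (editor's note, not from the thread): the 10-12 GB/s feature-test result lines up with a Gen4 x8 link once you do the math. A minimal sketch in Python; the figures are the PCIe spec's theoretical per-direction rates before protocol overhead, so real benchmark numbers land somewhat lower:

```python
def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    """Theoretical one-direction PCIe bandwidth in GB/s for a given generation and lane count."""
    # Per-lane transfer rate (GT/s) and line-encoding efficiency per generation.
    rates = {3: (8.0, 128 / 130), 4: (16.0, 128 / 130), 5: (32.0, 128 / 130)}
    gt_per_s, encoding = rates[gen]
    return gt_per_s * encoding / 8 * lanes  # divide by 8 bits per byte

# A Gen4 x8 link tops out at ~15.75 GB/s before overhead, so a measured
# 10-12 GB/s is consistent with x8; a working x16 link would allow ~31.5 GB/s.
print(pcie_bandwidth_gbps(4, 8))   # ~15.75
print(pcie_bandwidth_gbps(4, 16))  # ~31.5
```

Seeing roughly half the x16 ceiling in the 3DMark test is the tell that the slot negotiated x8, matching what GPU-Z reported.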


----------



## doubledoubt

DokoBG said:


> I think i will be sticking to the standard Suprim X Gaming bios. I got it also updated to the latest version through the MSI Centre app few days ago. Works flawlessly on my Gaming Trio.


Curious how did you do that? When I installed the MSI Center app in Windows, it presented me with various other apps to control things. I opted to install the Graphic Fan Tool and Mystic Light. But neither really asked me if I wanted to update anything...?

Edit: Never mind; found it!


----------



## J7SC

What is the consensus on NVidia driver 526.86 ? According to several posts here, it is supposed to be a bit faster...but as near as I can tell, some folks have some problems with it and are rolling back ? Tx


----------



## mirkendargen

J7SC said:


> What is the consensus on NVidia driver 526.86 ? According to several posts here, it is supposed to be a bit faster...but as near as I can tell, some folks have some problems with it and are rolling back ? Tx


A little faster, but just like every other driver after the launch one so far, it makes my secondary monitor randomly lose signal until I reboot fairly regularly and isn't usable. I haven't seen anyone else complaining about this...but it definitely doesn't happen on the launch driver so I don't think it's anything with my hardware.


----------



## J7SC

mirkendargen said:


> A little faster, but just like every other driver after the launch one so far, it makes my secondary monitor randomly lose signal until I reboot fairly regularly and isn't usable. I haven't seen anyone else complaining about this...but it definitely doesn't happen on the launch driver so I don't think it's anything with my hardware.


Thanks. I will eventually run two monitors with different refresh rates, but for now it's just the C1 OLED.


----------



## mirkendargen

J7SC said:


> Thanks. I will eventually run two monitors with different refresh rates, but for now it's just the C1 OLED.


An LG CX is my main monitor and it's never lost signal, a 120hz 1440p monitor connected via DP is the one that dies. Unplugging it/plugging it back in, plugging it into a different port, nothing fixes it when it happens other than a reboot. It's very odd.


----------



## Nizzen

J7SC said:


> What is the consensus on NVidia driver 526.86 ? According to several posts here, it is supposed to be a bit faster...but as near as I can tell, some folks have some problems with it and are rolling back ? Tx


There is always someone with problems. Millions of hardware combinations; BIOS, firmware, and software combinations. Then we have user failures 

If the driver works for you, it's great for you. There is no rule for everyone


----------



## Nizzen

J7SC said:


> Thanks. I will eventually run two monitors with different refresh rates, but for now it's just the C1 OLED.


C1 is a big tv 

This is a Oled monitor


----------



## J7SC

Nizzen said:


> C1 is a big tv
> 
> This is a Oled monitor
> View attachment 2582542


Ha - obviously your monitor is crooked  

...just played the updated (per yesterday's 24 GB patch) FS2020 '4K ultra' w/DLSS quality + frame generation on the 4090 and that beautiful OLED...110 fps++


----------



## Nizzen

J7SC said:


> Ha - obviously your monitor is crooked
> 
> ...just played the updated (per yesterday's 24 GB patch) FS2020 '4K ultra' w/DLSS quality + frame generation on the 4090 and that beautiful OLED...110 fps++


Oled is meta 

The Oled 42" was too big for me, so finally got the 34" ultrawide Oled.


----------



## dr/owned

Nizzen said:


> C1 is a big tv
> 
> This is a Oled monitor


AW3423DW: Had it and hated it. Text was sooooo bad with the subpixel layout on that thing.


----------



## Aaq

5800X3D with 4090FE. Everything @stock.


----------



## schoolofmonkey

Aaq said:


> View attachment 2582550
> 
> 
> 5800X3D with 4090FE. Everything @stock.


Dare you to run it at Psycho Ray Tracing and Quality DLSS....


----------



## Aaq

schoolofmonkey said:


> Dare you to run it at Psycho Ray Tracing and Quality DLSS....


----------



## Nizzen

dr/owned said:


> AW3423DW: Had it and hated it. Text was sooooo bad with the subpixel layout on that thing.


Is this fixed with new revisions? Because I can't see a problem with text. Maybe I'm too old 😆


----------



## EarlZ

I just got my Aorus Master 4090 and it's pretty much impossible to close the Corsair 5000D side panel without pushing into the power adapter. I've placed an order for the CableMod 16-pin to 4x PCIe.

I am getting a max boost of 2805 MHz at 100% power limit in Superposition. Dunno if that's a low boost or acceptable. Just running the card at 70% power limit for now until the new cables arrive.


----------



## schoolofmonkey

Aaq said:


> View attachment 2582552


Same as I got on my 1440p, still very playable; my 3090 couldn't pull those FPS at all..


----------



## J7SC

Did some more tiptoeing towards higher loads with the water-cooling heat-cycling after finishing the water block install earlier. I subbed a couple of things but the rest will have to wait. It is a nice cozy 25 C in the room (2 C outside) and nobody in my family wants the heat off at 3 am while they're sleeping.

I am encouraged by the hotspot delta staying below 10 C even beyond 530 W, and VRAM now goes almost +100 MHz higher than before from what I can tell - I don't really know the new boundaries yet, but the prior heat-related limitations are gone... straight horizontal lines on the clocks are a good sign . For now, I am not running 'prefer max performance' in NVCP yet.


----------



## th3illusiveman

neteng101 said:


> Guess my terrible memory isn't all that bad, split the difference... 13700k, DDR4 3866CL19-23-23-46. I take it CP2077 is CPU/system bound at 1440p to an extent?
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2582467


This is mine... looks like 2077 doesn't care much for V-cache - I was CPU bottlenecked and GPU usage never went to 100%: mid to high 80s in the indoor scene and high 80s to mid 90s in the outdoor scene. 5800X3D (PBO2 enabled) and stock 4090 FE. 









Luckily at 4K, my native rez, I get 110/47/147 (avg/min/max) with GPU usage almost pegged at 95%+...


----------



## yzonker

th3illusiveman said:


> This is mine... looks like 2077 doesnt care much for Vcache - i was CPU bottlenecked and GPU usage never went to 100, was mid to high 80s indoor scene and high 80s to mid 90s outdoor scene. 5800X3D (PBO2 enabled) and stock 4090 FE.
> View attachment 2582562
> 
> 
> Luckily at 4K, my native rez - i get 110/ 47/ 147 (ave/min/max) almost pegged 95%+ GPU usage...


It's still GPU dependent though I think. Both CPU/GPU. So you'll score lower if you are running the 4090 at stock.


----------



## th3illusiveman

Aaq said:


> View attachment 2582552


Now that's interesting... your score is quite a bit higher than mine, same hardware... the only two things I can think of are the crowd density setting and perhaps RAM - if I turn crowd density to low, I get 140 avg fps... my RAM is pretty crappy: 3600 / CL16-19-19-39-85.



yzonker said:


> It's still GPU dependent though I think. Both CPU/GPU. So you'll score lower if you are running the 4090 at stock.


Nope, if I run +150 core / +750 mem I get the same score within some error-margin variance. Heavy CPU bottleneck with crowd density on high for me. I kinda suspected the new Intel CPUs would not have that issue due to clock speeds. I bet the new AM5s would also do better.


----------



## yzonker

th3illusiveman said:


> now thats interesting... your score is quite abit higher than mine, same hardware... only two things i can think of is the crowd density setting and perhaps RAM - if i turn crowd density to low, i get 140 ave fps... my ram is pretty crappy 3600/ CL16-19-19-39-85.
> 
> 
> Nope, if i run a +150 core/ +750 mem i get the same score w/ some error margin variance. Heavy CPU bottleneck with crowd density on high for me. I kinda suspected the new intel CPUs would not have that issue due to clock speedss. I bet the new AM5s would also do better.


That's the problem with benchmarking games. It's easy to have inadvertently changed some settings that skew the results.


----------



## Aaq

th3illusiveman said:


> now thats interesting... your score is quite abit higher than mine, same hardware... only two things i can think of is the crowd density setting and perhaps RAM - if i turn crowd density to low, i get 140 ave fps... my ram is pretty crappy 3600/ CL16-19-19-39-85.
> 
> 
> Nope, if i run a +150 core/ +750 mem i get the same score w/ some error margin variance. Heavy CPU bottleneck with crowd density on high for me. I kinda suspected the new intel CPUs would not have that issue due to clock speedss. I bet the new AM5s would also do better.


This is with crowd density set to low. Psycho Ray Tracing and Quality DLSS. I used the Ultra preset and only changed those earlier mentioned settings.
Hardware used: 5800X3D with 4090 FE 










Every settings tab set to default with Ultra preset:


----------



## th3illusiveman

Aaq said:


> This is with crowd density set to low. Psycho Ray Tracing and Quality DLSS. I used the Ultra preset and only changed those earlier mentioned settings.
> 
> View attachment 2582564


This is mine same settings,








Now seems to be within system variance - GPU was at 97% ~2750mhz for the run (stock). TY for the updated info/numbers


----------



## Aaq

th3illusiveman said:


> This is mine same settings,
> View attachment 2582565
> 
> Now seems to be within system variance - GPU was at 97% ~2750mhz for the run (stock). TY for the updated info/numbers


Your score seems fine to me. The FPS can actually change quite a bit; I've run it a few times and it can vary by a few fps.


----------



## Panchovix

New NVIDIA Profile Inspector release, after ~2 years since the last update.









Release 2.4.0.1 · Orbmu2k/nvidiaProfileInspector


Shadercache - Cachesize added #88 Background Application Max Frame Rate added #79 Disable Ansel setting is back and working #54 #87 Ultra Low Latency setting is back #49 added workaround for some s...




github.com


----------



## yzonker

I reset everything to defaults, exited, then ran the bench again. The only other change is I ran on the new Nvidia driver and disabled HT. So that feels a little better _assuming_ this is the correct config now.


----------



## Aaq

yzonker said:


> I reset everything to defaults, exited, then ran the bench again. The only other change is I ran on the new Nvidia driver and disabled HT. So that feels a little better _assuming_ this is the correct config now.
> 
> View attachment 2582568


Very impressive! Intel title . Damn, 100 min fps must be nice.


----------



## jootn2kx

For those who play Cyberpunk: use DLDSR in the Nvidia Control Panel and select the 2.25x factor.
I play at 5160x2160 resolution with DLSS on Ultra Performance.
I'm getting way better clarity compared with native 4K + DLSS on Quality  also textures look sharper.
It also helps with the CPU bottleneck; I'm actually getting a more stable ~120 fps now with crowd density on medium.

This card is an absolute beast
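For anyone wondering where 5160x2160 comes from (editor's note): the DSR/DLDSR factor multiplies the total pixel count, so each axis scales by the square root of the factor. A quick sketch; the helper name is made up for illustration:

```python
import math

def dldsr_resolution(width: int, height: int, factor: float) -> tuple:
    """Render resolution for a DSR/DLDSR pixel-count multiplier applied to a native resolution."""
    scale = math.sqrt(factor)  # the factor applies to total pixels, so each axis scales by its square root
    return (round(width * scale), round(height * scale))

# 2.25x on a 3440x1440 ultrawide scales each axis by 1.5
print(dldsr_resolution(3440, 1440, 2.25))  # (5160, 2160)
```

The same math gives 5760x3240 for 2.25x on a 4K panel, which is why the factor hits the GPU much harder at 4K than on an ultrawide.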


----------



## Sheyster

Nizzen said:


> Oled is meta
> 
> The Oled 42" was too big for me, so finally got the 34" ultrawide Oled.


Good chance I will "down grade" to the ASUS 42" PG42UQ OLED (LG C2 panel). The 48" CX I'm using now is a little too big, but the ASUS is just right, not to mention a slight bump up to 138 Hz.


----------



## Sayenah

Sheyster said:


> Good chance I will "down grade" to the ASUS 42" PG42UQ OLED (LG C2 panel). The 48" CX I'm using now is a little too big, but the ASUS is just right, not to mention a slight bump up to 138 Hz.


I have been thinking about a smaller C2 panel myself. Why not the C2 itself? Why the Asus?


----------



## yzonker

An FE finally melted. 



Spoiler: Reddit link 





__
https://www.reddit.com/r/nvidia/comments/yu5gm5


----------



## Sheyster

Sayenah said:


> I have been thinking about a smaller C2 panel myself. Why not the C2 itself? Why the Asus?


Several reasons:

The ASUS is brighter, it has a heat sink and ability to fully disable ABSL.
Refresh rate is slightly higher, 138 Hz.
Input lag is lower than the C2.
Has more PC friendly options such as a DP input.
The main gripe about the ASUS is the semi-matte coating they use on the panel. It doesn't quite "pop" as much as the LG C2. This can be compensated for somewhat with settings.


----------



## Sayenah

yzonker said:


> An FE finally melted.
> 
> 
> 
> Spoiler: Reddit link
> 
> 
> 
> 
> 
> __
> https://www.reddit.com/r/nvidia/comments/yu5gm5


it is tough for the dude involved, but I still say these guys are simply not inserting the plug in properly.


----------



## Sayenah

Sheyster said:


> Several reasons:
> 
> The ASUS is brighter, it has a heat sink and ability to fully disable ABSL.
> Refresh rate is slightly higher, 138 Hz.
> Input lag is lower than the C2.
> Has more PC friendly options such as a DP input.
> The main gripe about the ASUS is the semi-matte coating they use on the panel. It doesn't quite "pop" as much as the C2 panel. This can be compensated for somewhat with settings.


 Thanks for the reply. Now you are convincing me on the Asus; let me do some more research


----------



## Sheyster

Sayenah said:


> Thanks for the reply. Now you are convincing me on the Asus; let me do some more research


Sure NP. The only hesitation I have is ASUS improving on it for next year. I can see them pushing the refresh rate higher if LG's next panel can support that.


----------



## ESRCJ

For those with a waterblock, how is your GPU die-over-fluid delta? 

Also, based on some earlier comments, it appears the early versions of the EK block had a machining error. I pre-ordered mine and it arrived a few weeks ago. I just got a 4090 FE and haven't installed the WB yet. Have they fixed the machining error on their newer 4090 FE blocks?


----------



## KingEngineRevUp

Sheyster said:


> Several reasons:
> 
> The ASUS is brighter, it has a heat sink and ability to fully disable ABSL.
> Refresh rate is slightly higher, 138 Hz.
> Input lag is lower than the C2.
> Has more PC friendly options such as a DP input.
> The main gripe about the ASUS is the semi-matte coating they use on the panel. It doesn't quite "pop" as much as the LG C2. This can be compensated for somewhat with settings.


And the price. The C2 has gone on sale regularly now and black Friday is right around the corner.

Personally, I like that the C2 has its own native SoC with Netflix, etc. Plus having a remote control to change settings is nice.


----------



## Sheyster

KingEngineRevUp said:


> And the price. The C2 has gone on sale regularly now and black Friday is right around the corner.
> 
> Personally, I like that the C2 has it's own native SoC and it has Netflix, etc. Plus having a remote control to change settings is nice.


If you need the TV features, apps, etc. absolutely get the C2! However, I use it as a monitor for PC gaming only (primarily FPS), nothing else. I can appreciate the lower input lag, slightly higher refresh rate and increased/consistent brightness more than most.


----------



## jeiselramos

4090 Suprim X | Gaming Bios
+120 +2000 passed 10 loops of MEE


----------



## KingEngineRevUp

Sheyster said:


> If you need the TV features, apps, etc. absolutely get the C2! However, I use it as a monitor for PC gaming only (primarily FPS), nothing else. I can appreciate the lower input lag, slightly higher refresh rate and increased/consistent brightness more than most.


I have the C2 as well. It's mainly used for work, splitting four windows, one in each corner.

But it's my gaming monitor also. Got it for $900+tax, which I think is a great deal when compared to the ASUS monitor.


----------



## motivman

Man, so bummed with my 4090 Gaming OC. I won the lottery for the memory, it will run +1800 in Afterburner, but the core... geez man, I just can't score a 4090 that does well on the core. Max core clock is around 3045, but the effective clock is almost 100 MHz lower? It's crazy. My FE, which does around +1650 on the memory and 3000 MHz max on the core, beats my Gigabyte because its effective clock is only 30 MHz lower than the core clock on average. This is frustrating, but I am done hunting for a golden 4090... I am just gonna hang my head in shame and accept my LOSS this gen, SMH.


----------



## Sheyster

motivman said:


> man, so bummed with my 4090 Gaming OC. I won the lottery for the memory, will run +1800 in afterburner, but the core... geez man, I just cant score a 4090 that does good on the core. Max core clock is around 3045, but effective clock is almost 100 points lower? its crazy. My FE that does around +1650 on the memory, and 3000mhz max on the core beats my gigabyte because its effective clock is only on average 30mhz lower than the core clock. This is frustrating, but I am done hunting for a golden 4090... I am just gonna hang my head down in shame, and accept my LOSS this gen, SMH.


So basically your G-OC is the opposite of mine.. It's a double lottery this time, core and memory.  Maybe it's Hail Mary time, try a Strix... 😜


----------



## motivman

Sheyster said:


> So basically your G-OC is the opposite of mine.. It's a double lottery this time, core and memory.  Maybe it's Hail Mary time, try a Strix... 😜


yeah, when they are more readily available in stores, I might try a strix, but for now, I am done... plus this whole cable fiasco? I might retire my FE for now (such a beautiful card compared to the UGLY gigabyte gaming OC) and go back to a cheap 30 series while nvidia figures their crap out with the melting cables.


----------



## cheddardonkey

Anyone have an aorus master bios they are willing to extract and share?


----------



## dr/owned

I'm not happy with the $250 price tag so maybe I'll return it if Bykski comes out with their block for MSI.

If someone wants to look at the HWinfo for being shunted (stock Trio bios). Seems it still has awareness of the true core? (NVVDD) power output:










From eyeballing my whole-house power monitoring, it goes +500W when I start Furmark.


----------



## J7SC

dr/owned said:


> View attachment 2582605
> 
> 
> I'm not happy with the $250 price tag so maybe I'll return it if Bykski comes out with their block for MSI.
> 
> If someone wants to look at the HWinfo for being shunted (stock Trio bios). Seems it still has awareness of the true core? (NVVDD) power output:
> 
> View attachment 2582609
> 
> 
> From eyeballing my whole-house power monitoring, it goes +500W when I start Furmark.


Phanteks make my favourite water-cooling blocks (CPU, GPU) and the 3090 Strix block I have performs flawlessly, as do 2x X570 and 1x TR CPU Phanteks blocks. The only annoyance is the fact that when vertically mounted, the Glacier GPU 3090 block has this annoying air-bubble at top left, no matter how many times you got rid of it (others have reported the same issue)...it's a minor annoyance given the overall quality and performance, though.

All that said, the price of the Bykski is right...the first Bykski block is still trucking along just fine after 18 months on another GPU, and if the 4090 Giga block from Bykski performs the same over time (i.e. nickel coating), I am happy. I bought the 4090 Giga Gaming-OC because it was at an entry-level 4090 price - and available. I could have opted for the Strix (nice card) but I am very happy with my sample, and saved a hefty premium. Now with the water-block (US$ 122.95 +shipping, includes backplate and remotes), the card can stretch its legs a bit -temps are not an issue anymore per below (from a 1k+ Speedway) at ~ 25 C ambient


----------



## newls1

dr/owned said:


> View attachment 2582605
> 
> 
> I'm not happy with the $250 price tag so maybe I'll return it if Bykski comes out with their block for MSI.
> 
> If someone wants to look at the HWinfo for being shunted (stock Trio bios). Seems it still has awareness of the true core? (NVVDD) power output:
> 
> View attachment 2582609
> 
> 
> From eyeballing my whole-house power monitoring, it goes +500W when I start Furmark.


I too ordered a Phanteks block, but for my Gigabyte Gaming OC card. I can't freaking wait! Hoping it will be here in a few days and the loop will be ready for it.


----------



## KingEngineRevUp

Anyone buying a block, just remember these cards have a higher power to die ratio when compared to board power draw.

Don't expect the same kind of deltas as the 30 series.

That being said, I might be receiving my EKWB tomorrow. Let's see if they figured out any QC issues on their end. Hopefully I can report better deltas but I'm not hopeful.
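To put the point about deltas in rough numbers (editor's note): the die-over-coolant delta scales as die power times the thermal resistance of the die-to-coolant stack, so a die pushing more power through the same block runs a larger delta. A sketch with purely illustrative, assumed values for die power and block resistance:

```python
def die_delta(power_w: float, r_theta_c_per_w: float) -> float:
    """Expected die-over-coolant temperature delta (C) for a given die power
    and combined thermal resistance of die -> TIM -> cold plate -> coolant."""
    return power_w * r_theta_c_per_w

# Illustrative values only, not measurements: if ~350 W of a 450 W board
# limit flows through the die and the block stack is ~0.04 C/W, expect a
# ~14 C delta; a die drawing ~250 W through the same stack sits near ~10 C.
print(die_delta(350, 0.04))  # 14.0
print(die_delta(250, 0.04))  # 10.0
```

That's why even a perfectly mounted block can't match 30-series deltas on these cards: the extra die power goes straight into the delta unless the stack's resistance improves.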


----------



## newls1

KingEngineRevUp said:


> Anyone buying a block, just remember these cards have a higher power to die ratio when compared to board power draw.
> 
> Don't expect the same kind of deltas as the 30 series.
> 
> That being said, I might be receiving my EKWB tomorrow. Let's see if they figured out any QC issues on their end. Hopefully I can report better deltas but I'm not hopeful.


I don't understand what you mean?


----------



## J7SC

KingEngineRevUp said:


> Anyone buying a block, just remember these cards have a higher power to die ratio when compared to board power draw.
> 
> Don't expect the same kind of deltas as the 30 series.
> 
> That being said, I might be receiving my EKWB tomorrow. Let's see if they figured out any QC issues on their end. Hopefully I can report better deltas but I'm not hopeful.


...very true on the power-to-die ratio... I still recommend a water block though, because for 450 W - 600 W they are much safer (never mind the size difference compared to an air cooler). My 4090 had serious heat problems that could have turned fatal at anything over 450 W. I may have only gained a bin or two on the GPU with the w-block, but VRAM picked up a nice 80 MHz - 100 MHz at its most efficient point.


----------



## yzonker

Ran these with the new driver. Some improvements.









I scored 11 125 in Speed Way


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com













I scored 37 904 in Time Spy


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com













I scored 20 139 in Time Spy Extreme


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





All I did was match my previous PR run though (it was slightly lower). Kinda disappointed in that.









I scored 29 011 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## changboy

I put my water block on today and the system is running, but I can't put load on the GPU because the hot spot hits 100 C.
My 2 mm thermal pads are too thick; I'll need to drain my loop tomorrow and do it again. I think I'll just use the thermal pads that came with the block; they're not 1.8 mm, maybe more like 1.4 mm. Even when I press on my pads they are too thick, and even the 1.5 mm ones I ordered may not be right.
I hate working like this for nothing in the end, but it will work next time lol.


----------



## sew333

Hi. I have 12900K stock

32 GB 2x16 GB DDR4 3600mhz

Palit Gamerock Pro OC rtx 4090

1300W Seasonic Platinum Prime

Aorus Elite DDR4 Z690

Windows 11 2H22. Updated new nvidia drivers.

Game is Cyberpunk 2077.

I used 1440p, RT Ultra, rest Ultra, for testing fps.

And you know what? All 3 modes of DLSS are not working.

I am getting the same scores and fps on all 3 DLSS modes. Any ideas? My drivers are up to date; it was the same on previous drivers.



DLSS Quality, Performance, Ultra Performance: no fps gain in benchmark or in game.

I compared benchmark scores with a YouTube user who got 145 fps at those settings in the benchmark, while I get 115 fps because DLSS is screwed. Any ideas? Changing DLSS does not improve fps; only GPU usage goes down.

Here someone with Rtx 4090 and hes dlss Ultra Performance is "working" in benchmark:















My BIOS is updated to F7, but F20 is the newest btw.


Any ideas? I assume it's not because I have 3600 MHz DDR4? What do you think? My fps is just the same no matter the DLSS setting. Thanks, and sorry for the bad language. Ah, my 3DMark scores are fine and fps in other games is fine.


----------



## Arizor

@sew333 are you restarting the game each time you change DLSS setting? Cyberpunk notoriously does not actually change the setting until you restart the game, especially with DLSS.


----------



## sew333

Arizor said:


> @sew333 are you restarting the game each time you change DLSS setting? Cyberpunk notoriously does not actually change the setting until you restart the game, especially with DLSS.


Yeah, I tried that. DLSS Quality, Ultra Performance, Balanced: the same fps in game and benchmark. (( Thanks for the reply. It's a very weird situation.

I tested DLSS in Port Royal and it works normally: 110 fps without DLSS, 250 fps with DLSS 3.

Only CP2077


----------



## anthony.baucher

changboy said:


> I put my water block today and now my system running but i cant put load on my gpu coz temp raise at 100c (hot spot).
> My 2mm thermal pad are to thick, i will need drop my liquid tomorrow and do it again, i think i will just use the thermal pad came with the block. they are not 1.8mm maybe 1.4mm. Even i push on my thermal pads they are to thick. Even i order 1.5mm i not sure it will be good.
> I hate working like this for nothing at the end, but it will work the next time lol.



Same with 2.0 mm OC365 pads...
The 1.8 mm ones are more like 1.5 mm in fact.


----------



## yzonker

sew333 said:


> Yeah i try that. DLSS QUALITY,ULTRA PERFORMANCE,BALANCED the same fps in game and benchmark. ((. Thanks for reply. Its very weird situation.
> 
> I tested dlss in port royal and its working normally. 110fps without dlss. 250fps with dlss 3.
> 
> Only CB2077


Does the image quality change any? Or visually the same also?


----------



## bigfootnz

Krzych04650 said:


> View attachment 2582029
> 
> 4090 can still be murdered with SGSSAA. LOTRO, 3840x1600, 8xMSAA+8xSGSSAA. 50 FPS and massive power throttling from 3000 target to 2700s. It would probably pull like 800W here if allowed. It is not modern RT games that is the most difficult test.


Can you please let me know what program you are using for overlay info, as it looks great. Thanks


----------



## GRABibus

sew333 said:


> Hi. I have 12900K stock
> 
> 32 GB 2x16 GB DDR4 3600mhz
> 
> Palit Gamerock Pro OC rtx 4090
> 
> 1300W Seasonic Platinum Prime
> 
> Aorus Elite DDR4 Z690
> 
> Windows 11 2H22. Updated new nvidia drivers.
> 
> Game is Cyberpunk 2077.
> 
> I used 1440,RT ULTRA,rest ultra for testing fps.
> 
> And you know? All 3 modes of DLSS not working.
> 
> Dlss is not working. I am gettin the same scores and fps on all 3 modes of dlss. Any ideas? my drivers is up to date. On previous drivers the same.
> 
> 
> 
> DLSS Quality,performance,ultra performance ,no fps gain in benchmark and game.
> 
> I compare benchmark scores with youtube user and he got 145fps on that settings in benchmark, when me 115fps because dlss is screwed. Any ideas? Dlss changing not improve fps. Only gpu usage goes down.
> 
> Here someone with Rtx 4090 and hes dlss Ultra Performance is "working" in benchmark:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My bios is updated to F7. But F20 is newest btw.
> 
> 
> Any ideas? I bet is not because i have 3600 MHZ DDR4? What you think ? Just my fps is the same no matter what dlss setting. Thanks and sorry for bad language. Ah my 3dmark scores are fine in other games fps is fine.


Maybe because you are CPU bound.

Try at 4K and see if it scales when playing with DLSS modes.


----------



## sew333

yzonker said:


> Does the image quality change any? Or visually the same also?


I see it's more blurred on Ultra Performance, so it's basically working.


----------



## sew333

GRABibus said:


> maybe because your are CPU bound.
> 
> Try at 4K and see if it scales when playing with DLSS modes.


In the benchmark it must work; it shouldn't be CPU-bound there. I tried reformatting the system and it's the same. Maybe I should use DDU to clean out the drivers and reinstall? I don't normally use DDU when updating drivers.


----------



## GRABibus

sew333 said:


> In benchmark it must work,shouldnt be bound in benchmark. . I tried formatting system and the same. Maybe should i use DDU and reclean drivers and install again? I dont using normally DDU when updating drivers.


I don't know how the benchmark works, but at 1440p with a 12900K you are CPU-bound with the 4090, whatever the game.

Did you try 4K?


----------



## yzonker

sew333 said:


> In benchmark it must work,shouldnt be bound in benchmark. . I tried formatting system and the same. Maybe should i use DDU and reclean drivers and install again? I dont using normally DDU when updating drivers.


Oh yes, it is CPU dependent at those framerates. If the image is visually changing, then DLSS is working and you are most likely CPU bound. Like others suggested, run it in 4K.


----------



## Zero989

sew333 said:


> Hi. I have 12900K stock
> 32 GB 2x16 GB DDR4 3600mhz
> Palit Gamerock Pro OC rtx 4090
> 1300W Seasonic Platinum Prime
> Aorus Elite DDR4 Z690
> Windows 11 2H22. Updated new nvidia drivers.
> 
> 
> 
> I used 1440,RT ULTRA,rest ultra for testing fps.
> 
> And you know? All 3-4 modes of DLSS not working.
> 
> Dlss is not working. I am gettin the same scores and fps on all 3 modes of dlss. Any ideas? my drivers is up to date. DLSS Quality,performance,ultra performance ,no fps gain in benchmark and game.
> 
> I compare benchmark scores with youtube user and he got 160fps on that settings in benchmark, when me 115fps because dlss is screwed. Any ideas?
> 
> Here someone with Rtx 4090 and hes dlss AUTO is "working" in benchmark:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> He is getting at start 165 fps. I have 115fps.
> I tried all 3 different drivers nothing helps.


It likes DDR5. If DLSS isn't working, try verifying the game files on version 1.61. I could easily beat his score with more CPU OC, though; I'm only at 5670 MHz P-cores, 4.95 ring, and 4.5 E-cores.

With my daily OC:


----------



## sew333

Zero989 said:


> It likes DDR5. If you don't have DLSS working try verifying game files w/ 1.61 version. I could easily beat his score though with more CPU OC. I'm only at 5670Mhz P, 4.95 ring and 4.5 E cores.
> 
> With my daily OC:
> 
> View attachment 2582655


Your scores are fine. Mine are identical no matter which DLSS mode I pick: Quality and Ultra Performance give the same fps. So effectively only DLSS Quality is working; the other modes are not.


----------



## Zero989

sew333 said:


> Your scores are fine. My scores are identical no matter what dlss. So QUALITY,ULTRA PERFORMANCE i have identical fps. So only DLSS QUALITY is working. Other dlss modes not.











NVIDIA DLSS DLL (2.5.0) Download (www.techpowerup.com)

This download provides various versions of NVIDIA's DLSS DLL.





Place in ..\Cyberpunk 2077\bin\x64
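For anyone nervous about overwriting game files by hand, here is a minimal Python sketch of the swap. The paths are examples only, and `nvngx_dlss.dll` is assumed to be the DLL name shipped in that TechPowerUp package; adjust both to your own install.

```python
# Hypothetical helper: back up the game's bundled DLSS DLL and copy in a newer one.
# Paths below are illustrative; point game_bin at your own ..\Cyberpunk 2077\bin\x64.
import shutil
from pathlib import Path

def swap_dlss_dll(game_bin: Path, new_dll: Path) -> Path:
    """Replace nvngx_dlss.dll in game_bin with new_dll, returning the backup path."""
    target = game_bin / "nvngx_dlss.dll"
    backup = target.with_name(target.name + ".bak")
    shutil.copy2(target, backup)   # keep the original so you can roll back
    shutil.copy2(new_dll, target)  # drop in the downloaded DLL
    return backup

# Example (paths are illustrative):
# swap_dlss_dll(Path(r"C:\Games\Cyberpunk 2077\bin\x64"),
#               Path(r"C:\Downloads\nvngx_dlss.dll"))
```

If fps behaves oddly afterwards, copy the `.bak` file back over the original to undo the swap.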


----------



## sew333

Zero989 said:


> NVIDIA DLSS DLL (2.5.0) Download
> 
> 
> This download provides various versions of NVIDIA's DLSS DLL for download. In this one file, which is bundled with all games that support NVIDIA's
> 
> 
> 
> 
> www.techpowerup.com
> 
> 
> 
> 
> 
> Place in ..\Cyberpunk 2077\bin\x64


Thanks. If that doesn't work, should I update the motherboard BIOS?

The newest one was released just over a week ago: F20

2022/11/03:

Checksum : DFF2
Supports Intel 13th generation processor
Improve CPU OC stability, DDR compatibility & PCIe Gen5 stability


----------



## Zero989

sew333 said:


> Thx. If that will not work , should i update bios ?
> 
> Newest one released just a week+ ago: F20
> 
> 2022/11/03:
> 
> Checksum : DFF2
> Supports Intel 13th generation processor
> Improve CPU OC stability, DDR compatibility & PCIe Gen5 stability


Like a .000001% chance it's bios related but sure


----------



## sew333

Hmm, OK. Maybe it's because I'm on the 22H2 Windows version? I doubt it, though. Are you also on Windows 11 22H2?


----------



## Zero989

sew333 said:


> Hmmmmmmmmm oki. Maybe it is because i have 22H2 win version? But i doubt it. You have too 22H2 win 11?


Yep


----------



## sew333

Could the issue be that I uninstalled and reinstalled the drivers without using DDU? I doubt it, though.


----------



## Zero989

Just use clean install option during nvidia driver install


----------



## sew333

Zero989 said:


> Just use clean install option during nvidia driver install


Yeah, I always do that. Weird... so I should try updating that DLSS file? If that doesn't work I'm completely out of ideas. I'm very grateful for your help.


----------



## Zero989

sew333 said:


> yaeah i always doing that. Weird.......so try update that dlss file? If that will not work i am completely without ideas. But i am very greatful for your help


Yea try the DLSS file. I just checked my original version and it was from June.


----------



## sew333

Zero989 said:


> Yea try the DLSS file. I just checked my original version and it was from June.
> 
> View attachment 2582658


My DLSS version is 2.3.4.0. But will updating DLSS affect fps?


----------



## Zero989

sew333 said:


> My dlss version is 2.3.4.0. But updating dlss will affect fps?


Didn't really affect my fps at all negative or positive.

Also, give me your memory timings, including proof of Gear 1.


----------



## KingEngineRevUp

newls1 said:


> i dont understand what oyu mean?


For the 30 series, for every 100W of power drawn, 54W of that went to the die.

For the 40 series, for every 100W drawn, 87W of that goes to the die.

Therefore, drawing about 162W on the 30 series produced about the same die heat as a 40 series drawing 100W.

A 4090 running at 450W is like a 3090 running at 725W.

A 4090 running at 600W is like a 3090 running at 966W.

Those are rough numbers.
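The arithmetic behind those equivalences can be sketched as follows, taking the 54% and 87% die-power fractions quoted above as given:

```python
# Sketch of the die-heat equivalence described above, assuming 54% of board
# power reaches the die on the 30 series and 87% on the 40 series.
DIE_FRACTION_30 = 0.54
DIE_FRACTION_40 = 0.87

def equivalent_30_series_power(watts_40: float) -> float:
    """Board power a 30-series card needs to put the same heat into its die
    as a 40-series card drawing watts_40."""
    return watts_40 * DIE_FRACTION_40 / DIE_FRACTION_30

print(equivalent_30_series_power(450))  # ~725 W
print(equivalent_30_series_power(600))  # ~967 W, i.e. the rough 966 W above
```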










----------



## KingEngineRevUp

changboy said:


> I put my water block today and now my system running but i cant put load on my gpu coz temp raise at 100c (hot spot).
> My 2mm thermal pad are to thick, i will need drop my liquid tomorrow and do it again, i think i will just use the thermal pad came with the block. they are not 1.8mm maybe 1.4mm. Even i push on my thermal pads they are to thick. Even i order 1.5mm i not sure it will be good.
> I hate working like this for nothing at the end, but it will work the next time lol.


I keep telling people this... Just use the pads that came with the block.... Assuming they sent you the right ones.


----------



## sew333

Zero989 said:


> Didn't really affect my fps at all negative or positive.
> 
> Also, give me your memory timings, including proof of Gear 1.


Yes, I have Gear 1 set in the BIOS. Will CPU-Z show the timings?


----------



## Zero989

sew333 said:


> Yes i have GEAR 1 in bios. For timings cpuz will show it?











*Official* Intel DDR5 OC and 24/7 daily Memory Stability... (www.overclock.net)





Download both and show me both.


----------



## sew333

Zero989 said:


> *Official* Intel DDR5 OC and 24/7 daily Memory Stability...
> 
> 
> amazon had it in stock afaik 309usd now. just not sure which batch. Only one I can find is G.Skill Trident Z5 RGB Series (Intel XMP) 32GB (2 x 16GB) 288-Pin SDRAM DDR5 6600 CL34-40-40-105 1.40V Dual Channel Desktop Memory F5-6600J3440G16GA2-TZ5RK (Matte Black) at Amazon.com Part# doesn't...
> 
> 
> 
> 
> www.overclock.net
> 
> 
> 
> 
> 
> Download both and show me both.


I will post later. One last thing: if I go down to 1080p, the fps is still the same; it doesn't change. So what can it be? I tested changing resolution in Control and the fps scales fine there.


----------



## Zero989

sew333 said:


> I will post later. But last thing. If i go to 1080P fps still the same fps. Not changing. So what can it be? I test resolution change in Control and there is fine with fps.


It must be working and the 4090 is just too fast. Set RT to Psycho, or use DLDSR to do 4K.


----------



## bigfootnz

Gilgam3sh said:


> I'm waiting to get my PNY 4090 delivered, anyone here using it and have some feedback on it? thanks
> 
> 
> 
> 
> 
> 
> GeForce RTX 4090 24GB XLR8 VERTO EPIC-X RGB Triple Fan | pny.com
> 
> 
> The GeForce RTX 4090 24GB XLR8 GPU (powered by NVIDIA Ada Lovelace) features enhanced RT and Tensor Cores, 8K HDR, & the world's fastest G6X memory.
> 
> 
> 
> 
> www.pny.com


Have you received your PNY? At the moment it's the only card available to me. What about the BIOS power limit: is it 450W, and how far does it OC? Thanks.


----------



## newls1

KingEngineRevUp said:


> For the 30 series, for every 100W power drawn, 54W of that went to the die
> 
> For the 40 series, for every 100W drawn, 87W of that goes to the die
> 
> Therefore a drawing about 162W on the 30 series produced about the same heat on the die as a 40 series at 100W.
> 
> A 4090 running at 450W is like a 3090 running at 725W.
> 
> A 4090 running at 600W is like a 3090 running at 966W.
> 
> Those are rough numbers.
> 
> View attachment 2582660


Thank you for the reply; that makes a lot of sense. But what are you trying to say when it comes to having a waterblock on the GPU?


----------



## PhuCCo

changboy said:


> I put my water block today and now my system running but i cant put load on my gpu coz temp raise at 100c (hot spot).
> My 2mm thermal pad are to thick, i will need drop my liquid tomorrow and do it again, i think i will just use the thermal pad came with the block. they are not 1.8mm maybe 1.4mm. Even i push on my thermal pads they are to thick. Even i order 1.5mm i not sure it will be good.
> I hate working like this for nothing at the end, but it will work the next time lol.


Use the included thermal pads as they are 1.5-1.8mm thick. Make sure you don't put thermal pads anywhere else other than what the diagram shows on the store page for the block, otherwise you'll warp your pcb and not make proper contact.


----------



## PhuCCo

So I installed a Bykski block to a Gaming OC and I'm getting a 20C delta between gpu and water at over 500W!!!! This thing is insane for the price.


----------



## AdamK47

Sheyster said:


> Good chance I will "down grade" to the ASUS 42" PG42UQ OLED (LG C2 panel). The 48" CX I'm using now is a little too big, but the ASUS is just right, not to mention a slight bump up to 138 Hz.


I went larger. Bought a 77" G1 with the EVO panel. Similar panel as what the C series gets this year. Resists burn in. Also have ABSL disabled through the service menu.

It helps that I have an entire room in my new house dedicated to PC Gaming.


----------



## newls1

PhuCCo said:


> So I installed a Bykski block to a Gaming OC and I'm getting a 20C delta between gpu and water at over 500W!!!! This thing is insane for the price.


any pics of the install?


----------



## Sheyster

AdamK47 said:


> I went larger. Bought a 77" G1 with the EVO panel. Similar panel as what the C series gets this year. Resists burn in. Also have ABSL disabled through the service menu.
> 
> It helps that I have an entire room in my new house dedicated to PC Gaming.


I have a 77" C2 as well (expand my sig). The problem with disabling ASBL in the service menu (I've done it on both the CX and C2) is that it does not fully disable ABL, it still kicks in but is less aggressive than before. The ASUS completely disables ABL from what I've heard. From a purely PC monitor point of view, the ASUS is purpose built for it, hence better.


----------



## KingEngineRevUp

newls1 said:


> thank you for the reply and that makes a lot of sense, but what are you trying to say when this comes to having a waterblock on the gpu?


Assuming the water blocks are designed the same, you can expect the delta between core temperature and water temperature to be about 60% greater.


----------



## EarlZ

cheddardonkey said:


> Anyone have an aorus master bios they are willing to extract and share?


Do you still need this? Drop me a DM!


----------



## Panchovix

NVIDIA inspector got a new small update, they added ReBAR as an option


















Release 2.4.0.2 · Orbmu2k/nvidiaProfileInspector


rBAR options added #75 #103




github.com


----------



## changboy

PhuCCo said:


> Use the included thermal pads as they are 1.5-1.8mm thick. Make sure you don't put thermal pads anywhere else other than what the diagram shows on the store page for the block, otherwise you'll warp your pcb and not make proper contact.


Yeah, I will just use those thermal pads, and yes, I followed the diagram on the store page. I just added some thermal pads under the backplate to help. I will post my results once I have rebuilt my GPU water block.

Thank you


----------



## N19htmare666

EarlZ said:


> Do you still need this? Drop me a DM!


It would be great if you could upload it here. The extension will need to be changed to .pdf for the upload to work.


----------



## BigMack70

OK, a little over a day with no black screen crashes... I _think_ I solved the problem by doing a fresh windows install, disabling my integrated graphics, and making sure I only ever have one of my two screens plugged in at the same time.

Don't think it's a hardware problem, I think Nvidia has some kind of driver issue and I think it is likely related to having multiple monitors.


----------



## J7SC

KingEngineRevUp said:


> Assuming the water blocks are designed the same, you can expect the Delta temperatures of core versus water temperature to be about 60% greater


...depends on the rest of the loop as well, at least to a certain extent.


----------



## KingEngineRevUp

It does assume all other conditions are the same; you're comparing two different dies, and one generates 60% more heat.

If we're comparing different loops, then it's just not an apples-to-apples comparison anymore.


----------



## EarlZ

N19htmare666 said:


> Would be great if you could upload here. The extension will need to be updated to pdf for the upload to work


I am unsure what app to use to extract a copy of the bios as GPU-Z tells me that BIOS reading is not supported on this device.


----------



## PhuCCo

I just discovered something very strange.

I've been switching between a blocked Gaming OC and a blocked FE card for testing. Whenever I remove the FE and put the Gaming OC in the system, the Gaming OC will read 100W higher power draw than it is actually pulling.. so my scores and readings are all skewed by 100 extra watts and the card thinks it is slamming the power limit. Or it's actually pulling the extra power.. I did not measure from the wall. 

I've been able to correct this by resetting the PC and remaking my Afterburner profile. Then everything is perfectly fine.

So disregard my post on getting a 20C delta at over 500W, as it wasn't accurate.

Has anyone else run into this?


----------



## GQNerd

I'm usually at 4k, but wanted to compare my #s with the 1440p posts.. 

This is with my non-benching daily settings @ +120 core (3030mhz) and +1500 mem

rest of the system:
[email protected]/4.6E/5.0Ring, 7200 DDR5


----------



## GQNerd

EarlZ said:


> I am unsure what app to use to extract a copy of the bios as GPU-Z tells me that BIOS reading is not supported on this device.
> 
> View attachment 2582731



Use nvflash:

nvflash64 --save filename.rom


----------



## EarlZ

Miguelios said:


> Use Nvflash
> 
> nvflash64 --save *Filename.rom*


I've attached it and renamed it to PDF. There is also an update on Gigabyte's website, but it only says it improves compatibility, and I'm wondering if it's worth updating to that.


----------



## sew333

Miguelios said:


> I'm usually at 4k, but wanted to compare my #s with the 1440p posts..
> 
> This is with my non-benching daily settings @ +120 core (3030mhz) and +1500 mem
> 
> rest of the system:
> [email protected]/4.6E/5.0Ring, 7200 DDR5
> 
> View attachment 2582732


So why does someone here get 109 fps average on a 12900K and RTX 4090 with DLSS Ultra Performance?


----------



## Sheyster

EarlZ said:


> I've attached it an renamed to PDF. There is also an update on gigabyte's website but it only says improves compatibility and I am wondering if its worth updating to that?


All of the AIBs seem to have updated their BIOS. I installed the updated Gigabyte Gaming OC version (F2), didn't really notice any difference. This said, since the update came so soon after launch, I think it's not a bad idea to update. If you do, please post the newer BIOS here as well.


----------



## GQNerd

EarlZ said:


> I've attached it an renamed to PDF. There is also an update on gigabyte's website but it only says improves compatibility and I am wondering if its worth updating to that?


I wasn't looking for the vbios, just trying to help you out.

I would recommend trying the update and if you don't like it, you can roll back to your saved version


----------



## EarlZ

Sheyster said:


> All of the AIBs seem to have updated their BIOS. I installed the updated Gigabyte Gaming OC version (F2), didn't really notice any difference. This said, since the update came so soon after launch, I think it's not a bad idea to update. If you do, please post the newer BIOS here as well.


The F2 bios is on Gigabyte's website.



Miguelios said:


> I wasn't looking for the vbios, just trying to help you out.
> 
> I would recommend trying the update and if you don't like it, you can roll back to your saved version


someone else was asking for a bios image.


----------



## GQNerd

sew333 said:


> So why here someone have 109fps average on 12900K and rtx 4090? Dlss ultra performance


Not sure what you mean? I was just running the benchmark with the same settings as everyone else was earlier in the thread.. 

As for the video you posted, no idea what his bottleneck is. Could be anything, or could be a few things


----------



## KingEngineRevUp

PhuCCo said:


> I just discovered something very strange.
> 
> I've been switching between a blocked Gaming OC and a blocked FE card for testing. Whenever I remove the FE and put the Gaming OC in the system, the Gaming OC will read 100W higher power draw than it is actually pulling.. so my scores and readings are all skewed by 100 extra watts and the card thinks it is slamming the power limit. Or it's actually pulling the extra power.. I did not measure from the wall.
> 
> I've been able to correct this by resetting the PC and remaking my Afterburner profile. Then everything is perfectly fine.
> 
> So disregard my post on getting a 20C delta at over 500W, as it wasn't accurate.
> 
> Has anyone else ran into this?


So what are your temperatures?


----------



## Arizor

sew333 said:


> So why here someone have 109fps average on 12900K and rtx 4090? Dlss ultra performance


I get 109 too at 1440p, completely CPU limited (5900X); my GPU fans don't even bother starting up and usage hovers between 50-68%. This card is such a monster that it demands 4K or it just doesn't bother.


----------



## J7SC

Arizor said:


> I get 109 too at 1440p, completely CPU limited (5900x), my GPU fans don't even bother starting up and usage hovers between 50-68% . This card is such a monster, demands 4K or it just doesn't even bother.


...similar experience with FS2020 (just played a bit more). With 4K HDR ultra everything / DLSS / Quality / Frame Insertion, it is unbelievable since they released a major update on November 11th...touching 120 fps max several times...I don't even overclock the card at all when playing that.


----------



## PhuCCo

KingEngineRevUp said:


> So what are your temperatures?


20-22C gpu to water delta at 475W load. 

Picture is me sitting in the menu of MW2 with most settings cranked and at 200% resolution of 1440p.


----------



## Nizzen

PhuCCo said:


> So I installed a Bykski block to a Gaming OC and I'm getting a 20C delta between gpu and water at over 500W!!!! This thing is insane for the price.


Delta for an air-cooled card is about 25 to 30C. This is even more insane, because it's free.


----------



## PhuCCo

Nizzen said:


> Delta for aircooled card is about 25 to 30c  This is even more insane, because it's free


Yes the stock coolers are extremely impressive to me after seeing how little water will do


----------



## sew333

So I must buy a 4K monitor xd


----------



## sew333

Arizor said:


> I get 109 too at 1440p, completely CPU limited (5900x), my GPU fans don't even bother starting up and usage hovers between 50-68% . This card is such a monster, demands 4K or it just doesn't even bother.


Can you run at 1440p with DLSS Ultra Performance? I don't get any fps difference between Quality and Ultra Performance, on a 12900K, DDR4-3600, and an RTX 4090 of course.


----------



## SilenMar

The 4090 sucks even with 24GB of VRAM.
It barely runs Control at 8K. Cyberpunk crashes at 8K. Ghostbusters is just 70 fps at 4K. I thought I could at least take some juicy screenshots from these games.


----------



## th3illusiveman

Someone's 4090 FE adapter burned... Nvidia needs to wake the fk up and address this now. It's embarrassing that they just bury their heads in the sand.


__
https://www.reddit.com/r/nvidia/comments/yu5gm5


----------



## th3illusiveman

sew333 said:


> Can you run on 1440P with dlss ultra performance? I dont have any fps differenes between QUALITY and UP. On 12900K,DDR 3600,rtx 4090 ofc


134avg/64min/190max (High crowd density) - (exact same FPS at 4k with DLSS ultra perf. in this test)
142avg/73min/190max (Low crowd density)
1440p/ Ultra RT preset/ DLSS Ultra Perf/ 5800X3D/4090FE stock - cpu bottleneck


----------



## mirkendargen

Nizzen said:


> Delta for aircooled card is about 25 to 30c  This is even more insane, because it's free


Ehhhhh more like 40C. I don't think anyone's getting a 50C core with a 20C ambient even with the fans maxed and the case open. And that's loud and annoying lol.


----------



## sew333

th3illusiveman said:


> 134avg/64min/190max (High crowd density) - (exact same FPS at 4k with DLSS ultra perf. in this test)
> 142avg/73min/190max (Low crowd density)
> 1440p/ Ultra RT preset/ DLSS Ultra Perf/ 5800X3D/4090FE stock - cpu bottleneck


nice


----------



## Arizor

sew333 said:


> Can you run on 1440P with dlss ultra performance? I dont have any fps differenes between QUALITY and UP. On 12900K,DDR 3600,rtx 4090 ofc


Yep; the GPU barely goes above 45% usage, and I get lower fps at Ultra Performance.

This is understandable, since DLSS is effectively rendering at a much lower resolution and upscaling.

So the more aggressively you engage DLSS, the more CPU-bound you can be. Ultra Performance at 1440p is rendering at some horrendously low resolution, probably below 540p, massively limiting your GPU.

Meanwhile at 4K the GPU is sufficiently engaged to equalise CPUs (outside of a few frames, and mins). This is a 4K GPU, unless it's for competitive gaming where you want massively high frames, in which case you'll need to pair it with a brutally fast CPU as well, like the new Intel or AMD (or grab a 5800X3D as the good budget option).
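To put numbers on that "horrendously low resolution" point, here is a sketch using the commonly published DLSS 2.x per-axis scale factors; those ratios are an assumption on my part, not something from this thread:

```python
# Approximate internal render resolution for each DLSS mode.
# Per-axis scale factors are the widely reported DLSS 2.x defaults;
# treat them as an assumption rather than an official specification.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 1 / 2,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(width: int, height: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before DLSS upscales to width x height."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

# A 1440p output with Ultra Performance renders near 853x480, which is why
# the CPU, not the 4090, ends up setting the framerate at those settings.
print(internal_resolution(2560, 1440, "Ultra Performance"))
```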


----------



## sew333

Arizor said:


> Yep - GPU barely goes above 45% usage, get a lower FPS at Ultra Performance.
> 
> This is understandable since DLSS is effectively running at a much lower resolution and upscaling.
> 
> So the more you engage DLSS, the more CPU-bound you can be. Ultra Performance at 1440P is running at some horrendously low resolution, probably below 540p, massively limiting your GPU.
> 
> Meanwhile at 4K the GPU is sufficiently engaged to equalise CPUs (outside of a few frames, and mins). This is a 4K GPU, unless it's for competitive gaming where you want massively high frames, in which case you'll need to pair it with a brutally fast CPU as well, like the new Intel or AMD (or grab a 5800X3D as the good budget option).


So if I am CPU limited on a 12900K with DLSS Ultra Performance, is the 12900K too slow?

Here is an example with a 10900K. The guy gets the same fps at 1440p DLSS Quality and 1080p DLSS Quality:


----------



## Arizor

sew333 said:


> so if i am cpu limited on 12900k on dlss ultra perf , 12900k is too slow?
> 
> Here is example with 10900K. Guy have the same fps on 1440P DLSS QUALITY and 1080P DLSS QUALITY:


Yes. At 1440p, Ultra Performance is really rendering at something like 480p, which puts the vast majority of the work on the CPU to supply frames; a GPU like this sits yawning at such resolutions, hardly doing any work.

Honestly, this GPU really needs 4K to flex its muscles in the vast majority of use cases.

Edit: run CP2077 on Ultra Raytracing preset with DLSS off at 4K and let me know what frames you get. I get 48 when I max OC, which is all the work on the GPU, barely any CPU limit.


----------



## th3illusiveman

sew333 said:


> so if i am cpu limited on 12900k on dlss ultra perf , 12900k is too slow?
> 
> Here is example with 10900K. Guy have the same fps on 1440P DLSS QUALITY and 1080P DLSS QUALITY:


You wouldn't play CP2077 at 1440p with DLSS Ultra Performance anyway, so you're worrying about nothing. Turn up your settings and your GPU should be at 95%+ usage, and there will be no difference between your CPU and others.


----------



## sew333

Arizor said:


> Yes at 1440P using Ultra Perf, which is really like 480P resolution, is putting the vast majority of the work on the cpu to supply frames, a GPU like this is sitting yawning at such resolutions and hardly doing any work.
> 
> Honestly this GPU really needs 4K to flex its muscles in the vast majority of use cases.
> 
> Edit: run CP2077 on Ultra Raytracing preset with DLSS off at 4K and let me know what frames you get. I get 48 when I max OC, which is all the work on the GPU, barely any CPU limit.


But someone here gets more fps on DLSS Ultra Performance. Is this because he has a 13900K?  His GPU usage is lower but he gets more fps. Look:


----------



## Arizor

sew333 said:


> But someone here have more fps on DLSS ULTRA PERFORMANCE . Is this because he have 13900K Hes gpu usage is lower but have more fps. Look:


yes stronger CPU is much more able to handle the work at low resolutions. As @th3illusiveman says, you don’t buy this GPU to run at that kind of level so don’t worry about it.

On this GPU you want 1440P native really as a minimum, otherwise you’re wasting it a bit (outside of competitive gaming).


----------



## sew333

Arizor said:


> yes stronger CPU is much more able to handle the work at low resolutions. As @th3illusiveman says, you don’t buy this GPU to run at that kind of level so don’t worry about it.
> 
> On this GPU you want 1440P native really as a minimum, otherwise you’re wasting it a bit (outside of competitive gaming).







OK, thanks for the explanation. I'm grateful.


----------



## Nizzen

mirkendargen said:


> Ehhhhh more like 40C. I don't think anyone's getting a 50C core with a 20C ambient even with the fans maxed and the case open. And that's loud and annoying lol.


I know, because I have both the TUF and the Strix.


----------



## long2905

Sheyster said:


> Several reasons:
> 
> The ASUS is brighter, it has a heat sink and ability to fully disable ABSL.
> Refresh rate is slightly higher, 138 Hz.
> Input lag is lower than the C2.
> Has more PC friendly options such as a DP input.
> The main gripe about the ASUS is the semi-matte coating they use on the panel. It doesn't quite "pop" as much as the LG C2. This can be compensated for somewhat with settings.


It's more expensive though. I wonder if anyone has tried to overclock these TVs: the 48CX, 48C1, and now the 42C2.
For myself, I might use the TV functions directly for YouTube and Netflix screen casting if I'm a bit further away, so it makes more sense to me.


----------



## jootn2kx




----------



## doom3crazy

Sheyster said:


> All of the AIBs seem to have updated their BIOS. I installed the updated Gigabyte Gaming OC version (F2), didn't really notice any difference. This said, since the update came so soon after launch, I think it's not a bad idea to update. If you do, please post the newer BIOS here as well.


Has the MSI Gaming Trio 4090 had a BIOS update? Where do you find those?


----------



## Sheyster

doom3crazy said:


> has the msi gaming trio 4090 had a bios update? where do you find those?


I believe you have to use the MSI Center utility to update the vBIOS, unless someone shares the vBIOS file here (or elsewhere), then you can use NVFLash.


----------



## Benni231990

The problem with this is that if you have the rare 600W Suprim Liquid BIOS and you run the update, you get the 530W BIOS.


----------



## mickyc357

Anyone tried the galax sg cards? They're the cheapest here in Aus and according to the website have a 22 phase design but the first page of this thread says 18.


----------



## doom3crazy

Sheyster said:


> I believe you have to use the MSI Center utility to update the vBIOS, unless someone shares the vBIOS file here (or elsewhere), then you can use NVFLash.


Oh dang, I don't have MSI Center. I'm just wondering: if there is a new BIOS, does it bump the power limit up from 480W to like 520-530W?


----------



## Benni231990

No, only the Suprim X / Liquid X cards have the 520/530W power target.

So to get it, you need to flash the Suprim X BIOS, then go to MSI Center and run the update.


----------



## sugi0lover

Zero989 said:


> Yea try the DLSS file. I just checked my original version and it was from June.
> 
> View attachment 2582658


Here is my result.

[PC Setup]
○ CPU : 13900K / all cores P Cores 6.0Ghz / E Cores 4.7Ghz / Cache 5.0Ghz 
○ Ram OC : 8400-32-47-46-32-480-2T
○ VGA : RTX 4090
○ MB : Z790 Apex (Bios 0801)


----------



## sew333

sugi0lover said:


> Here is my result.
> 
> [PC Setup]
> ○ CPU : 13900K / all cores P Cores 6.0Ghz / E Cores 4.7Ghz / Cache 5.0Ghz
> ○ Ram OC : 8400-32-47-46-32-480-2T
> ○ VGA : RTX 4090
> ○ MB : Z790 Apex (Bios 0801)
> View attachment 2582831


How is it possible that someone with a 13900K averages 150 fps with DLSS Ultra Performance, while you average 176 fps on DLSS Auto?


----------



## Zero989

sugi0lover said:


> Here is my result.
> 
> [PC Setup]
> ○ CPU : 13900K / all cores P Cores 6.0Ghz / E Cores 4.7Ghz / Cache 5.0Ghz
> ○ Ram OC : 8400-32-47-46-32-480-2T
> ○ VGA : RTX 4090
> ○ MB : Z790 Apex (Bios 0801)
> View attachment 2582831


Just like SoTTR it does scale with bandwidth, as core usage goes up to 70+% of 16 cores.


----------



## sew333

Zero989 said:


> Just like SoTTR it does scale with bandwidth, as core usage goes up to 70+% of 16 cores.


What do you mean by bandwidth?


----------



## Zero989

sew333 said:


> what you mean bandwith?


DDR memory bandwidth. His should be 135GB+/sec read.


----------



## RaMsiTo

doom3crazy said:


> has the msi gaming trio 4090 had a bios update? where do you find those?








MSI Tool Summary - Graphics Card


RTX 4090 VBIOS Update Tool(Resolve 93% Power Limit Problem with Afterburner) Read >>> Instruction <<< before VBIOS Update




sites.google.com


----------



## sew333

Zero989 said:


> Just like SoTTR it does scale with bandwidth, as core usage goes up to 70+% of 16 cores.


My fps doesn't change whether I use DLSS Quality or Ultra Performance at 1440p. Only GPU usage is lower; the fps is the same. Is this because I have DDR4-3600?

But when I had a 3090 Ti on the same CPU (12900K), switching DLSS modes at 1440p did make a difference, and I got better fps on DLSS Balanced. On the 4090 it doesn't.
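That pattern (fps staying flat across DLSS modes while GPU usage drops) is the classic CPU-bound signature: DLSS lowers the render resolution so the GPU could go faster, but the CPU can't feed it any more frames. A toy model with made-up numbers, just for illustration:

```python
# Toy model of DLSS scaling under a CPU bottleneck (my simplification,
# not measured data): displayed fps is capped by whichever side is slower.
def displayed_fps(gpu_fps_at_render_res: float, cpu_fps: float) -> float:
    return min(gpu_fps_at_render_res, cpu_fps)

cpu_fps = 90.0  # what the CPU/RAM can feed, independent of resolution

# DLSS Quality: GPU could do 120 fps, but CPU caps it.
print(displayed_fps(120.0, cpu_fps))  # 90.0
# Ultra Performance: GPU could do 200 fps, CPU still caps it at the same value.
print(displayed_fps(200.0, cpu_fps))  # 90.0
```

On a weaker GPU (the 3090 Ti case), the GPU side of the `min()` is low enough that changing DLSS modes still moves the result, which matches what sew333 saw on the older card.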


----------



## Panchovix

RaMsiTo said:


> MSI Tool Summary - Graphics Card
> 
> 
> RTX 4090 VBIOS Update Tool(Resolve 93% Power Limit Problem with Afterburner) Read >>> Instruction <<< before VBIOS Update
> 
> 
> 
> 
> sites.google.com
> 
> 
> 
> 
> 
> View attachment 2582833


Wait there will be a RTX 4090 Ventus? Damn can't imagine that lol


----------



## sew333

jootn2kx said:


> View attachment 2582808


Nice, what CPU?


----------



## Azazil1190

Has anyone tried repasting the TUF?
I think my hotspot temp has increased, about 8-10C higher than before.
I should mention that I flashed the Strix OC BIOS but set the PL to 90%.

This is after a run of Superposition 4K.
Ambient temp is about 23C.









And this is after superposition on stock bios before two weeks









I'm going to reflash the stock BIOS to test whether that's the reason I have higher hotspot temps.


----------



## Zero989

sew333 said:


> My fps doesnt change whetever i use DLSS QUALITY or ULTRA PERFORMANE on 1440P. Only gpu usage is lower. But fps is the same. Is this because i have DDR 3600?
> 
> But when i had 3090 ti in the same cpu ( 12900K ) fps between dlss modes was working on 1440P and i had better fps on dlss balanced. But on 4090 it not.


Perhaps set pre-rendered frames to 3 or more in the Nvidia control panel. Your GPU is getting hard handicapped and is literally waiting around.

I thought DDR4 would be really good in Cyberpunk, but the fact is it uses up to 11.5 cores, so DDR5 it is...


----------



## sew333

Zero989 said:


> Perhaps set pre rendered frames in Nvidia to 3 or more. Your gpu is getting hard handicapped and literally waiting around.
> 
> I thought ddr4 would be really good in cyberpunk but the fact is it uses up to 11.5 cores so ddr5 it is...


But with DLSS Quality at 1440p I get normal scores and fps. Why?


----------



## Zero989

sew333 said:


> But with DLSS QUALITY and 1440P i have normal scores and fps. Why ?


Are you saying your rtx 3090 ti had better fps than 4090 at same settings?


----------



## sew333

Zero989 said:


> Are you saying your rtx 3090 ti had better fps than 4090 at same settings?


No, the 4090 definitely gets more fps. I'm asking why I get good fps on the 4090 with DLSS Quality at 1440p, but see no difference with DLSS Ultra Performance.


----------



## Zero989

sew333 said:


> No . 4090 have more fps definitely. I am asking that why on 4090 i have good fps on DLSS QUALITY and 1440P, when on DLSS ULTRA PERF i dont see any difference,


You answered your own question with DDR4-3600. If you tighten your RAM timings and overclock your RAM, you can increase your fps by a lot. That's why I asked to see them.


----------



## jootn2kx

sew333 said:


> nice what cpu ?


A 5800X3D, which is not great for this game though; it doesn't really put the extra L3 cache to good use.


----------



## sew333

Zero989 said:


> You answered your own question with ddr 3600. If you tighten your ram timings, and overclock your ram, you can increase your fps by a lot. That's why I asked to see them.


Ok, thx. I'll post them later.


----------



## newls1

Is this doing ok for mild OC? 









I scored 28 140 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## sew333

newls1 said:


> Is this doing ok for mild OC?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 140 in Port Royal
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


I get 25,000 on a stock Palit 4090.


----------



## alasdairvfr

sew333 said:


> My fps doesnt change whetever i use DLSS QUALITY or ULTRA PERFORMANE on 1440P. Only gpu usage is lower. But fps is the same. Is this because i have DDR 3600?
> 
> But when i had 3090 ti in the same cpu ( 12900K ) fps between dlss modes was working on 1440P and i had better fps on dlss balanced. But on 4090 it not.


Do you have gsync/adaptive sync on by chance?


----------



## sew333

alasdairvfr said:


> Do you have gsync/adaptive sync on by chance?


yes gsync on


----------



## alasdairvfr

sew333 said:


> yes gsync on


try turning it off then run the various DLSS benchmarks again.


----------



## sew333

alasdairvfr said:


> try turning it off then run the various DLSS benchmarks again.


ok i will try. thx


----------



## yzonker

Zero989 said:


> Perhaps set pre rendered frames in Nvidia to 3 or more. Your gpu is getting hard handicapped and literally waiting around.
> 
> I thought ddr4 would be really good in cyberpunk but the fact is it uses up to 11.5 cores so ddr5 it is...


I suspect if you had a DDR4 setup as maxed out as @sugi0lover has for DDR5, they would be very close.


----------



## Panchovix

newls1 said:


> Is this doing ok for mild OC?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 140 in Port Royal
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


It's nice if your card is able to overclock more.

With 28600+ you would enter the top 100 on Hall of Fame (PR, 1 GPU) 3DMark Port Royal Hall of Fame


----------



## newls1

Panchovix said:


> It's nice if your card is able to overclock more.
> 
> With 28600+ you would enter the top 100 on Hall of Fame (PR, 1 GPU) 3DMark Port Royal Hall of Fame


I'm just starting to OC this card. The Phanteks waterblock comes in a few days and I'll push it more then. My max temp on this air cooler is 60C so far. Once I can stabilize max temps in the 40s, I'm sure I'll be able to hold a bin or two higher clock speeds...


----------



## Nizzen

sew333 said:


> how is possible that someonne with 13900k hes average fps is 150fps with dlss ultra performance , and you have 176fps average on dlss auto


The power of overclocking


----------



## dante`afk

ESRCJ said:


> For those with a waterblock, how is your GPU die-over-fluid delta?
> 
> Also, based on some earlier comments, it appears the early versions of the EK block had a machining error. I pre-ordered mine and it arrived a few weeks ago. I just got a 4090 FE and haven't installed the WB yet. Have they fixed the machining error on their newer 4090 FE blocks?


The later-shipped batches should be good. The people who ordered early got an extra paper with different pad sizes; mine didn't have that.

water to core delta:

stock: 9c
OC heavy load 580w: 17c



newls1 said:


> i dont understand what oyu mean?


That the card produces lots of heat. Running the same wattage on a 4090 as on a 3090 might still be harder on your cooling than it was on the previous gen.



AdamK47 said:


> I went larger. Bought a 77" G1 with the EVO panel. Similar panel as what the C series gets this year. Resists burn in. Also have ABSL disabled through the service menu.
> 
> It helps that I have an entire room in my new house dedicated to PC Gaming.
> View attachment 2582700
> 
> View attachment 2582701


nice setup, dude is living in the future


----------



## Krzych04650

bigfootnz said:


> Can you please let me know what program you are using for overlay info, as it looks great. Thanks


The program is RTSS, through overlay editor. It allows you to build your own overlay and combine sensors from different software like HWiNFO, Afterburner, etc. Amazing tools all of them.

I can even give you the file for this particular arrangement, although it will likely need tweaking to match your system and it is generally very convoluted because of all the different sizes and alignments. 






4090 effective.rar







drive.google.com


----------



## KingEngineRevUp

Nizzen said:


> Delta for aircooled card is about 25 to 30c  This is even more insane, because it's free


Well, "free" as in... many cards don't fit in many cases, and you put up with a loud jet-engine sound when you're at 100% fan for benchmarking.

It's still very impressive engineering.


----------



## keikei

Interesting to see numbers on TW3, now that CDPR announced the next-gen release next month.


----------



## alasdairvfr

New driver plus some work on my PBO/memory that was long overdue:

I scored 26 989 in Fire Strike Ultra










I scored 11 020 in Speed Way


----------



## alasdairvfr

keikei said:


> Interesting to see numbers on TW3, now that CDPR announced the next-gen release next month.


Very interested in this as well. Game runs like a dream on this card at 4k - curious to see what the remaster brings.


----------



## keikei

alasdairvfr said:


> Very interested in this as well. Game runs like a dream on this card at 4k - curious to see what the remaster brings.


The big enhancement is RT, but I suspect better textures & such as well. It should be glorious with a 4090.


----------



## KingEngineRevUp

keikei said:


> The big enhancement is RT, but I suspect better textures & such as well. It should be glorious with a 4090.


They're actually working with the HD texture modder guy. So I imagine they might even rework stuff he's already done.


----------



## ESRCJ

dante`afk said:


> the later shipped out charges should be good. the people who ordered early got an extra paper with different pad sizes. mine didnt have that.
> 
> water to core delta:
> 
> stock: 9c
> OC heavy load 580w: 17c


Thanks for providing your results. My block shipped on October 17th, so mine probably has the error.


----------



## Gadfly

Does any model card have better memory than the others? Or is it all pretty much the same and just luck of the draw?

Trying to decide which card to buy.


----------



## alasdairvfr

Anyone tried using Blender benchmark??

Edit: this thing can take quite a bit of an OC, updated results


----------



## mirkendargen

Nizzen said:


> I know, because I have Tuf and strix


If you paid for a Strix, you already paid more extra than a block costs


----------



## J7SC

I dropped ambient temps to ~ 20 C (takes a while, given the building type) this morning and did a few more runs with the new water-cooling block...in addition to yesterday's results (ambient ~ 24 C), I managed to break 19K in TimeSpyEx on the 5950X in stock trim and stock OS. I subbed that one but haven't yet subbed the > 29K PortRoyal as I thought there might have been one section with artefacts (not the usual kind, but potentially weirder lights, not sure). I'll take another run at that later in the week also because it requires different resizable-BAR settings. At 20 C, shouldn't be a problem though, not least as the 5950X has a bit more left in the tank. Finally, at the bottom are temps of the block at a medium-OC Superposition at 520W+...I think the water-block install looks decent re. delta at that wattage - if anything, VRAM temps are a bit low ?









Per earlier post, I am thrilled with the 4090 in the newly patched 4K ultra FS2020 w/frame insertion etc. CP 2077 is already a joy, but if their DLSS3 / frame insertion gets close to the gains with FS2020...Anyway, I had decided to hold off on upgrading the mobo / CPU / RAM until I saw some DLSS3 / frame insertion in games I mostly play...latest game / sim results suggest waiting a while longer


----------



## Panchovix

Gadfly said:


> Does any model card have better memory than the others? Or is it all pretty much the same and just luck of the draw?
> 
> Trying to decide which card to buy.


Gigabyte Gaming OC seems to have one of the best mems, not the best best, but above average. (Worth it for the price)

I have seen some Strix cards doing +2000 Mhz, or some MSI Liquid X ones, but yeah they are a good amount more expensive than the Gaming OC, which you could waterblock instead of paying an extra on Strix/Liquid X.


----------



## yzonker

J7SC said:


> I dropped ambient temps to ~ 20 C (takes a while, given the building type) this morning and did a few more runs with the new water-cooling block...in addition to yesterday's results (ambient ~ 24 C), I managed to break 19K in TimeSpyEx on the 5950X in stock trim and stock OS. I subbed that one but haven't yet subbed the > 29K PortRoyal as I thought there might have been one section with artefacts (not the usual kind, but potentially weirder lights, not sure). I'll take another run at that later in the week also because it requires different resizable-BAR settings. At 20 C, shouldn't be a problem though, not least as the 5950X has a bit more left in the tank. Finally, at the bottom are temps of the block at a medium-OC Superposition at 520W+...I think the water-block install looks decent re. delta at that wattage - if anything, VRAM temps are a bit low ?
> View attachment 2582901
> 
> 
> Per earlier post, I am thrilled with the 4090 in the newly patched 4K ultra FS2020 w/frame insertion etc. CP 2077 is already a joy but if their DLSS3 / frame insertion gets close to the gains with FS2020...Anyway, I had decided to hold off re. upgrading the mobo / CPU / RAM off until I saw some DLSS3 / frame insertion in games I mostly play...latest game / sim results suggest to wait a while longer


Yea, that light flashing/flickering is VRAM corruption I think, but it doesn't seem to boost the score, at least not much. I've also seen little "black holes" in some of the lights, etc... I made a run yesterday that was so badly artifacted that you could hardly recognize which bench was running. LOL. 29800! Deleted it. Kinda wanted to keep it around just because it was so silly high.

I ran into an oddity with my 13900K system. I think the reason I couldn't quite beat my PR score was the way I had configured my CPU. Against general internet wisdom, PR on my system seems to not like HT off or a manual all-core OC. I did some test runs and saw a 50-100 pt loss compared to just running in normal boost mode with HT on (made several runs to confirm; it seemed consistent). Must be some BIOS gremlins for 13th gen on Z690. I don't think that was an issue on my 12900K.

My current 29k run was done with the CPU just running at stock.

That issue didn't seem to impact Speedway/TS/TSE though given my scores improved in all of them. Need to test though.

You or anyone else have any thoughts on this?


----------



## doom3crazy

RaMsiTo said:


> MSI Tool Summary - Graphics Card
> 
> 
> RTX 4090 VBIOS Update Tool(Resolve 93% Power Limit Problem with Afterburner) Read >>> Instruction <<< before VBIOS Update
> 
> 
> 
> 
> sites.google.com
> 
> 
> 
> 
> 
> View attachment 2582833


Interesting. I don't have that issue. My Afterburner goes to 106% on power.


----------



## yzonker

Oh and @J7SC, Speedway can usually tolerate 2-3 bins higher on the core clock than PR/TSE/etc... Looks like you ran them the same, based on the 3DMark reported clocks.


----------



## Panchovix

yzonker said:


> Oh and @J7SC, Speedway can usual tolerate 2-3 bins higher on the core clock than PR/TSE/etc... Looks like you ran them the same based on the 3DMark reported clocks.


Can confirm: about 30 MHz more on Speed Way, 15 MHz higher on TSE, and like -30 MHz in my case on PR.


----------



## J7SC

yzonker said:


> Yea, that light flashing/flickering is VRAM corruption I think, but it doesn't seem to boost the score, at least not much. I've also seen little "black holes" in some of the lights, etc... I made a run yesterday that was so badly artifacted that you couldn't hardly recognize which bench was running. LOL. 29800! Deleted it. Kinda wanted to keep it around just because it was so silly high.
> 
> I ran in to an oddity with my 13900k system. I think the reason that I couldn't quite beat my PR score was due to the way I had configured my CPU. Against general internet wisdom, PR on my system seems to not like HT off or a manual all core OC. I did some test runs and showed a 50-100pt loss compared to just running it in normal boost mode with HT on (made several runs to confirm, seemed consistent). Must be some bios gremlins or something for 13th gen on Z690. I don't think that was an issue on my 12900k.
> 
> My current 29k run was done with the CPU just running at stock.
> 
> That issue didn't seem to impact Speedway/TS/TSE though given my scores improved in all of them. Need to test though.
> 
> You or anyone else have any thoughts on this?





yzonker said:


> Oh and @J7SC, Speedway can usual tolerate 2-3 bins higher on the core clock than PR/TSE/etc... Looks like you ran them the same based on the 3DMark reported clocks.


...sorry, on the 13900K issue you mentioned, I have zero experience with Intel's P+E core LG1700s. In general though, Port Royal likes_ fast *and* tight_ system memory. I actually ran all benches with the full 16c / 32t on my 5950X which has a bit more room...


Spoiler














 I should probably turn off either HT or one chiplet, but for now, I was just testing out how stable the GPU water-cooling is. Thanks also for the tip on Speedway clocks, I'll try that when I get back to some benching (back to 24 C+ here now...)


----------



## yzonker

Panchovix said:


> Can confirm, about 30Mhz more on SpeedWay, 15Mhz higher on TSE, and like -30Mhz on my case on PR


Yea I think SW is light on the core, heavy on the mem. It gains score with core clock, but not much. You can see my progression in these runs. 









I scored 11 087 in Speed Way


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com













I scored 11 097 in Speed Way


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com













 I scored 11 125 in Speed Way


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





Maybe those runs show some consistency too in regards to not being artifacted.


----------



## alasdairvfr

I find PR is the one on my card that just cannot handle a big core clock, SW/FS can go way up, TSE is just 2 bins shy of SW tho


SpeedWay 








TSE








PR








FSUltra


----------



## sew333

Dudes, help. Near the hotel (CPU-intensive area), when driving a car, I drop from 90 fps to 75 fps. That's at 1440p, DLSS Quality, with GPU usage at 70-75%. I have a stock 12900K with DDR4-3600 Gear 1.

I compared with someone on a stock 13900K with DDR5-6000, and he gets between 105-120 fps. Is something broken in my system, or why do I get 30 fps less than him? Look at his gameplay at second 0:40:







I reinstalled the drivers a few times, btw. Thanks.


----------



## Nizzen

mirkendargen said:


> If you paid for a Strix, you already paid more extra than a block costs


Elmor tool + 1000w ready 🤓


----------



## newls1

J7SC said:


> I dropped ambient temps to ~ 20 C (takes a while, given the building type) this morning and did a few more runs with the new water-cooling block...in addition to yesterday's results (ambient ~ 24 C), I managed to break 19K in TimeSpyEx on the 5950X in stock trim and stock OS. I subbed that one but haven't yet subbed the > 29K PortRoyal as I thought there might have been one section with artefacts (not the usual kind, but potentially weirder lights, not sure). I'll take another run at that later in the week also because it requires different resizable-BAR settings. At 20 C, shouldn't be a problem though, not least as the 5950X has a bit more left in the tank. Finally, at the bottom are temps of the block at a medium-OC Superposition at 520W+...I think the water-block install looks decent re. delta at that wattage - if anything, VRAM temps are a bit low ?
> View attachment 2582901
> 
> 
> Per earlier post, I am thrilled with the 4090 in the newly patched 4K ultra FS2020 w/frame insertion etc. CP 2077 is already a joy but if their DLSS3 / frame insertion gets close to the gains with FS2020...Anyway, I had decided to hold off re. upgrading the mobo / CPU / RAM off until I saw some DLSS3 / frame insertion in games I mostly play...latest game / sim results suggest to wait a while longer


any pics of your block install?


----------



## changboy

I redid my GPU again today and now it's working fine. Just tried two runs of Superposition 4K Optimized; look at my temps:


----------



## lawson67

Gadfly said:


> Does any model card have better memory than the others? Or is it all pretty much the same and just luck of the draw?
> 
> Trying to decide which card to buy.


My Zotac AMP Extreme does +2000 MHz on memory, but anything higher than 3030 MHz on the core gives me artifacts.


----------



## sew333

lawson67 said:


> My Zotac Amp Extreme does +2000 mhz on memory but anything higher than 3030mhz on the core i get artifacts


Why overclock such a monster? Hobby?


----------



## sugi0lover

sew333 said:


> Dudes help. I am gettin near hotel <cpu intensive> when driving car 90fps to drops to 75fps. Its on 1440P,DLSS QUALITY, usage is 70-75% of gpu. I have 12900K stock,DDR4 3600mhz GEAR 1
> 
> I compared with someone and on 13900K stock and DDR5 6000mhz he have between 105-120fps. I have something broken on system or why i have 30 fps less than him. Look hes gameplay:second 0:40
> 
> 
> 
> 
> 
> 
> 
> I reinstalled drivers few times btw. Thanks


RAM OC matters more for gaming than CPU OC. I'd guess your 70-75% GPU usage and only 75 fps are caused by a CPU & RAM bottleneck, especially your DDR4-3600.
Try overclocking the RAM to 4000 MHz with some good sub-timings and see if it helps. Good luck~


----------



## sew333

sugi0lover said:


> Ram OC is more important for gaming than CPU OC. I guess that your GPU usage of 70~75% and only 75fps got caused by the bottleneck from CPU & Ram, especially your D4 3600Mhz.
> Try to overclock ram to 4000Mhz with some good sub timings and see if helps. Good luck~


OOOOOOOOOOOOOOOOOOOOOOOOOK THX hope it helps


----------



## J7SC

newls1 said:


> any pics of your block install?


...not really, my fingers were full of thermal putty  so no free / clean hand for the camera...these ones I already showed (disassembly, finished product before wiring and tubing clean-up) 



Spoiler


----------



## alasdairvfr

@J7SC how much gain are you getting from DLSS3 in MSFlight? I'm finding better than 100% performance improvement on an initial run, but I literally only had a few minutes just now to try it. Over 110 fps where I used to get 50-some. Finally it looks and runs great; no longer have to choose between looks and performance!


----------



## newls1

J7SC said:


> ...not really, my fingers were full of thermal putty  so no free / clean hand for the camera...these ones I already showed (disassembly, finished product before wiring and tubing clean-up)
> 
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2582936
> 
> 
> View attachment 2582937
> 
> 
> View attachment 2582938


Love it!


----------



## J7SC

alasdairvfr said:


> @J7SC how much gains you getting from DLSS3 in MSFlight? i am finding better than 100% performance improvement on initial run but literally only had a few mins just now to try it. Over 110fps where I used to get 50-some. Finally it looks and runs great, no longer have to choose between looks and performance!


Per earlier posts, very similar - 4K Ultra max detail DLSS 3 / Frame Insertion etc can actually hit the 4K 120 OLED limit - and that is without any overclock whatsoever


----------



## doom3crazy

RaMsiTo said:


> MSI Tool Summary - Graphics Card
> 
> 
> RTX 4090 VBIOS Update Tool(Resolve 93% Power Limit Problem with Afterburner) Read >>> Instruction <<< before VBIOS Update
> 
> 
> 
> 
> sites.google.com
> 
> 
> 
> 
> 
> View attachment 2582833


Do you by chance have the Suprim X air BIOS as a .rom instead of an .exe, so I could use NVFlash? Or can I run the .exe and still flash it to my card without problems?


----------



## KingEngineRevUp

Reporting in on my EKWB results. I have to go grab my son at daycare so will show more photos of the install process and what really annoyed me about the install.










At *520 W* of power draw the block shows a *22C delta*; the hotspot is *8-9C* over GPU temperature.

I used MX-6 thermal paste. There are also a bunch of bubbles still cycling through my loop, so it may get a little better. My guess is it won't, or the change will be negligible, like 21C.

I guess that's not too bad, but it's not great either. For reference, this would be *equivalent to running a 3090 at 837W* based on the graphs from Optimus's tweet.

Edit: And the max temperature on the *memory modules was 54C*... Please just use the stock pads and save yourselves some money...

Edit 2:

If these numbers are correct, the EKWB runs about 3C hotter than the prototype Optimus block.










Edit 3: Full gallery posted here









[Official] NVIDIA RTX 4090 Owner's Club


@J7SC how much gains you getting from DLSS3 in MSFlight? i am finding better than 100% performance improvement on initial run but literally only had a few mins just now to try it. Over 110fps where I used to get 50-some. Finally it looks and runs great, no longer have to choose between looks...




www.overclock.net
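One way to compare block results posted at different power draws is to normalize the water-to-core delta into a thermal resistance, R = ΔT / P. A quick sketch using numbers from this thread (treating the reported deltas and wattages as exact, which they aren't):

```python
# Block-to-coolant thermal resistance (delta-T per watt), the standard R = dT / P.
# Lower is better; it lets you compare deltas measured at different power draws.
def thermal_resistance(delta_c: float, power_w: float) -> float:
    return delta_c / power_w

# dante`afk's EK block: 17C water-to-core delta under a ~580 W OC load
print(round(thermal_resistance(17, 580), 4))  # 0.0293 C/W
# KingEngineRevUp's EK block: 22C delta at ~520 W
print(round(thermal_resistance(22, 520), 4))  # 0.0423 C/W
```

The spread between the two installs (same block family, different mounts/pads/flow) is larger than the quoted difference to the Optimus prototype, which suggests mounting and loop details matter at least as much as the block itself here.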


----------



## lucasmedia

alasdairvfr said:


> New driver plus some work on my PBO/memory that was long overdue:
> 
> I scored 26 989 in Fire Strike Ultra
> View attachment 2582889
> 
> 
> 
> I scored 11 020 in Speed Way
> View attachment 2582890


Do you mind sharing your 5950x PBO settings in BIOS, I haven't touched my 5950x in a while and would love to revisit.

Do you think our 4090 is CPU limited in 4k?


----------



## sew333

Update


----------



## yzonker

KingEngineRevUp said:


> Reporting in on my EKWB results. I have to go grab my son at daycare so will show more photos of the install process and what really annoyed me about the install.
> 
> View attachment 2582945
> 
> 
> At *520W *of power draw the block produces a temperature of *Delta 22C*, hotspot is *8-9C* over GPU temperatures.
> 
> I used MX-6 thermal paste. There's also a bunch of bubbles cycling through my loop still, so not sure if it'll get a little better. My guess is it's not or it'll be negligible like 21C.
> 
> I guess that's not too bad but it's not great either. For reference, this would be *equivalent to running a 3090 at 837W* based off of Optimus graphs from their tweet.
> 
> Edit: And the max temperature on the *memory modules was 54C*... Please just use the stock pads and save yourselves some money...
> 
> Edit 2:
> 
> If these numbers are correct, the EKWB runs about 3C hotter than the prototype Optimus block.
> 
> View attachment 2582946


I've said this before a long time ago: a 2nd D5 would probably bring that delta down more in line with some of the other numbers people have reported. It was good for about 2C when I added one.


----------



## motivman

KingEngineRevUp said:


> Reporting in on my EKWB results. I have to go grab my son at daycare so will show more photos of the install process and what really annoyed me about the install.
> 
> View attachment 2582945
> 
> 
> At *520W *of power draw the block produces a temperature of *Delta 22C*, hotspot is *8-9C* over GPU temperatures.
> 
> I used MX-6 thermal paste. There's also a bunch of bubbles cycling through my loop still, so not sure if it'll get a little better. My guess is it's not or it'll be negligible like 21C.
> 
> I guess that's not too bad but it's not great either. For reference, this would be *equivalent to running a 3090 at 837W* based off of Optimus graphs from their tweet.
> 
> Edit: And the max temperature on the *memory modules was 54C*... Please just use the stock pads and save yourselves some money...
> 
> Edit 2:
> 
> If these numbers are correct, the EKWB runs about 3C hotter than the prototype Optimus block.
> 
> View attachment 2582946


Any idea what your flow rate is?


----------



## KingEngineRevUp

yzonker said:


> I've said this before a long time ago, a 2nd D5 would probably bring that delta down more inline with some of the other numbers people have reported too. It was good for about 2C when I added one.


Yeah, I was lying down at night thinking things over, and I remembered you and I discussed putting another D5 in my case, which would definitely help because I have a stupid amount of 90-degree fittings. The money ultimately went into new fans though, which bring my water temperatures down 3-5C compared to the Corsair QL120s, which are horrible radiator fans BTW.

Another D5 is in the cards one day.



motivman said:


> Any idea what your flow rate is?


I do not have a flowrate meter in my loop unfortunately.


----------



## Antsu

yzonker said:


> I've said this before a long time ago, a 2nd D5 would probably bring that delta down more inline with some of the other numbers people have reported too. It was good for about 2C when I added one.


Yep, it's definitely diminishing returns at that point, but more flowrate does help. I run a very simple loop of just GPU block and 560 rad, and even then I instantly jump up a few degrees if I set both pumps to 50%.

I'd love to test what a third pump would do, but I already have some issues where the waterflow is so violent inside my reservoir that air bubbles get sucked back into the loop before they can get to the top and pop (turbulence? sorry for being such a smoothbrain). I know this isn't the thread for this, but in case anyone has experience dealing with this, feel free to give me some protips. Currently my reservoir is typical EK 140mm res/D5 combo with the default inlet/outlet ports used, maybe there is something better for high flow loops?


----------



## yzonker

Antsu said:


> Yep, it's definitely diminishing returns at that point, but more flowrate does help. I run a very simple loop of just GPU block and 560 rad, and even then I instantly jump up a few degrees if I set both pumps to 50%.
> 
> I'd love to test what a third pump would do, but I already have some issues where the waterflow is so violent inside my reservoir that air bubbles get sucked back into the loop before they can get to the top and pop (turbulence? sorry for being such a smoothbrain). I know this isn't the thread for this, but in case anyone has experience dealing with this, feel free to give me some protips. Currently my reservoir is typical EK 140mm res/D5 combo with the default inlet/outlet ports used, maybe there is something better for high flow loops?


Almost nothing (3rd D5). I have a chiller with a PMP-500 included. That pump is much more powerful than a D5 and my delta barely budges when I add it to the loop.


----------



## yzonker

KingEngineRevUp said:


> Yeah I was laying down at night thinking things over, and I remember you and I discussed putting another D5 in my case, which would definitely help because I have a stupid amount of 90 degree fittings. The money ultimately went into new fans though, which bring my water temperatures down 3-5C compared to the Corsair QL120s, which are horrible radiator fans BTW.
> 
> Another D5 is in the cards one day.
> 
> 
> 
> I do not have a flowrate meter in my loop unfortunately.


Reality is it's zero performance gain anyway. It's more about knowing you've gotten everything you can, I think. I do like having 2 pumps for redundancy though.


----------



## th3illusiveman

alasdairvfr said:


> @J7SC how much gains you getting from DLSS3 in MSFlight? i am finding better than 100% performance improvement on initial run but literally only had a few mins just now to try it. Over 110fps where I used to get 50-some. Finally it looks and runs great, no longer have to choose between looks and performance!


DLSS3 will in theory always offer around 100% more FPS than DLSS 2, since it's essentially just interpolation.
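A toy model of that claim (the overhead fraction is a made-up assumption; real games land somewhere below the ideal case):

```python
def framegen_fps(rendered_fps: float, overhead: float = 0.0) -> float:
    """Idealized frame generation: one interpolated frame is inserted per
    rendered frame, so presented FPS approaches 2x, minus whatever fraction
    the interpolation pass itself costs (the overhead value is a guess)."""
    return 2.0 * rendered_fps * (1.0 - overhead)

print(framegen_fps(50))        # ideal case: 100.0 presented fps
print(framegen_fps(50, 0.1))   # with an assumed 10% overhead: 90.0
```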


----------



## sew333

th3illusiveman said:


> DLSS3 will in theory always offer around 100% more FPS than DLSS 2, since it's essentially just interpolation.


But in the real world, how much? 50%?


----------



## alasdairvfr

sew333 said:


> but in real world how many 50%?


For me it's 100+%. Hard to say exactly, because I am fairly certain the recent patch also improved performance outside the scope of frame generation. At the end of the day it runs super well and is no longer frustrating to play from being so CPU and GPU bottlenecked at the same time.


----------



## alasdairvfr

lucasmedia said:


> Do you mind sharing your 5950x PBO settings in BIOS, I haven't touched my 5950x in a while and would love to revisit.
> 
> Do you think our 4090 is CPU limited in 4k?


I'll share the settings in the am, I bumped my memory a bit which in turn caused me to need to dial in my PBO for stability.

The 5950x can be cpu limiting at 4k in some games but a decent pbo and decent memory speed/timings can help mitigate that.


----------



## Antsu

yzonker said:


> Almost nothing (3rd D5). I have a chiller with a PMP-500 included. That pump is much more powerful than a D5 and my delta barely budges when I add it to the loop.


Yeah, that's what I figured but I might still try it just to confirm it with my own eyes, hehe. My CPU loop "only" has a single D5 so I could just throw the extra pump into that loop once I've satisfied my autism.  Thanks for the input!


----------



## J7SC

Antsu said:


> Yeah, that's what I figured but I might still try it just to confirm it with my own eyes, hehe. My CPU loop "only" has a single D5 so I could just throw the extra pump into that loop once I've satisfied my autism.  Thanks for the input!


...I recently went from 3x D5 in the loop that now has the 4090 down to 2x D5 - no loss in performance, and easier to synchro the older-style D5s ....ON THE OTHER HAND, I never run a loop with just 1x D5, unless for testing purposes. FYI, DerBauer did a YouTube vid for caseking.tv some years back comparing D5 to DDC and he confirmed what I already knew which is 2x D5s in a normal loop is probably a great option as D5s have a larger internal diameter and can at times 'temporarily starve' under certain circumstances. Not an issue with 2x D5s though, on the contrary. Besides, 2x D5 adds a nice fail-safe, for example in workstation and server applications where any kind of unplanned downtime is expensive.

Adding a third one doesn't make that much sense from a performance POV unless it is an extremely long and complex loop with lots of resistance, but I still have another system currently running w/ 3x D5s for a loop with dual w-cooled 2080 Tis, a Threadripper and lots of rads. BTW, I once ran a single giant loop with 6x D5s (I can't really remember why...).


----------



## KingEngineRevUp

KingEngineRevUp said:


> Reporting in on my EKWB results. I have to go grab my son at daycare so will show more photos of the install process and what really annoyed me about the install.
> 
> View attachment 2582945
> 
> 
> At *520W *of power draw the block produces a temperature of *Delta 22C*, hotspot is *8-9C* over GPU temperatures.
> 
> I used MX-6 thermal paste. There's also a bunch of bubbles cycling through my loop still, so not sure if it'll get a little better. My guess is it's not or it'll be negligible like 21C.
> 
> I guess that's not too bad but it's not great either. For reference, this would be *equivalent to running a 3090 at 837W* based off of Optimus graphs from their tweet.
> 
> Edit: And the max temperature on the *memory modules was 54C*... Please just use the stock pads and save yourselves some money...
> 
> Edit 2:
> 
> If these numbers are correct, the EKWB runs about 3C hotter than the prototype Optimus block.
> 
> View attachment 2582946


Here is the full gallery, unboxing and the stupid back plate standoffs, you'll understand why this part was really dumb.



http://imgur.com/a/ArnYRIp


----------



## J7SC

KingEngineRevUp said:


> Here is the full gallery, unboxing and the stupid back plate standoffs, you'll understand why this part was really dumb.
> 
> 
> 
> http://imgur.com/a/ArnYRIp


That's a super-clean build - love it !
---
I have been playing around w/ 3DM Speedway to find the best VRAM clocks, and as @yzonker already suggested, VRAM is more important than GPU clocks in Speedway - though GPU clocks don't hurt either, at least up to a point...that point seems to be die-internal heat generation. I ran about 20 separate tests, most with the GPU clock at a fixed offset. Just changing the PL and voltage slider shows an actual / effective GPU clock that varies by 105 MHz _at the same GPU offset and same ambient temps_...also, higher ambient temps will actually allow a higher starting offset, but that is obviously 'wasted' more quickly by the boost algorithm's temp inputs. All in all, these cards are incredibly efficient at stock (450W on mine) but efficiency falls off a cliff thereafter.


----------



## sew333

alasdairvfr said:


> For me its 100+%. Hard to say entirely because I am fairly certain the recent patch improved performance outside the scope of frame generation. At the end of the day it runs super well and no longer frustrating to play being so cpu and gpu bottlenecked at the same time.





KingEngineRevUp said:


> Here is the full gallery, unboxing and the stupid back plate standoffs, you'll understand why this part was really dumb.
> 
> 
> 
> http://imgur.com/a/ArnYRIp


nice


----------



## jootn2kx

alasdairvfr said:


> For me its 100+%. Hard to say entirely because I am fairly certain the recent patch improved performance outside the scope of frame generation. At the end of the day it runs super well and no longer frustrating to play being so cpu and gpu bottlenecked at the same time.


Agreed. A Plague Tale: Requiem is a nice example of that - the CPU bottleneck was hitting hard on my 5800X3D + 4090 and I saw my GPU usage going down to 50% in heavy parts. Enabled frame generation and boom, my fps went from 80 to 150 and GPU usage stayed around 98-100%. Fantastic.
Looking forward to using frame generation in Cyberpunk and future games.


----------



## alasdairvfr

@lucasmedia


lucasmedia said:


> Do you mind sharing your 5950x PBO settings in BIOS, I haven't touched my 5950x in a while and would love to revisit.


Tried to extract the BIOS as text but this motherboard (x570 Taichi) seems to only export files it can import, my Asus could export as text.

PBO:
PPT 225
TDC 150
EDC 170

Scalar Auto

CO: -26,-26,-21,-30,-29,-30,-19,-25,-30,-30,-30,-30,-30,-30,-27,-30

Cores 2/6 are my "best" cores; the rest I started at -30, then backed off until I could pass stability tests in OCCT without errors. Core 14 just needed to be dialed in a little as well - it sent the PC into a random crash at idle. Keeping an eye on things for the next few days.

Also, when dealing with memory, the default voltages can be too high or too low, so I lowered my SoC voltage from 1.100v to 1.060 or 1.07 (I can't remember any more), and that helped a fair bit in bumping my RAM from 3600 to 3800 while keeping the same timings.

Running 4x16 Crucial Ballistix Micron B-Die which is okay RAM but nothing crazy.
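The dial-in loop described above (start every core at the most aggressive offset, then relax a core toward 0 whenever a stability test fails on it) can be sketched like this; the step size of 3 is an arbitrary assumption:

```python
def next_offset(current: int, test_failed: bool, step: int = 3) -> int:
    """One iteration of the per-core Curve Optimizer dial-in: keep the
    offset if the stability test passed, otherwise back it off toward 0.
    The step size is an arbitrary assumption; never go above 0."""
    return min(current + step, 0) if test_failed else current

co = -30  # start at the most aggressive offset
for failed in (True, True, False):  # two failed OCCT runs, then a pass
    co = next_offset(co, failed)
print(co)  # settles at -24 under this toy sequence
```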


----------



## lawson67

Think this is the best I can get out of my Zotac AMP Extreme with the stock 500W bios, even though in Timespy it does not hit 500W - I don't know why. My VRAM scales all the way up to 2000MHz though, which is pleasing, and the core clocks over 3.0GHz now, so I am happy with that. Will upping the IF help my Ryzen 7 5800X3D?


----------



## Panchovix

lawson67 said:


> Think this is the best i can get out my Zotac Amp extreme with the stock 500w bios even though in timespy is does not hit 500w i don't know why?, my vram scales all the way up to 2000mhz though which is pleasing, and the Core clocks over 3.0ghz now so i am happy with that, will upping the IF help my Ryzen 7 5800x3D?
> 
> View attachment 2583047


You have way better clocks than me but lower scores, did you enable hardware accelerated gpu scheduling and forced ReBar?










(I get lower scores at +1200Mhz on VRAM instead of +1100Mhz or close to that)


----------



## EarlZ

Is there a stress test app I can use to see if my core/vram is stable and not producing rendering errors, without the need for me to constantly watch my screen? I am getting texture flickering in 1 game (on certain water/glass textures) but 3DMark and CP2077 seem to be fine.


----------



## Frosted racquet

EarlZ said:


> Is there a stress test app I can use to see if my core/vram is stable ans not producing rendering errors w/o the need for me to contantly watch my screen? I am getting a texture flickering in 1 game (on certain water/glass textures) but 3dmark,cp2077 seems to be fine.


Try comparing benchmark scores with ECC enabled in the drivers and with it disabled. If your scores go down with ECC enabled, it means you're getting memory errors (ECC is silently correcting them).
Also, try this tool: GitHub - GpuZelenograd/memtest_vulkan: Vulkan compute tool for testing video memory stability
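A sketch of that score comparison (the 0.5% noise threshold is an assumption - calibrate it against your own run-to-run variance first):

```python
from statistics import mean

def ecc_drop_suggests_errors(scores_ecc_off, scores_ecc_on, noise=0.005):
    """Flag a VRAM overclock as error-prone if the mean score with ECC on
    drops clearly below the ECC-off mean; small differences inside the
    noise band are treated as normal run-to-run variance."""
    drop = (mean(scores_ecc_off) - mean(scores_ecc_on)) / mean(scores_ecc_off)
    return drop > noise

print(ecc_drop_suggests_errors([10000, 10010], [9800, 9790]))   # ~2% drop -> True
print(ecc_drop_suggests_errors([10000, 10010], [10005, 9995]))  # within noise -> False
```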


----------



## lawson67

Panchovix said:


> You have way better clocks than me but lower scores, did you enable hardware accelerated gpu scheduling and forced ReBar?
> 
> View attachment 2583054
> 
> 
> (I get lower scores at +1200Mhz on VRAM instead of +1100Mhz or close to that)





Panchovix said:


> You have way better clocks than me but lower scores, did you enable hardware accelerated gpu scheduling and forced ReBar?
> 
> View attachment 2583054
> 
> 
> (I get lower scores at +1200Mhz on VRAM instead of +1100Mhz or close to that)


ReBar is on, but I have not enabled hardware accelerated GPU scheduling or forced ReBar - how do I do that?
Edit: turned on hardware accelerated GPU scheduling, but how do I force ReBar?


----------



## Panchovix

lawson67 said:


> Rebar is on but i have not enable hardware accelerated gpu scheduling and forced ReBar?, how do i do that?


Here is a video that explains it





You just have to select "3DMark TimeSpy" and then change the options in the video to force it; also, you can force it on newer nvidia profile inspector on "Common"


















Since you have a Ryzen 5000, you can force it on Port Royal and it will also help.


----------



## lawson67

Panchovix said:


> Here is a video that explains it
> 
> 
> 
> 
> 
> You just have to select "3DMark TimeSpy" and then change the options in the video to force it; also, you can force it on newer nvidia profile inspector on "Common"
> View attachment 2583058
> 
> View attachment 2583059
> 
> 
> 
> Since you have a Ryzen 5000, you can force it on Port Royal and it will also help.


What power limit are you running though your card, i am only using stock bios 500w, are you using more?


----------



## Panchovix

lawson67 said:


> What power limit are you running though your card, i am only using stock bios 500w, are you using more?


I'm using my stock VBIOS (TUF non-OC), the limit is 600W. Though in Port Royal, I haven't seen more than 500W (maybe some spikes), even at 1.1

On TimeSpy Extreme I see more than 500W


----------



## EarlZ

Frosted racquet said:


> Try comparing benchmark scores with ECC enabled in drivers and with it disabled. If your scores go down with ECC enabled it means you're getting artifacts.
> Also, try this tool GitHub - GpuZelenograd/memtest_vulkan: Vulkan compute tool for testing video memory stability


I haven't tried to compare with ECC enabled yet, but I expected this to lower the score when enabled in any benchmark.

Edit: Did a quick google search and memory bandwidth is expected to drop with ECC enabled. But I'll try to see if that gets rid of the texture flashing, as it only happens in 1 game in 2 areas.


----------



## lawson67

Panchovix said:


> I'm using my stock VBIOS (TUF non-OC), the limit is 600W. Though in Port Royal, I haven't seen more than 500W (maybe some spikes), even at 1.1
> 
> On TimeSpy Extreme I see more than 500W


Your higher power draw bios will definitely help you beat my higher clocks. I will have a go at forcing ReBar etc. later on when I get some time - thanks for the heads up mate.


----------



## Panchovix

lawson67 said:


> your higher power draw bios will definitely help you beat my higher clock's, i will have a go at force rebar etc later on when i get some time, thanks for the heads up mate


Doing those tweaks you will probably get top 100 on HoF, so will wait to see that haha, good luck mate


----------



## alasdairvfr

Maybe I'm alone in this but reBAR for me helps in PR but actually lowers my TimeSpy by a fair bit


----------



## Sheyster

Panchovix said:


> enable hardware accelerated gpu scheduling


Does nVidia have an official recommendation for turning this on for regular gaming? I know this option has been around for a while, in Windows 10 as well. I've heard it only helps for low-end or midrange systems (CPU-wise).


----------



## alasdairvfr

Sheyster said:


> Does nVidia have an official recommendation for turning this on for regular gaming? I know this option has been around for a while, in Windows 10 as well. I've heard it only helps for low-end or midrange systems (CPU-wise).


If you want DLSS3 frame generation you need it enabled, otherwise the option is greyed out. I remember from troubleshooting in the past that disabling this setting could help with some games that had stuttering issues.


----------



## Sheyster

alasdairvfr said:


> if you want DLSS3 frame generation you need it enabled otherwise the option is greyed out. I remember troubleshooting issues in the past disabling this setting could help with some games that had stuttering issues


Then I guess it's probably best to enable it. If I run into stuttering issues it's easy enough to disable.


----------



## Panchovix

Sheyster said:


> Does nVidia have an official recommendation for turning this on for regular gaming? I know this option has been around for a while, in Windows 10 as well. I've heard it only helps for low-end or midrange systems (CPU-wise).


No official recommendation, even in latest drivers they suggest to disable it to fix some issues.

I haven't had any though, so I just leave it enabled, and also helps me a little on benchmarks sometimes lol



alasdairvfr said:


> Maybe I'm alone in this but reBAR for me helps in PR but actually lowers my TimeSpy by a fair bit


I think it helps more on slower CPUs (like my 5800X)


----------



## lawson67

Panchovix said:


> Doing those tweaks you will probably get top 100 on HoF, so will wait to see that haha, good luck mate


Wow, just did a quick run after your tweak recommendations. I am sure I can get better once it's colder, into the 40,000 area, but I am in the top 100 at 39th with my system and graphics score.


----------



## yzonker

alasdairvfr said:


> Maybe I'm alone in this but reBAR for me helps in PR but actually lowers my TimeSpy by a fair bit


Graphics score or overall score? It does nerf your CPU score to some extent.


----------



## alasdairvfr

yzonker said:


> Graphics score or overall score? It does nerf your CPU score to some extent.



Turning on reBAR, I just did a run and it dropped my CPU score by 2k or so, which translates into about a 1400 point drop overall. Graphics score about the same. Am I doing something wrong?

Clock/mem settings both the same (+255/+1600) which is my stable point for almost everything

Rebar off:









Rebar on:


----------



## Frosted racquet

EarlZ said:


> I havent tried to compare with ECC enabled yet but I expected this to actually lower the score when enabled on any benchmark.


I'm not sure, you can compare non-OC results with ECC enabled/disabled. But to my knowledge, ECC will only decrease scores if there are errors corrected.


----------



## yzonker

alasdairvfr said:


> Turning on reBAR just did a run and it dropped my CPU score by 2k or so which translates into about 1400 points drop overall. Graphics score about the same. Am I doing something wrong?
> 
> Clock/mem settings both the same (+255/+1600) which is my stable point for almost everything
> 
> Rebar off:
> View attachment 2583071
> 
> 
> Rebar on:
> View attachment 2583072


It may be the new driver and/or 4090 behaves differently. I haven't fully tested it yet with my new system. I'll try it tonight when I get a chance.


----------



## Panchovix

alasdairvfr said:


> Turning on reBAR just did a run and it dropped my CPU score by 2k or so which translates into about 1400 points drop overall. Graphics score about the same. Am I doing something wrong?
> 
> Clock/mem settings both the same (+255/+1600) which is my stable point for almost everything
> 
> Rebar off:
> View attachment 2583071
> 
> 
> Rebar on:
> View attachment 2583072


Interesting, gonna try tonight how it goes. (I forced it by habit when I was doing it with the 3080, which I can assure increased the score there)
I get about 1k less CPU score on my 5800X when I force it.


----------



## lawson67

Panchovix said:


> Interesting, gonna try tonight how it goes. (I forced it by habit when I was doing it with the 3080, which I can assure increased the score there)
> I get about 1k less CPU score on my 5800X when I force it.


Do you enable ReBar just for 3DMark? I enabled it globally, so it's on for everything.


----------



## KingEngineRevUp

I'm having very low GPU load on Timespy, I assume it's a CPU bottleneck? I didn't enable REBAR and HAGS is on. Any tips? Running Windows 11. 

I scored 26 197 in Time Spy


----------



## Panchovix

lawson67 said:


> Do you enable rebar just for 3d mark? I enabled it to globally so on for everything


I enabled ReBar for TimeSpy, and Port Royal. (and DX12 benchmarks in general, except the next bench)
On SpeedWay there seems to be no difference.
On FireStrike, I get both less GPU and CPU scores with ReBar Forced. (DX11)


----------



## yzonker

KingEngineRevUp said:


> I'm having very low GPU load on Timespy, I assume it's a CPU bottleneck? I didn't enable REBAR and HAGS is on. Any tips? Running Windows 11.
> 
> I scored 26 197 in Time Spy
> 
> View attachment 2583087


Win11 22H2? I had trouble with 22H2 killing my CPU score. Reverting 21H2 fixed it. I don't think my GPU score was hurt so badly though.


----------



## Benni231990

For all Suprim users: Alphacool has announced their waterblock, and you can now pre-order.









Alphacool Eisblock Aurora Acryl GPX-N RTX 4090 Suprim mit Backplate


The next generation - the Alphacool Eisblock Aurora water cooler for GeForce RTX 4XXX graphics cards. To dissipate the enormous waste heat of this graphics card generation as effectively as possible, numerous optimizations were made to the water block in...




www.alphacool.com


----------



## Azazil1190

***Update.
I'm gonna RMA my TUF - even at stock the card can easily reach a 77-80C hotspot.
That much change in hotspot temp after only two weeks is enough.


----------



## alasdairvfr

KingEngineRevUp said:


> I'm having very low GPU load on Timespy, I assume it's a CPU bottleneck? I didn't enable REBAR and HAGS is on. Any tips? Running Windows 11.
> 
> I scored 26 197 in Time Spy
> 
> View attachment 2583087


How is your DDR memory? Timespy will drop CPU/GPU scores if your system memory isn't fast and tight.

Cant speak of the W11, I'm still on 10


----------



## alasdairvfr

I can confirm (again) PR sees increased performance with reBAR forced on. So for me at least its just TimeSpy (normal) that doesn't like it:

reBAR off:










reBAR on:


----------



## Panchovix

Azazil1190 said:


> ***Update.
> Im gonna rma my tuf even at stock the card can reach easy 77-80c hot spot .
> After two weeks so much different at hotspot temp its enough


I've been having similar hotspot temps at ~550W on my TUF since the beginning, I just assumed it's hot AF lol (Core temps don't go above 65-66°C). No change though, would lose the warranty if I open the card.


----------



## KingEngineRevUp

yzonker said:


> Win11 22H2? I had trouble with 22H2 killing my CPU score. Reverting 21H2 fixed it. I don't think my GPU score was hurt so badly though.


I just put the 5800X3D in. I think PBO is off. I don't think it was going above 3.5 GHz. I'm not sure what the issue is. I'm thinking of doing a clean Windows 10 install on an external hard drive for benchmarking.


----------



## GRABibus

Sheyster said:


> Does nVidia have an official recommendation for turning this on for regular gaming? I know this option has been around for a while, in Windows 10 as well. I've heard it only helps for low-end or midrange systems (CPU-wise).


I have higher GPU usage with it enabled (W10 and W11).


----------



## Azazil1190

Panchovix said:


> I've been having similar hotspot temps at ~550W on my TUF since the beginning, I just assumed it's hot AF lol (Core temps don't go above 65-66°C). No change though, would lose the warranty if I open the card.


If it was at 550W it would be fine, but for 450W it's too much.
In the beginning (two weeks ago) it was 10-12C lower at 520W.
The strange thing is the core temp and memory temps are the same as in the beginning:
58-60C max for the core and 62-64C for the memory.
Let's say I'm 2C higher for core and memory, but that's OK, it's nothing.
But the hotspot? 65C (2 weeks ago) vs 75/78C (today) is more than enough.

Of course i don't want to open and repaste and repad the card because is so new.


----------



## J7SC

alasdairvfr said:


> I can confirm (again) PR sees increased performance with reBAR forced on. So for me at least its just TimeSpy (normal) that doesn't like it:
> 
> reBAR off:
> View attachment 2583100
> 
> 
> 
> reBAR on:
> View attachment 2583098


I generally leave global r_BAR on, but that is based mostly on the experience with the 3090. PR definitely gains via forced r_BAR while TimeSpyEx seems to see slightly reduced scores with it on...same for Superposition. But I don't run benchmarks all that much and am more concerned with certain DX12 games, such as FS2020 and CP 2077. In those, it's very hard to discern which r_BAR approach is better. With FS2020, r_BAR forced on seemed to help a bit, not hurt, but I'll have to check that out with the 4090 once I have the time.



Azazil1190 said:


> If it was at 550w it will be fine but for 450w its too much.
> In the beginning it was 10-12c lower (before two weeks)at 520w.
> The strange thing is core temp and memory temps are the same like the beginning.
> 58-60c max for core and 62-64 for the memory.
> Lets say im 2c higher for core and memory but thats ok its nothing.
> But hotspot? 65c(2 weeks ago) vs 75/78c(today)is more than enough


...sounds like TIM application wasn't that good from the factory, and/or the mount has shifted a bit (can happen after multiple heat cycles, depending on the original assembly and on the type of TIM re. pump-out). Relatively higher hotspot compared to general GPU temp is often a sign of that (happened on my 3090, fixed it).

...I would take lots of HWInfo / GPUz screenshots to document the deteriorating hotspot, just in case you want to remount / re-paste later and are worried about warranty coverage.


----------



## 8472

https://twitter.com/i/web/status/1592570552426049536


----------



## GRABibus

Azazil1190 said:


> If it was at 550w it will be fine but for 450w its too much.
> In the beginning it was 10-12c lower (before two weeks)at 520w.
> The strange thing is core temp and memory temps are the same like the beginning
> 58-60c max for core and 62-64 for the memory.
> Lets say im 2c higher for core and memory but thats ok its nothing.
> But hotspot? 65c(2 weeks ago) vs 75/78c(today)is more than enough
> 
> Of course i don't want to open and repaste and repad the card because is so new.


Huuuum

burnt cables, hot spot deterioration….

my hotspot is 9-10 degrees beyond GPU temp on my gaming OC on stock cooler (Screenshot) :











The card has been returned to its original package for now, waiting for the next build (7000 X3D CPU and X670 platform) 😊


----------



## Azazil1190

J7SC said:


> ...sounds like TIM application wasn't that good from the factory, and/or the mount has shifted a bit (can happen after multiple heat cycles, depending on the original assembly and on the type of TIM re. pump-out). Relatively higher hotspot compared to general GPU temp is often a sign of that (happened on my 3090, fixed it).
> 
> ...I would take lots of HWInfo / GPUz screenshots to document the deteriorating hotspot, just in case you want to remount / re-paste later and are worried about warranty coverage.


I think the same, but I don't want to open it because it's new. If it doesn't pass the RMA for that reason, I will repaste.


----------



## Azazil1190

GRABibus said:


> Huuuum
> 
> burnt cables, hot spot deterioration….
> 
> my hotspot is 9-10 degrees beyond GPU temp on my gaming OC on stock cooler.
> 
> The card has returned in its original package currently, waiting for next build (7000 X3D CPU and X670 plateform) 😊


My Giga, thank god, is still fine.


----------



## lawson67

Panchovix said:


> I enabled ReBar for TimeSpy, and Port Royal. (and DX12 benchmarks in general, except the next bench)
> On SpeedWay there seems to be no difference.
> On FireStrike, I get both less GPU and CPU scores with ReBar Forced. (DX11)


I am getting 3 higher FPS in Tomb Raider with forced ReBar on. I'll test some other games in a bit and try to work out how to force ReBar only for certain games, etc. Also, the difference between the Ryzen 7 5800X and the 3D version is insane - I was 46% GPU bound with the Ryzen 7 5800X, but with the 3D one I am 99% GPU bound.


----------



## yzonker

KingEngineRevUp said:


> I just put the 5800X3D in. I think PBO is off. I don't think it was going above 3.5 GHz. I'm not sure what the issue is. I'm thinking of doing a clean Windows 10 install on a external hard for benchmarking.


There is no PBO for the x3D. All you can do is use the PBO tuner app someone wrote that allows you to change CO values from Windows. 

Seems like I recall some people with that issue with it staying at base clocks but can't remember what the issue was. Make sure the CMOS is cleared and maybe even pull the battery. Verify you're on the latest bios. Maybe try switching power plans. A Google search turned up a bunch of hits.

You should do some searches in the 5800x3D thread directly. I think something might turn up. I followed that thread until I went to the blue side last summer.


----------



## Panchovix

Azazil1190 said:


> If it was at 550w it will be fine but for 450w its too much.
> In the beginning it was 10-12c lower (before two weeks)at 520w.
> The strange thing is core temp and memory temps are the same like the beginning
> 58-60c max for core and 62-64 for the memory.
> Lets say im 2c higher for core and memory but thats ok its nothing.
> But hotspot? 65c(2 weeks ago) vs 75/78c(today)is more than enough
> 
> Of course i don't want to open and repaste and repad the card because is so new.


Oh for sure, at 450W it's too hot having near 80°C on the hotspot (I mean, not bad, but too hot for my taste).
The weird thing is that it changed with time - something wasn't going right there.

Good luck with the RMA mate!




lawson67 said:


> I am getting 3 higher FPS in tomb Rader with forced Rebar on, ill test some other games in abit and try to work out how to force rebar on only certain games etc, also the differance between the Ryzen 7 5800x and the 3D version is insane, i was 46% GPU bound with the Ryzen 7 5800x, but with the 3D one i am 99% Gpu Bound


Some games work as default with ReBar, if you have enabled it on the VBIOS (like Forza Horizon, MFS2020, etc), on others you have to force it.
And yeah, I'm waiting for the 7800X3D, else I would have upgraded from my 5800X to a 5800X3D, I get CPU bottlenecked sometimes at 4K


----------



## J7SC

yzonker said:


> There is no PBO for the x3D. All you can do is use the PBO tuner app someone wrote that allows you to change CO values from Windows.
> 
> Seems like I recall some people with that issue with it staying at base clocks but can't remember what the issue was. Make sure the CMOS is cleared and maybe even pull the battery. Verify you're on the latest bios. Maybe try switching power plans. A Google search turned up a bunch of hits.
> 
> You should do some searches in the 5800x3D thread directly. I think something might turn up. I followed that thread until I went to the blue side last summer.


...there actually are some recent custom bios listed over at the Asus X570 thread which allow for CO (and other?) oc steps for 5800X3D...come with the usual 'use at your own risk' disclaimer.


----------



## cheddardonkey

KingEngineRevUp said:


> I just put the 5800X3D in. I think PBO is off. I don't think it was going above 3.5 GHz. I'm not sure what the issue is. I'm thinking of doing a clean Windows 10 install on a external hard for benchmarking.


Make sure your bios isn't trying to OC the 5800X3D with autotuning. The X3D will boost to 4.45 on its own, but it will stay locked at 3.4 if your bios attempts to OC it.


----------



## PLATOON TEKK

Optimus Strix/TUF block available


----------



## GRABibus

yzonker said:


> There is no PBO for the x3D.


There are some Modded bioses :









ASUS ROG X570 Crosshair VIII Overclocking & ...






www.overclock.net


----------



## KingEngineRevUp

yzonker said:


> There is no PBO for the x3D. All you can do is use the PBO tuner app someone wrote that allows you to change CO values from Windows.
> 
> Seems like I recall some people with that issue with it staying at base clocks but can't remember what the issue was. Make sure the CMOS is cleared and maybe even pull the battery. Verify you're on the latest bios. Maybe try switching power plans. A Google search turned up a bunch of hits.
> 
> You should do some searches in the 5800x3D thread directly. I think something might turn up. I followed that thread until I went to the blue side last summer.


Thanks, I need to investigate this, because the Timespy graphs clearly show my GPU hits a wall somehow. My core is at 3045-3030 at the start, but during test 1 the GPU usage goes down a lot and the core relaxes too.

Very odd behavior.


----------



## lawson67

Well, I've just managed to get myself into the top 20 on Port Royal


----------



## GRABibus

lawson67 said:


> Well i've just managed to get myself into the top 20 on Port Royal
> 
> View attachment 2583154


Top 20 ?


----------



## yzonker

J7SC said:


> ...there actually are some recent custom bios listed over at the Asus X570 thread which allow for CO (and other?) oc steps for 5800X3D...come with the usual 'use at your own risk' disclaimer.


Yeah, they were doing that way back when I got mine. But I think all it gets you is CO/limits (and only reducing limits), and there's a batch version of PBO Tuner that you can run at startup. I didn't feel like risking bricking my mobo just for that.

I could be wrong as I haven't followed it in a while, but my understanding is most of that stuff is hard-coded into the CPU, so the bios can't change it.


----------



## lawson67

GRABibus said:


> Top 20 ?


Yes 19th in 3Dmark Port Royal top 100 here


----------



## KingEngineRevUp

alasdairvfr said:


> How is your DDR memory? Timespy will drop CPU/GPU scores if your system memory isn't fast and tight.
> 
> Cant speak of the W11, I'm still on 10


3800 MHz CL16, timings are pretty tight


----------



## alasdairvfr

The RTX 4080 looks to be about 30% less performance compared to the 4090.


----------



## alasdairvfr

KingEngineRevUp said:


> 3800 MHz CL16, timings are pretty tight


You should be hitting like 29.5k or more with decent settings; I think something's up with your CPU. Run a few CBs and test your stability in OCCT (plus the CPU/memory bench). I discovered that some minor stability issues that never cause crashing can still plummet performance due to errata.


----------



## Panchovix

lawson67 said:


> Yes 19th in 3Dmark Port Royal top 100 here


It seems the ReBar and HAGPU Scheduling did help you; that's a small (or not so small) bonus to PR and TS/TSE


----------



## lawson67

Panchovix said:


> It seems the ReBar and HAGPU Scheduling did help you; that's a small (or not so small) bonus to PR and TS/TSE


I got into the 40,000 graphics score now, but I think I need the 600w bios to get much higher; my Zotac 500w bios is not cutting it. Plus I don't even know how to flash the bios on an Nvidia card. I've been team red for years until the 4090 came out; I can flash any bios on AMD cards with my eyes closed lol


----------



## yzonker

lawson67 said:


> I got into the 40,000 graphics score now, but I think I need the 600w bios to get much higher; my Zotac 500w bios is not cutting it. Plus I don't even know how to flash the bios on an Nvidia card. I've been team red for years until the 4090 came out; I can flash any bios on AMD cards with my eyes closed lol
> 
> View attachment 2583164


Just,

nvflash64 -b <original bios> (backup)
nvflash64 --protectoff

Disable card in device manager. If you do this first, the protectoff step will re-enable it anyway.

nvflash64 -6 <new bios>

Get the latest NVflash from Techpowerup.


----------



## GRABibus

yzonker said:


> Yea they were doing that way back when I got mine. But I think all it gets is CO/limits (and only reducing limits), and there's a batch version of PBO Tuner that you can run at startup. I didn't feel like risking bricking my mobo just for that.
> 
> I could be wrong as I haven't followed it in a while, but my understanding is that most of that stuff is hard-coded into the CPU, so the bios can't change it.


ask @Reous









ASUS ROG X570 Crosshair VIII Overclocking &... (www.overclock.net)


----------



## GRABibus

lawson67 said:


> Yes 19th in 3Dmark Port Royal top 100 here


Hey I am first 









3DMark.com search (www.3dmark.com)


----------



## lawson67

GRABibus said:


> Hey I am first
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3DMark.com search (www.3dmark.com)


Fantastic well done mate


----------



## KingEngineRevUp

alasdairvfr said:


> You should be hitting like 29.5k or more with decent settings; I think something's up with your CPU. Run a few CBs and test your stability in OCCT (plus the CPU/memory bench). I discovered that some minor stability issues that never cause crashing can still plummet performance due to errata.


It's my CPU boost. It's not going above 3.5 GHz. If you look at my test URL you can see it's not boosting.

I'm going to reset CMOS on bios, a lot of people said that fixed their issue.


----------



## changboy

I tried Nvidia Inspector and scored higher in Port Royal:








Result not found (www.3dmark.com)





But my game Uncharted doesn't work anymore. I just changed the 3 settings shown in the video; I don't know why the game won't start anymore. Any idea?


----------



## yzonker

KingEngineRevUp said:


> It's my CPU boost. It's not going above 3.5 GHz. If you look at my test URL you can see it's not boosting.
> 
> I'm going to reset CMOS on bios, a lot of people said that fixed their issue.


Yea I think I had that problem originally. I re-loaded a profile I had saved to get my ram timings back, but that loaded some PBO stuff I think that makes the CPU unhappy.


----------



## KingEngineRevUp

yzonker said:


> Yea I think I had that problem originally. I re-loaded a profile I had saved to get my ram timings back, but that loaded some PBO stuff I think that makes the CPU unhappy.


That's exactly what I did. I loaded my profile for my 5900x thinking it would just load my memory timings.

That's what I get for being lazy and not wanting to put in my memory timings manually.

Thanks! I think you figured it out! Will test when I get home.


----------



## yzonker

KingEngineRevUp said:


> That's exactly what I did. I loaded my profile for my 5900x thinking it would just load my memory timings.
> 
> That's what I get for being lazy and not wanting to put in my memory timings manually.
> 
> Thanks! I think you figured it out! Will test when I get home.


It took me a while to remember, but I'm 100% certain I did that. The other way to get a bit more performance is to bump up the bclk. But that has some hazards with Ryzen as everything increases and can play havoc with your drives, etc... I could reliably do 102 on my machine. It did help my CPU scores. This is what it ran with my 3080ti.









I scored 21 044 in Time Spy: AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 3080 Ti x 1, 32768 MB, 64-bit Windows 10 (www.3dmark.com)
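For anyone estimating what a bclk bump buys: clocks derived from the base clock scale linearly with it, so 102 MHz bclk lifts everything by 2%. A quick sketch with made-up nominal clocks (illustrative values only, not measured from any card here):

```python
def scaled_clock(nominal_mhz, bclk_mhz, ref_bclk=100.0):
    """Clocks derived from the base clock scale linearly with it."""
    return nominal_mhz * bclk_mhz / ref_bclk

# e.g. a hypothetical 4500 MHz boost clock at bclk 102:
print(scaled_clock(4500, 102))  # 4590.0
print(scaled_clock(1800, 102))  # 1836.0 (fabric, memory controller, etc.)
```

This is also why drives and other peripherals can act up: their reference clock moves by the same 2%.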


----------



## yzonker

Tested reBar on/off in TS. About what I expected other than my CPU score took just enough of a hit for the combined to be slightly lower. These are just test runs and not max OC.

Off: I scored 36 254 in Time Spy: Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11 (www.3dmark.com)

On: I scored 36 229 in Time Spy: Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11 (www.3dmark.com)
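For scale, the gap between those two runs is well inside run-to-run variance; a quick check using the two totals above:

```python
# Percent difference between the two Time Spy runs quoted above.
off_score, on_score = 36_254, 36_229
delta_pct = (on_score - off_score) / off_score * 100
print(round(delta_pct, 3))  # -0.069, i.e. under 0.1%
```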


----------



## changboy

I've been able to start Uncharted; I just restored to default the latest settings the guy changed in the video.

Changing things can also affect programs you use, not just benchmarks.


----------



## KingEngineRevUp

So how do we extrapolate data from Der8auer test of an Alphacool block?






He's at 400W, 45C and +10C hotspot, but he doesn't say his water temperature. He has the card connected to a MORA all by itself. Is it safe to assume it's around room temperature then, 25-26C? I mean, he could also be at 22C or 28C.
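One way to bound it: fold the unknown into a block thermal resistance (die minus water, over power) and sweep the plausible water temps. The 22-28C range here is my assumption, not a number from the video:

```python
def thermal_resistance(t_die_c, t_water_c, power_w):
    """Block thermal resistance in K/W: (die temp - water temp) / power."""
    return (t_die_c - t_water_c) / power_w

# 45 C core at 400 W, sweeping assumed water temperatures:
for t_water in (22, 25, 28):
    print(f"{t_water} C water -> {thermal_resistance(45, t_water, 400):.4f} K/W")
```

So the uncertainty in water temperature alone swings the implied block performance by roughly +/-15%, which is worth keeping in mind before comparing against other blocks.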


----------



## bmagnien

Finally got the motivation to clean between all my fans and rads (gross). Slightly better scores, no artifacts, and probably the max I'll see on the Giga OC with the Bykski block for the foreseeable future:









I scored 11 116 in Speed Way: AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11 (www.3dmark.com)

I scored 28 830 in Port Royal: AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11 (www.3dmark.com)


----------



## Silent Scone

Azazil1190 said:


> If it was at 550w it will be fine but for 450w its too much.
> In the beginning it was 10-12c lower (before two weeks)at 520w.
> The strange thing is core temp and memory temps are the same like the beginning
> 58-60c max for core and 62-64 for the memory.
> Lets say im 2c higher for core and memory but thats ok its nothing.
> But hotspot? 65c(2 weeks ago) vs 75/78c(today)is more than enough
> 
> Of course i don't want to open and repaste and repad the card because is so new.





J7SC said:


> I generally leave global r_BAR on, but that is based mostly on the experience with the 3090. PR definitely gains via forced r_BAR while TimeSpyEx seems to see slightly reduced scores with it on...same for Superposition. But I don't run benchmarks all that much and am more concerned with certain DX12 games, such as FS2020 and CP 2077. In those, it's very hard to discern which r_BAR approach is better. With FS2020, r_BAR forced on seemed to help a bit, not hurt, but I'll have to check that out with the 4090 once I have the time.
> 
> 
> 
> ...sounds like TIM application wasn't that good from the factory, and/or the mount has shifted a bit (can happen after multiple heat cycles, depending on the original assembly and on the type of TIM re. pump-out). Relatively higher hotspot compared to general GPU temp is often a sign of that (happened on my 3090, fixed it).
> 
> ...I would take lots of HWInfo / GPUz screenshots to document the deteriorating hotspot, just in case you want to remount / re-paste later and are worried about warranty coverage.


I’ve just fitted the EKWB block to my TUF and the contact is really poor. Double checked torque, pads and remounted but GPU temps sit at 55-60c and hot spot can breach 100c no problem at 400-450w. Going to tear it down again tomorrow, hoping it’s PEBCAK and not poor machining.


----------



## KingEngineRevUp

Silent Scone said:


> I’ve just fitted the EKWB block to my TUF and the contact is really poor. Double checked torque, pads and remounted but GPU temps sit at 55-60c and hot spot can breach 100c no problem at 400-450w. Going to tear it down again tomorrow, hoping it’s PEBCAK and not poor machining.


Get a hair dryer, heat all the pads up so they soften up, and then torque the bolts down as much as possible.

Also, you are using the stock pads right?


----------



## yzonker

Silent Scone said:


> I’ve just fitted the EKWB block to my TUF and the contact is really poor. Double checked torque, pads and remounted but GPU temps sit at 55-60c and hot spot can breach 100c no problem at 400-450w. Going to tear it down again tomorrow, hoping it’s PEBCAK and not poor machining.


There was a user on reddit the other day that I was trying to help that had the same problem. Maybe the discussion is useful. Seemed to eventually get it better by just compressing the pads more as @KingEngineRevUp suggests. 


__
https://www.reddit.com/r/watercooling/comments/yvd8nu

This is why I went with the thermal putty...


----------



## Spit051261

Hey guys
Has anyone had any luck flashing the Zotac with a Gigabyte BIOS ?
The card is great but it needs more power and hopefully with another BIOS , I can get it to work on the Dark which it won't at the moment.








spit051261's GPUPI - 1B score: 1sec 207ms with a GeForce RTX 4090 @ 3075/2947.5 MHz, ranking #2 worldwide and #2 in the hardware class (hwbot.org)


----------



## Silent Scone

yzonker said:


> There was a user on reddit the other day that I was trying to help that had the same problem. Maybe the discussion is useful. Seemed to eventually get it better by just compressing the pads more as @KingEngineRevUp suggests.
> 
> 
> __
> https://www.reddit.com/r/watercooling/comments/yvd8nu
> 
> This is why I went with the thermal putty...


Yeah, I used the supplied pads from EK for which the GPU side are the same thickness they’ve always supplied (never had issues before), but the tolerances might be too little this gen. Will strip it down again.


----------



## KingEngineRevUp

yzonker said:


> Yea I think I had that problem originally. I re-loaded a profile I had saved to get my ram timings back, but that loaded some PBO stuff I think that makes the CPU unhappy.


Thanks again, CMOS did the trick.









Result (www.3dmark.com)


----------



## morph.

Silent Scone said:


> I’ve just fitted the EKWB block to my TUF and the contact is really poor. Double checked torque, pads and remounted but GPU temps sit at 55-60c and hot spot can breach 100c no problem at 400-450w. Going to tear it down again tomorrow, hoping it’s PEBCAK and not poor machining.


Got my EK 4090 block for my Strix yesterday and I've been too lazy to do the teardown and install so far... this is not good news to hear; please keep me posted on findings/learnings....


----------



## Spit051261

Managed to get the BIOS working but the card is getting a bit toasty.
Card now showing as a Gigabyte card lol


----------



## Silent Scone

morph. said:


> Got my ek 4090 block for my strix yesterday and I've been too lazy to do a teardown and install for now... this is not good news to hear please keep me posted in findings/learnings....


Will do. Hardline build so not going to get time to drain it till later. Bit disappointed as fitted many blocks in the past and never had a problem, although I know some people have had pad issues.


----------



## J7SC

...pad issues seem to be more prevalent this time around; I ended up using thermal putty for VRAM and other select bits.

In other news, I did my weekly check on the availability of 4090s (excluding FE) just now, and all major chains in Canada were still sold out


----------



## KedarWolf

J7SC said:


> ...pad issues seem to be more prevalent this time around; I ended up using thermal putty for VRAM and other select bits.
> 
> In other news, I did my weekly check on the availability of 4090s (excluding FE) just now, and all major chains in Canada were still sold out


Memory Express let me pay in advance and get my Strix OC 4090 when they had stock, but they prioritize those who buy their third-party warranty first.

I paid the $300 CAD for it but only because they'll still honour it if I install a water block, just as long as I don't have water damage.


----------



## jootn2kx

Flashed my Gainward 4090 non-GS card last night with the Gigabyte OC bios I found at techpowerup.com.
It's definitely scoring higher in 3DMark, and I'm also getting higher base clocks with the same power usage.
In games I've been seeing the same, so the Gigabyte bios seems a little more efficient; not a huge difference but relevant enough.

Gonna stay on this bios I think =) working flawless so far


----------



## KedarWolf

Does anyone want to test the V2 ASUS BIOS that came out maybe a week ago?

If you do, I can save both the 120 and the 133 power limit BIOSes with nvflash and share them here.

They are both 600W BIOSes.


----------



## doom3crazy

KedarWolf said:


> Does anyone want to test the V2 ASUS BIOS that came out maybe a week ago?
> 
> If you do, I can save both the 120 and the 133 power limit BIOS's with nvflash and share them here.
> 
> They are both 600W BIOS's.


Would they work on the MSI Gaming Trio? I'd be down. What's the difference between them if 120 and 133 on the power limit are both 600w? Or am I not understanding that right?


----------



## Zero989

doom3crazy said:


> would they work on the msi gaming trio ? Id be down. Whats the difference between them if 120 and 133 on the power limit are both 600w? or am I not understanding that right?


Yes. If both max out at the same wattage, the lower max % means the base (100%) power is higher.
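Assuming both switch positions top out at the same 600 W connector limit (my assumption, matching the figures in this thread), the implied 100% base power is just the max wattage divided by the max slider percentage:

```python
def base_power_w(max_watts, max_pct):
    """100% ('base') power implied by a max wattage at a max power-limit %."""
    return max_watts * 100.0 / max_pct

print(base_power_w(600, 133))  # ~451 W base with a 133% slider
print(base_power_w(600, 120))  # 500 W base with a 120% slider
```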


----------



## doom3crazy

KedarWolf said:


> Does anyone want to test the V2 ASUS BIOS that came out maybe a week ago?
> 
> If you do, I can save both the 120 and the 133 power limit BIOS's with nvflash and share them here.
> 
> They are both 600W BIOS's.


I’m game. 


Zero989 said:


> Yes, the base power is higher if the % is lower


Makes sense. Well yeah I’m down to try it then haha. Would I still have access to full fan speed? I was thinking I read with liquid bios you don’t have full rpm or something ?


----------



## Zero989

doom3crazy said:


> I’m game.
> 
> Makes sense. Well yeah I’m down to try it then haha. Would I still have access to full fan speed? I was thinking I read with liquid bios you don’t have full rpm or something ?


No idea; you'd need to measure in dBA, not rely on hardware monitoring


----------



## heptilion

I'm getting lower benchmark scores with same settings after the MW2 warzone update. Lost around 30fps AVG. Can anyone confirm?


----------



## sew333

I compared fps in gameplay in Cyberpunk 2077 (1440p, DLSS Quality, RT Ultra) with my RTX 4090 on a Ryzen 9 7950X with DDR5-6000 versus my stock 12900K with DDR4-3600, and I get the same fps. So is it right that these CPUs are comparable? I tested on a CPU-bound scene.

I guess SMT was fixed on Ryzen


----------



## morph.

Silent Scone said:


> Will do. Hardline build so not going to get time to drain it till later. Bit disappointed as fitted many blocks in the past and never had a problem, although I know some people have had pad issues.


Yep, I was one of the early people that had a 3090 Strix with ek block-supplied pad issues last time around. This time round got everything on hand but was hesitant to jump right into it, especially after what you mentioned haha...


----------



## alasdairvfr

KedarWolf said:


> Memory Express let me pay in advance and get my Strix OC 4090 when they had stock, but they prioritize those who buy their third-party warranty first.
> 
> I paid the $300 CAD for it but only because they'll still honour it if I install a water block, just as long as I don't have water damage.


My local store told me they can NOT condone overclocking from a warranty standpoint... but can we sell them the story that installing a block != overclocking ??


----------



## alasdairvfr

SMT off gave a little CPU boost, still not able to crack 40k on GPU



















I scored 33 070 in Time Spy: AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 10 (www.3dmark.com)


----------



## lawson67

Spit051261 said:


> Manged to get BIOS working but card getting a bit toasty
> Card now showing as a Gigabyte card lol
> Card
> View attachment 2583223


I would use liquid metal on the die if I ran that bios, as it should bring temps down quite a bit and it won't dry up. Normal paste will pump out and dry up fast at 600w, which is why some users are already seeing their hotspot and GPU temps rising after just a few weeks of owning these cards


----------



## GRABibus

lawson67 said:


> I am getting 3 higher FPS in tomb Rader with forced Rebar on, ill test some other games in abit and try to work out how to force rebar on only certain games etc, also the differance between the Ryzen 7 5800x and the 3D version is insane, i was 46% GPU bound with the Ryzen 7 5800x, but with the 3D one i am 99% Gpu Bound
> 
> View attachment 2583131


Do you play Spiderman Remastered ?

Did you check your GPU usage with 5800X3D at 4K with Ray tracing + DLSS (Frame generation disabled) ?

I am curious about that.


----------



## lawson67

GRABibus said:


> Do you play Spiderman Remastered ?
> 
> Did you check your GPU usage with 5800X3D at 4K with Ray tracing + DLSS (Frame generation disabled) ?
> 
> I am curious about that.


No i don't have that game sorry


----------



## alasdairvfr

lawson67 said:


> I would use liquid metal on the die if I ran that bios, as it should bring temps down quite a bit and it won't dry up. Normal paste will pump out and dry up fast at 600w, which is why some users are already seeing their hotspot and GPU temps rising after just a few weeks of owning these cards


looks like stock temp creep is prevalent on these 4090s as well, I wonder if some cards are worse than others. My 3080 TUF OC went from having good temps to terrible in 8 months or so. I sold it rather than repasting as I already had a 3080TI. Big issue with LM though is for vertical mounts, I'd be concerned about it seeping down due to gravity.


----------



## Benni231990

heptilion said:


> I'm getting lower benchmark scores with same settings after the MW2 warzone update. Lost around 30fps AVG. Can anyone confirm?


Me too, I also got lower FPS with the update


----------



## GRABibus

Benni231990 said:


> Me to i also got lower FPS with the update


Yes, like Modern Warfare 2019: -10% fps with each Warzone update...


----------



## dk_mic

alasdairvfr said:


> SMT off gave a little CPU boost, still not able to crack 40k on GPU
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 33 070 in Time Spy
> 
> 
> AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 10}
> 
> 
> 
> 
> www.3dmark.com


You have a lot of headroom on your CPU score. Do a manual all-core overclock; depending on your chip you can do ~4800 MHz with SMT off.

Here, I have 18.7k CPU score with like 4800/4700 @ 1.35 or so








I scored 33 235 in Time Spy: AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10 (www.3dmark.com)





Max memory for my card is +1000 MHz, still enough to hit top 10 for 5950x / 4090 in firestrike and timespy in all its flavors. Only in PR I can't go much over 28000. The core also tops at 3060 or so. Benching with the suprim x bios on a gaming x trio. The liquid bios with 600W would be better i guess, not sure if the loss of 1000 rpm on the fans would actually make a difference.

I find 3DMark really annoying to bench, since most of the benches have different preferences for PBO/SMT/HAGS/ReBAR.

and btw, new driver is out, didnt test it yet GeForce Game Ready Driver | 526.98 | Windows 10 64-bit, Windows 11 | NVIDIA


----------



## Silent Scone

morph. said:


> Yep, I was one of the early people that had a 3090 Strix with ek block-supplied pad issues last time around. This time round got everything on hand but was hesitant to jump right into it, especially after what you mentioned haha...


Yeah I didn't block Ampere, but I've heard a lot of people had a similar experience. Makes me wonder when it comes to EK.


----------



## Sheyster

KedarWolf said:


> Does anyone want to test the V2 ASUS BIOS that came out maybe a week ago?
> 
> If you do, I can save both the 120 and the 133 power limit BIOS's with nvflash and share them here.
> 
> They are both 600W BIOS's.


I would try the 120% 500/600w PL BIOS on my GB-G-OC. Who knows, I might get better memory OC.. Wishful thinking I know..


----------



## yzonker

dk_mic said:


> You have a lot of headroom on your CPU score. Do a manual all core overclock.. Depending on your chip you can do ~ 4800 MHz with SMT off.
> 
> Here, I have 18.7k CPU score with like 4800/4700 @ 1.35 or so
> 
> I scored 33 235 in Time Spy: AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10 (www.3dmark.com)
> 
> Max memory for my card is +1000 MHz, still enough to hit top 10 for 5950x / 4090 in firestrike and timespy in all its flavors. Only in PR I can't go much over 28000. The core also tops at 3060 or so. Benching with the suprim x bios on a gaming x trio. The liquid bios with 600W would be better i guess, not sure if the loss of 1000 rpm on the fans would actually make a difference.
> 
> I find 3DMark really annoying to bench, since most of the benches have different preferences for PBO/SMT/HAGS/ReBAR.
> 
> and btw, new driver is out, didnt test it yet GeForce Game Ready Driver | 526.98 | Windows 10 64-bit, Windows 11 | NVIDIA

Some of the people here might be interested in this fix in the new driver,

"G-SYNC logo is not displayed in the LG OLED TV menu when connected to GeForce RTX 40 series GPUs "


----------



## Sheyster

New 526.98 driver today:









GeForce Game Ready Driver | 526.98 | Windows 10 64-bit, Windows 11 | NVIDIA. Released 2022.11.16 (www.nvidia.com)


----------



## J7SC

lawson67 said:


> I would use liquid metal on the die if I ran that bios, as it should bring temps down quite a bit and it won't dry up. Normal paste will pump out and dry up fast at 600w, which is why some users are already seeing their hotspot and GPU temps rising after just a few weeks of owning these cards


I've used LM before on horizontally-mounted CPUs and GPUs with a surrounding barrier of MX4, but it can still get spooky very quickly (especially with vertical mount)...


----------



## KedarWolf

doom3crazy said:


> I’m game.
> 
> Makes sense. Well yeah I’m down to try it then haha. Would I still have access to full fan speed? I was thinking I read with liquid bios you don’t have full rpm or something ?


Here's the 133 Power Limit BIOS, the one I use.


----------



## Sheyster

Looks like CableMod is offering a "basic" version of the 12VHPWR cable in 2x, 3x and 4x PSU connector variants, with faster delivery:






CableMod Basics E-Series 12VHPWR PCI-e Cable for EVGA G7 / G6 / G5 / G3 / G2 / P2 / T2 (store.cablemod.com)






> As our custom sleeved cables are currently experiencing a lead time of 3-4 weeks, these non-sleeved cables are handled by a separate production team, and are a good option for those looking for a cable that can be delivered in a week or less.


----------



## Sheyster

KedarWolf said:


> Here's the 133 Power Limit BIOS, the one I use.


Any chance to also get the V2 120% BIOS? Thanks in advance!


----------



## J7SC

Silent Scone said:


> Yeah I didn't block Ampere, but I've heard a lot of people had a similar experience. Makes me wonder when it comes to EK.


...I mentioned before that I actually replaced my EK 3090 Strix block due to several quality issues and mounted a Phanteks Glacier block for the 3090 Strix instead. For the 4090, the Bykski block is performing really well so far (just as with another card w/ Bykski in my work+play setups). 



Sheyster said:


> I would try the 120% 500/600w PL BIOS on my GB-G-OC. Who knows, I might get better memory OC.. Wishful thinking I know..


...depending what you mean by 'better memory OC' on the Giga-G-OC, it can also be temp-related. With air-cooling on my Giga-G-OC, I could have either the VRAM hit its most efficient speed around 1500 MHz, or clock the GPU up to max (all with 133% PL), but could not do both at the same time - likely due to heat problems (hotspot was getting out of control) that must have affected the internal IMC. After water-cooling, I can max both the core and VRAM at the same time.


----------



## Sheyster

J7SC said:


> ...depending what you mean by 'better memory OC' on the Giga-G-OC, it can also be temp-related.


I don't think temp is an issue for me. I know sometimes memory timings can be different with these vBIOS. Figured it might be worth a try since +1400 mem is my hard limit. Additionally, I like the idea of using a vBIOS with a default PL of 500w, hence my request for the other BIOS he has on the other switch setting.


----------



## PhuCCo

I ordered the Alphacool FE block and should get it in the next couple of weeks. I'll post results when I get it mounted.


----------



## GRABibus

Sheyster said:


> Any chance to also get the V2 120% BIOS? Thanks in advance!


This one?









Asus RTX 4090 VBIOS: 24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory (www.techpowerup.com)


----------



## Sheyster

GRABibus said:


> This one?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Asus RTX 4090 VBIOS: 24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory (www.techpowerup.com)


That's the one, but it might be the version that originally shipped with the card. I believe @KedarWolf has the newer V2 version that was released.


----------



## KingEngineRevUp

PhuCCo said:


> Alphacool


Are you sure you did? Because they haven't released one yet. I'm pretty sure you purchased the "reference" one meant for reference boards like the Inno3D. I would cancel the order ASAP!


----------



## lawson67

alasdairvfr said:


> looks like stock temp creep is prevalent on these 4090s as well, I wonder if some cards are worse than others. My 3080 TUF OC went from having good temps to terrible in 8 months or so. I sold it rather than repasting as I already had a 3080TI. Big issue with LM though is for vertical mounts, I'd be concerned about it seeping down due to gravity.


I've used LM on my RX 6800 XT and my RX 6900 XT. If applied correctly (i.e. not overloaded, putting too much on the die) it should not seep off, as it tends to stick to itself. I've never had any problems with LM; as long as you put 2 coats of nail polish over the semiconductors around the die it's safe even if you do put too much on the die. LM can greatly reduce temps and it does not dry out. I'll do my 4090 if temps creep up.


----------



## dr/owned

doom3crazy said:


> would they work on the msi gaming trio ? Id be down. Whats the difference between them if 120 and 133 on the power limit are both 600w? or am I not understanding that right?


I think I tried the Strix on my Trio and it was no-go with one of the DP outputs.


----------



## cheddardonkey

..


----------



## bmagnien

yzonker said:


> Some of the people here might be interested in this fix in the new driver,
> 
> "G-SYNC logo is not displayed in the LG OLED TV menu when connected to GeForce RTX 40 series GPUs "


Nice call! Confirmed fixed (it was only showing 'VRR' prior):


----------



## bmagnien

Lol - imagine hundreds of noobs posting photos and sending their melted cables to GN just for GN to forensically expose them for gross user error:


----------



## cheddardonkey

Has anyone custom modified their bios yet or have the ability to?


----------



## changboy

I always use liquid metal, but I protect all the SMD modules around the chip, because most of the time you can see little splashes around the chip, and if you don't protect those modules you can get problems.


----------



## Sheyster

yzonker said:


> Some of the people here might be interested in this fix in the new driver,
> 
> "G-SYNC logo is not displayed in the LG OLED TV menu when connected to GeForce RTX 40 series GPUs "


Also:



> Introduces DLSS Frame Generation support for VSync when G-SYNC is enabled


----------



## Sheyster

cheddardonkey said:


> Has anyone custom modified their bios yet or have the ability to?


Nope.


----------



## alasdairvfr

So it's not so much the bending as the propensity for a taut cable to slowly dislodge over time if the clip didn't fully engage.
On a visual inspection of the connection, the connector should be truly flush with the socket:


----------



## newls1

Took delivery of the new Phanteks G40 waterblock for my Giga Gaming OC 4090 this morning and just wanted to post up pics for those of you wondering how the Phanteks block fits and looks.

Other than that, I think my house is going to burn down, because the power plug is bent more than allowed due to radiator clearance. I had no choice but to gently massage the connector down with the least amount of bend possible, but it's clearly a sharper bend than I wanted... oh well. My CableMod cable should be here at some point soon to fix that issue.

The only thing I don't like about this block is the backplate. It doesn't seem to make the best contact with the back components, not as well as I would have hoped, but hopefully that will be OK. With my 2x 420mm rads blowing down into the case, the GPU backplate will have plenty of airflow at least. Anyways, here are some pics....

[photos of the block install]

*^^^^^ Is my house going to burn down?????!!!!! ^^^^^*


----------



## Silent Scone

Remounted and checked for good contact / heated up the pads and hot spot temps seem to be much improved. Around 70c at 450w. Certainly better than 105.


----------



## doom3crazy



dr/owned said:


> I think I tried the Strix on my Trio and it was no-go with one of the DP outputs.


Did you try hdmi ? And what bios are you running now ?


----------



## Azazil1190

Does anyone have the stock TUF BIOS?
Non-OC, the regular TUF.


----------



## Netherwind

I'm having some fan problems with my Gigabyte Gaming OC.

First I noticed that Fan 1 (which according to Gigabyte Control Centre is left and middle fan) and Fan 2 (which according to Gigabyte Control Centre is right fan) spin at different RPMs. The difference is about 1242 RPM vs 1060 RPM at 59% speed. Not sure if this is normal but OK.

Next thing is worse. Below a certain fan speed % (~50% in my case) Fan 1 will start spinning up/down every second (even if the temperature doesn't fluctuate at all) while Fan 2 RPM is pretty much fixed. Below ~40% fan speed Fan 2 will also behave like this. Not only does this show up in MSI AB HW monitoring, I can visually see it since the card has these RGB halos and I see Fan 2 through the backplate cutout - the fans do indeed look like they're gaining speed and slowing down intermittently.

Below is an example when I was playing a non-demanding game which made Fan 1 start spinning up and down every second or less (30-33% fan speed) and Fan 2 stayed at 0 RPM (not visible in the picture). This is at stock (OC BIOS), the same behavior happens with the Silent BIOS.

My current workaround is to set a fixed fan speed of 59% which makes the fans run at a stable RPM.


----------



## Panchovix

Azazil1190 said:


> Does anyone have the stock tuf bios ?
> Non oc ,the regular tuf


I have the stock performance TUF non-OC VBIOS (both V1 and V2), that's the one you want?

EDIT: saw your message, sent it.


----------



## PhuCCo

KingEngineRevUp said:


> Are you sure you did? Because they haven't released one yet. I'm pretty sure you purchased the "reference" one meant for reference boards like the Inno3D. I would cancel the order ASAP!











Alphacool Eisblock Aurora Acryl GPX-N RTX 4090 Founders Edition mit Backplate


The next generation - the Alphacool Eisblock Aurora water cooler for GeForce RTX 4XXX graphics cards. To carry away the enormous waste heat of this graphics card generation as effectively as possible, numerous optimizations were made to the water cooler in...




www.alphacool.com


----------



## keikei

Guys, how is INNO3D as a brand?


----------



## Brandur

keikei said:


> Guys, how is INNO3D as a brand?


I like the brand  Temperatures are very good and the card is quiet! Got the iChill Black!


----------



## KingEngineRevUp

PhuCCo said:


> Alphacool Eisblock Aurora Acryl GPX-N RTX 4090 Founders Edition mit Backplate
> 
> 
> The next generation - the Alphacool Eisblock Aurora water cooler for GeForce RTX 4XXX graphics cards. To carry away the enormous waste heat of this graphics card generation as effectively as possible, numerous optimizations were made to the water cooler in...
> 
> 
> 
> 
> www.alphacool.com


Wow I emailed them a few days ago and they said they didn't know.

Well let us know how it goes!


----------



## J7SC

KingEngineRevUp said:


> Wow I emailed them a few days ago and they said they didn't know.
> 
> Well let us know how it goes!


The yellow-font line in German means something along the lines of 'likely deliverable in 14 to 15 days'...


----------



## mirkendargen

alasdairvfr said:


> So its not so much the bending as the propensity for a taut cable to slowly dislodge over time if the clip didn't fully engage.
> A visual inspection of the connection the connector should be truly flush with the socket:
> View attachment 2583274
> 
> 
> 
> 
> View attachment 2583275


It's simpler than that. If you can wiggle the cable out without pushing the tab, you're doing it wrong. Super easy to check without removing your cable daily and wearing it out.


----------



## AvengedRobix

Finally have G-Sync properly working again on my LG C1


----------



## GRABibus

Sheyster said:


> New 526.98 driver today:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GeForce Game Ready Driver | 526.98 | Windows 10 64-bit, Windows 11 | NVIDIA
> 
> 
> Download the English (US) GeForce Game Ready Driver for Windows 10 64-bit, Windows 11 systems. Released 2022.11.16
> 
> 
> 
> www.nvidia.com


Stutters in MW2 for me with this set.
Back to 526.86


----------



## Nizzen

GRABibus said:


> Stutters in MW2 for me with this set.
> Back to 526.86


Works smoothly here on the newest driver and a 4090. Maybe the AMD CPU is playing some trick on you


----------



## GRABibus

Nizzen said:


> Works smooth here on newest driver and 4090. Maybe AMD cpu is doing some trick with you


I am still with my 3090 currently.


----------



## Nizzen

GRABibus said:


> I am still with my 3090 currently.


Stutter in games is an AMD thing 
At least it has been for me. Maybe I'm unlucky


----------



## GRABibus

Nizzen said:


> Stutter in games is AMD thing
> Atleast it has been for me. Maybe I'm unlucky


Maybe I had something running in the background…
Will try it again tomorrow.
With 526.86, no stutters at all.


----------



## GRABibus

Your 4090 is paired with 7950X ?


----------



## Nizzen

GRABibus said:


> Your 4090 is paired with 7950X ?


Not anymore, using the 13900K now. Faster in everything.
Sad but true... wanted a 7950X3D day one...


----------



## GRABibus

Nizzen said:


> Not now, using the 13900k now. Faster in everything.
> Sad but true... wanted 7950x3d day one....


I will get the 7950X3D or 7900X3D.
The AM5 platform is also very future-proof.


----------



## AvengedRobix

Nizzen said:


> Not now, using the 13900k now. Faster in everything.
> Sad but true... wanted 7950x3d day one....


From the "big" rumor, there are two 3D versions: six-core and eight-core


----------



## dr/owned

I got the Phanteks MSI block delivered. Looks clean; they clearly got their CNC dialed in and wiped the block down to remove fingerprints. I'll make a separate thread with measurements for thermal pad selection and probably try slamming it together tonight. 'Tis the time of year to break out the depth gauge.


----------



## GRABibus

AvengedRobix said:


> from the "big" rumor the 3d version is two.. six and eight core


Yep, just read a French news article about that.

Only the 7600X3D and 7700X3D.
Launch in spring 2023.


----------



## doom3crazy

With many of the AIB partners releasing BIOS updates, can someone recommend the best "custom" or "third-party" BIOS I could flash to my MSI Gaming Trio? It maxes out at 480W and I just wanna get a little more juice out of it while still having the full RPM range from my fans. If there's a 600W air BIOS that would work on the Trio, that would be great, but I'd settle for a 520-530W BIOS as well. Thanks! There was the Suprim Liquid X BIOS that was posted, but I read somewhere that it maxed the fans out at 2500 RPM, when before flashing they could go to 3500 RPM.


----------



## dr/owned

doom3crazy said:


> With many of the AIB partners all releasing bios updates, can someone out there recommend the best "custom" or "third party" bios I could flash to my msi gaming trio? It maxes out at 480w and I just wanna be able to get a little more juice out of it while still having the full range of rpm from my fans. If there's a 600w air bios that would work on the trio that would be great but i'd settle for a 520-530w bios as well. Thanks! There was the suprim liquid x bios that was posted but I read somewhere that it maxed their fans out at 2500rpm when before they flashed they could go to 3500rpm.


I tried the Gaming OC and Suprim X Liquid and neither allowed the full fan RPM.


----------



## doom3crazy

dr/owned said:


> I tried the Gaming OC and Suprim X Liquid and neither allowed the full fan RPM.


Damn, that's so lame! I wonder if the Suprim Gaming X non-liquid version would allow it to reach full RPM? Also, you mentioned trying the Strix BIOS on the Gaming Trio and it not working. Was it just DP not working? Did you try HDMI?


----------



## dr/owned

doom3crazy said:


> Damn thats so lame! I wonder if the suprim gaming x non liquid version would allow it to get to full rpm? Also you mentioned trying the strix bios on the gaming trio and it not working. was it just DP not working? did you try hdmi?


I didn't try HDMI since all 3 of my monitors are DP-over-fiber. I tried the Suprim X Liquid, Gaming OC, and then went back to the stock-but-updated Trio BIOS, because I was messing with the voltage controller and wanted as much fan RPM as possible. It's soon to not matter for me though with the waterblock. Flashing yourself is pretty low risk with the dual BIOS. If it doesn't work in some horrific way, you just shut down, flip the switch to the other BIOS, boot, flip back to the broken one, then reflash.

BTW though I don't think you really need full RPM most of the time anyways.


----------



## doom3crazy

dr/owned said:


> I didn't try HDMI since all 3 of my monitors are DP-over-fiber. I tried the Suprim X Liquid, Gaming OC, and then went back to the stock-but-updated Trio bios because I was messing with the voltage controller and wanted as much fan RPM as possible. It's soon to not matter for me though with the waterblock. Flashing yourself is pretty low risk with the dual bios. If it doesn't work in some horrific way you just shut down , flip to the other bios switch, boot, flip to the broken one, then reflash.
> 
> BTW though I don't think you really need full RPM most of the time anyways.


Ah yeah, that makes sense. Any way you might be able to post the newest updated Trio BIOS here? I figure if I'm gonna run stock for the moment, I might as well be updated. What changes did they make between BIOS versions, do you know?


----------



## dr/owned

doom3crazy said:


> ah yeah that makes sense. anyway you might be able to post the newest updated trio bios here? I figure if I am gonna run stock for the moment might as well be updated. What changes did they make between bios, do you know?


It's .7z, not .txt. Also, this is the Trio, not the Trio X, but that's irrelevant if you're overclocking.


----------



## doom3crazy

dr/owned said:


> It's .7z not .txt Also this is the Trio not the Trio X but irrelevant if you're overclocking.


Perfect! Yeah, I have just the Trio, not the Trio X. Are there any notes as to what has changed?


----------



## J7SC

AvengedRobix said:


> from the "big" rumor the 3d version is two.. six and eight core


...may be instead of waiting for X3D I should upgrade some workstations with some new Epyc 96C/192T, PCIe 5, DDR5 12-channel and a ton of PCIe lanes...whatever money is left over goes to ElmorLabs for some 'custom tuning tools'


----------



## KedarWolf

Sheyster said:


> That's the one but it might be the version that originally shipped with the card. I believe @KedarWolf has the newer V2 version that was released,


----------



## KedarWolf

If your card has dual BIOSes, you can also flash both of the ASUS BIOSes found on TechPowerUp, then run the V2 BIOS flashing utility; it should update both to the newest version.

I'm pretty sure it'll trick the V2 utility into thinking you have an ASUS card and update both of them.


----------



## Sheyster

KedarWolf said:


> If your card had two BIOS's, you can also flash both the ASUS BIOS's found on TechPowerUp, then run the V2 BIOS flashing utility, it should update both to the newest version.
> 
> I'm pretty sure it'll trick the V2 utility into thinking you have an ASUS card and update both of them.


LOL, funny you mention that! If you hadn't posted up the file I wasn't going to hound you about it; I was just going to do exactly that. 

Thank you!


----------



## doom3crazy

KedarWolf said:


>


thanks for sharing


----------



## dr/owned

doom3crazy said:


> Perfect! yeah i have just the trio. not the trio x. are there any notes as to what has changed?


I don't think they really made any changes that matter if you were already able to get the card working without problems. Initially at launch, I think some AMD platforms were having POST issues with the 4090, so the updates were "improve compatibility".


----------



## dr/owned

Phanteks made a standoff oopsie. I didn't see it until I was measuring standoff heights and one was 1mm off. Fortunately they didn't cross-thread it, so it went in correctly when I put it on square.


----------



## GRABibus

GRABibus said:


> Stutters in MW2 for me with this set.
> Back to 526.86


Retested 526.98 in MW2.
No more stutters...


----------



## Brandur

What did you change?


----------



## doom3crazy

Does anyone have the most recent suprim x 4090 air gaming bios ?


----------



## N19htmare666

This would be top 13 for the 5950X if the driver were approved. Any tips for getting this closer to 29k?
4090 MSI Liquid X with the 600W BIOS. Only had it 1 day, so I can probably OC a bit further.


----------



## J7SC

N19htmare666 said:


> This would be top 13 for the 5950x if the driver was allowed. Any tips for getting this closer to 29k?
> 4090 MSI liquid X with the 600w bios. Only had 1 day so prob can OC a bit further.
> 
> View attachment 2583324
> 
> 
> View attachment 2583325


...Port Royal has a few issues re. potential artefacts, and waiting a few days before the driver is approved by 3DM is not the end of the world anyhow. FYI, I have a few 29+ k PR scores I haven't subbed yet as I plan a bit more benching on the weekend after iterative testing here and there. 

Also, with water-cooling, I basically had to throw out all the MSI AB profiles I gathered the week before with air-cooling, and I have yet to do a custom voltage curve (if I even remember how to do one... 🥴)


----------



## newls1

newls1 said:


> took delivery of the new phanteks G40 waterblock for my Giga Gaming OC 4090 this morning and just wanted to post up pics for those of you guys wondering how the phanteks blocks fits and looks. Other then I think my house is going to burn down cause the power plug is bent more then allowed due to radiator clearance, i had no choice but to gently message the connector down with the least amount of bend possible but its clearly a sharper then I wanted bend...... ol well .. My cable mods cable should be here at somepoint soon to fix that issue. Only issue I dont like about this block is the backplate.. It seems it doesnt make the best contact with the back components as well as i would have hoped, but hopefully that will be ok. With my 2 420mm rads blowing down into case, the GPU backplate will have plenty of air flow at least. Anyways, here are some pics....
> 
> *^^^^^ Is my house going to burn down?????!!!!! ^^^^^*


Finally my coolant came, so I was able to see what the temps were like. Temps are pretty good: running PR with +230 core / +1350 mem in a 68°F room, it ran @ 3045 MHz and 38-40°C! I'm happy with that. Wish I could have seen mem temps, but I didn't have HWiNFO open. Pretty good block and price.


----------



## N19htmare666

J7SC said:


> ...Port Royal has a few issues re. potential artefacts, and waiting a few days before the driver is approved by 3DM is not the end of the world anyhow. FYI, I have a few 29+ k PR scores I haven't subbed yet as I plan a bit more benching on the weekend after iterative testing here and there.
> 
> FYI, with water-cooling, I basically had to throw out all my previous MSI AB profiles I gathered the week before with air-cooling, and I have yet to go for a custom voltage curve (if I even remember how to do them... 🥴)


I've got voltage, power, and temp limits maxed, GPU fans on max, the 600W BIOS, with a fan and copper on the backplate. A curve at 1.1V with +265 core and VRAM at +1625. Currently stability testing higher clocks.
ReBAR on, hardware-accelerated GPU scheduling on.

Driver installed with minimum extras (core and PhysX). GPU scaling disabled.

Windows services debloated. Timer resolution minimized. MSI mode with high priority for the GPU. Custom power plan. PBO and RAM tuned for lowest latency. PCIe bandwidth over 26 GB/s.

Any other tricks I'm missing for 3DMark?


----------



## Baka_boy

dr/owned said:


> Phanteks made a standoff oopsie. I didn't see it until I was measuring standoff heights and it was 1mm off. Fortunately they didn't cross thread it so it went in correctly when I put it on square.
> 
> View attachment 2583305


Mine was missing a standoff. Unfortunately, I only realized after putting on the block, so I had to remove it again and put it back on with a cannibalized standoff from an EK block. 😪


----------



## motivman

I ended up ordering the EKWB block for my FE. So the general consensus is to use the supplied thermal pads, correct? Anyone with the EKWB block and an FE: what is your water-to-core delta T running PR?


----------



## J7SC

N19htmare666 said:


> I've got voltage, power and temps, GPU fans on max. 600w bios with a fan and copper on the backplate. A curve on 1.1v at +265 and vram at +1625. Currently stability testing higher clocks.
> Rebar on, hardware GPU scheduling acceleration on.
> 
> Driver installed with minimum extras (core and physX). GPU scaling disabled.
> 
> Windows services debloated. Timer resolution minimised. MSI mode high priority for GPU. Custom power plan. Pbo and ram tuned for lowest latency. PCIe bandwidth over 26gb.
> 
> Any other tricks that I'm missing for 3d mark?


...I haven't done any of those things yet, but generally speaking, max your CPU and RAM speed / tight timings and 'get your ambient temps low and you're good to go' (hey that rhymed )


----------



## N19htmare666

J7SC said:


> ...I haven't done any of those things yet, but generally speaking, max your CPU and RAM speed / tight timings and 'get your ambient temps low and you're good to go' (hey that rhymed )


All core or pbo for PR?


----------



## wtf_apples

NVIDIA GeForce RTX 4090 video card benchmark result - Intel Core i9-12900KF Processor,EVGA Corp. Z690 DARK KINGPIN (3dmark.com) 

Ran it a few times and waiting to see how they fix the memory bug. Haven't seen any artifacting myself.
Fun to be tinkering again.


----------



## chispy

Can someone please share the 4090 MSI Suprim Liquid X 600W BIOS with me? I finally found and bought a 4090 MSI Gaming X Trio at MSRP locally, at a brick-and-mortar store in my town. I've already ordered the EK water block and the new CableMod 600W adapter. I would really appreciate it, thanks.


----------



## J7SC

N19htmare666 said:


> All core or pbo for PR?


PBO !


----------



## wtf_apples

KedarWolf said:


>


Thank you 😀


----------



## motivman

chispy said:


> Can someone please share the 4090 MSI Suprim liquid X 600w bios with me please. I finally found and bought one 4090 msi gaming x trio at msrp locally on a brick and mortar store in my town. I have ordered the EK water block already and some cablemod new 600w adapter. I will really appreciate it ,thanks.


rename to .rom


----------



## chispy

motivman said:


> rename to .rom


Thank you very much, I really appreciate it


----------



## KingEngineRevUp

yzonker said:


> Reality is that's it's zero performance gain anyway. More of just knowing you have everything you can get I think. I do like having 2 pumps for redundancy though.


I got the second pump to test. Scored it for $100. 










Not sure if it made a big difference, at 530W max power draw, delta was 24C










I did a 370W test because I think you and someone else did one? The delta was 16.5C










It's not fantastic, but it's not bad either. But for the price, yeah that's bad.
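A quick sanity check on the two deltas above, assuming the block behaves like a fixed thermal resistance. This is a minimal sketch; the wattage/delta pairs are the ones measured in this post, nothing official:

```python
# Coolant-to-die deltas from the post above: (board power in W, delta in C).
measurements = [(530, 24.0), (370, 16.5)]

for watts, delta in measurements:
    r = delta / watts  # effective thermal resistance, C per W
    print(f"{watts} W -> {r:.4f} C/W")
```

Both points land around 0.045 C/W, which is why the results read as roughly linear with power: the delta at some other power draw can be estimated as watts × 0.045.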


----------



## morph.

Silent Scone said:


> Yeah I didn't block Ampere, but I've heard a lot of people had a similar experience. Makes me wonder when it comes to EK.


I can confirm my hotspot temp on the OEM Strix cooler is around a 10°C difference from GPU temp under game loads. Max at one point was 70°C, with ambient around 25°C.


----------



## KingEngineRevUp

motivman said:


> I ended up ordering the EKWB block for my FE. So general consensus is to use the supplied thermal pads correct? anyone with the EKWB block and an FE, what is your water to core temp delta T running PR?


Well, you can read the post I just made. There are two data points there, and the results are pretty linear with regard to power load. 

Just remember, the 4090 is 61% more effort to cool over a 3090 because it has a higher die heat density.


----------



## mickyc357

What's the general consensus on the Gaming OC vs the MSI Trio X if available at the same price? The Gaming OC has a better power limit, but how have OC results compared between them?


----------



## ESRCJ

Given the terrible results I'm seeing here for the EK blocks, I should probably just send mine back to EK and take the hit for the restocking fee (mine is still unopened). It seems like almost zero improvement over air cooling. A 20C+ fluid-die delta is doorstop territory.


----------



## KingEngineRevUp

ESRCJ said:


> Given the terrible results I'm seeing here for the EK blocks, I should probably just send mine back to EK and take the hit for the restocking fee (mine is still unopened). It seems like almost zero improvement over air cooling. A 20C+ fluid-die delta is doorstop territory.



Yeah, Optimus isn't faring too great either. Theirs looks like 19C at 520W. Alphacool, from der8auer's test, is like 20C at 400W.

Water-cooling this generation is definitely for aesthetics or noise control. It'll perform just a little better than an air cooler running at 100% fan speed.

Edit: I wouldn't say the results are terrible; I would phrase it differently. The air coolers are great. These are the biggest, most fin-dense air coolers for GPUs we have ever seen.


----------



## ESRCJ

KingEngineRevUp said:


> Yeah, Optimus isn't fairing too great either. Theirs looks like 19C.
> 
> Water-cooling this generation is definitely for aesthetics or noise control. It'll perform just a little better than an air cooler running at 100% fan speed.


Such a shame. With Turing my water-die delta was 8-9C at 400W. I got stuck with a Hydro Copper block for my 3090 KP, which was the worst waterblock I've ever owned. This hobby isn't as fun when going custom gets you almost nothing. Seeing a 2080 Ti drawing 400W while sitting in the low 30s was incredibly rewarding.


----------



## KingEngineRevUp

ESRCJ said:


> Such a shame. With Turing my water-die delta was 8-9C at 400W. I got stuck with a Hydro Copper block for my 3090 KP, which was the worst waterblock I've ever owned. This hobby isn't as fun when going custom gets you almost nothing. Seeing a 2080 Ti drawing 400W while sitting in the low 30s was incredibly rewarding.


I edited my post. I don't think anything is wrong with the water-cooling blocks; they're actually doing a lot of work if you think about it.

I think the better way to phrase it is that the air coolers are just really good this generation.

This is one of the highest heat densities on a die that we know of. Approximately 87% of its board power draw turns into die heat vs. 54% for the 3090.

Running a 4090 at 520W is like running a 3090 at 837W.
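The 520W-to-837W equivalence works out directly from the two fractions quoted above. A minimal sketch of the arithmetic; the 87% and 54% figures are this post's estimates, not an official spec:

```python
# Fraction of board power that ends up as die heat (figures from the post).
DIE_FRACTION_4090 = 0.87
DIE_FRACTION_3090 = 0.54

def equivalent_3090_board_power(watts_4090: float) -> float:
    """Board power at which a 3090's die would dissipate the same heat."""
    return watts_4090 * DIE_FRACTION_4090 / DIE_FRACTION_3090

print(round(equivalent_3090_board_power(520)))  # matches the ~837 W claim
```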


----------



## Azazil1190

NVIDIA has a fix for GeForce RTX 4090 and RTX 4080 blank screen issues - VideoCardz.com


New firmware for GeForce RTX 4090/4080 to fix blank screen during boot NVIDIA releases an updated Firmware Update Tool that should resolve an issue reported by some reviewers. Some RTX 4090/4080 cards may show blank screens during system boot. This issue has now been identified and fixed by...




videocardz.com


----------



## Spit051261

Bit late for me.
Just returned my card.
Oh well.
Handy to know though.
Saying that, mine didn't boot at all.
It did 53, 54, 7F in a continuous loop.
That was a Zotac Extreme.


----------



## th3illusiveman

Currently running 2700 MHz @ 0.95V undervolt to see if that can keep room temps down. Power never exceeds 330W in games. This thing heats up a room quickly at stock.


----------



## J7SC

ESRCJ said:


> Such a shame. With Turing my water-die delta was 8-9C at 400W. I got stuck with a Hydro Copper block for my 3090 KP, which was the worst waterblock I've ever owned. This hobby isn't as fun when going custom gets you almost nothing. Seeing a 2080 Ti drawing 400W while sitting in the low 30s was incredibly rewarding.





KingEngineRevUp said:


> I edited my post. I don't think nothing is wrong with the water cooling blocks, they're actually doing a lot of work of you think about it.
> 
> I think the better way to phrase things is that the air coolers are just really good this generation.
> 
> This is one of the highest heat densities on a die that we know of. Approximately 87% of it's board power draw turns into die heat vs. 54% for the 3090.
> 
> Running a 4090 at 520W is like running a 3090 at 837W.


The ever-smaller nodes are harder to cool for sure because even with extensive water-cooling, you are talking about 76.3 billion transistors in a die w/ 608.4 mm² for the RTX 4090, compared to 28.3 billion transistors in a die w/ 628 mm² for the RTX 3090 - ...never mind the RTX 2080 Ti with 18.6 billion transistors in a die w/ 754 mm²...

My 2x 2080 Ti NVL/SLI with a combined 760 W for the two were (are) easier to water-cool for sure, yet water-cooling my 4090 makes a huge difference, per info in the table below. Under light 3D loads even on air, it can clock past 3200 MHz, but any serious type of 3D load heated that thing up beyond what I was comfortable with on air. Water-cooling makes a huge difference, and now I can sustain higher clocks even at heavier 3D loads AND run VRAM at its most efficient at the same time.

All that said, the factory air-coolers for both my 3090 Strix and 4090 Giga-G-OC surprised me re. their efficiency (in the short time each was air-cooled ) - but air still won't be as good as water to transport heat away from a densely packed die on a small node...besides, I run 2x ATX mobos side-by-side in a single TT Core P8 case, and factory air-cooled models for this and even the previous gen are just too, well, bloody big to fit  since I haven't mastered the art of 'Tardis' yet.
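For reference, the densities implied by those figures. A minimal sketch; transistor counts and die areas are as quoted above:

```python
# Transistor counts and die areas as quoted in the post above.
cards = {
    "RTX 4090": (76.3e9, 608.4),
    "RTX 3090": (28.3e9, 628.0),
    "RTX 2080 Ti": (18.6e9, 754.0),
}

for name, (transistors, area_mm2) in cards.items():
    density = transistors / area_mm2 / 1e6  # millions of transistors per mm^2
    print(f"{name}: {density:.1f} M/mm^2")
```

Roughly 125 M/mm² for the 4090 vs 45 M/mm² for the 3090: nearly 2.8x the density in a slightly smaller die, which is the crux of the cooling argument.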


----------



## Baka_boy

Azazil1190 said:


> NVIDIA has a fix for GeForce RTX 4090 and RTX 4080 blank screen issues - VideoCardz.com
> 
> 
> New firmware for GeForce RTX 4090/4080 to fix blank screen during boot NVIDIA releases an updated Firmware Update Tool that should resolve an issue reported by some reviewers. Some RTX 4090/4080 cards may show blank screens during system boot. This issue has now been identified and fixed by...
> 
> 
> 
> 
> videocardz.com


I actually have the same problem with an ASUS 4090. It just goes blank on restart, so I can't see the BIOS when the system enters it. It only works if you shut down first.


----------



## newls1

J7SC said:


> The ever-smaller nodes are harder to cool for sure because even with extensive water-cooling, you are talking about 76.3 billion transistors in a die w/ 608.4 mm² for the RTX 4090, compared to 28.3 billion transistors in a die w/ 628 mm² for the RTX 3090 - ...never mind the RTX 2080 Ti with 18.6 billion transistors in a die w/ 754 mm²...
> 
> My 2x 2080 Ti NVL/SLI with a combined 760 W for the two were (are) easier to water-cool for sure, yet water-cooling my 4090 makes a huge difference, per info in the table below. Under light 3D loads even on air, it can clock past 3200 MHz, but any serious type of 3D load heated that thing up beyond what I was comfortable with on air. Water-cooling makes a huge difference, and now I can sustain higher clocks even at heavier 3D loads AND run VRAM at its most efficient at the same time.
> 
> All that said, the factory air-coolers for both my 3090 Strix and 4090 Giga-G-OC surprised me re. their efficiency (in the short time each was air-cooled ) - but air still won't be as good as water to transport heat away from a densely packed die on a small node...besides, I run 2x ATX mobos side-by-side in a single TT Core P8 case, and factory air-cooled models for this and even the previous gen are just too, well, bloody big to fit  since I haven't mastered the art of 'Tardis' yet.
> View attachment 2583362


What are your OC settings.... just wondering. 3100+ is really neat to see


----------



## Sheyster

mickyc357 said:


> What's the general consensus on the gaming OC vs MSI trio x if available at the same price? Gaming OC has better PL but how have OC results been between them ?


Generally speaking, the 4090 Gaming OC has fared well. Most of them OC core and memory decently, but it's still a lottery, so YMMV.

This said, I think most people here are buying whatever they can get for MSRP. You can always flash a 600w BIOS to the Trio X.


----------



## schoolofmonkey

Baka_boy said:


> I actually have the same problem with an ASUS 4090. Just goes blank on restart so I can't see the bios when the system enters it. It only works if you shutdown first.


I get it too, but it's not my card, it's the Samsung G7: turn it off and on and the screen appears. It doesn't happen if I make the Predator my main monitor.


----------



## newls1

Temps while gaming with my newly installed waterblock. I'm satisfied! Really liking the memory temps. Leaves a lot more room to play with. I can probably increase my +1350 mem and +230 core offsets in AB now.


----------



## dk_mic

newls1 said:


> temps while gaming with my newly installed waterblock. Im satisfied! Really liking the memory temps. Leaves alot more room to play with. Prob can increase my +1350m and +230c offsets in AB now.
> 
> View attachment 2583380


14C ambient or what 🥶


----------



## newls1

dk_mic said:


> 14C ambient or what 🥶


It was about 55°F in the room!


----------



## Frosted racquet

newls1 said:


> was about 55f in the room!


So 13°C


----------



## dante`afk

"Bench" temps, not 24/7 temps, unless you like freezing all day?


----------



## dr/owned

I'm pretty sure Phanteks isn't putting thick enough thermal pads on the VRMs. They use 0.6mm pads everywhere (a single precut sheet), but by my math that's only good for the VRAM. The VRMs are 0.75mm tall, and they left a 1.5mm void between the PCB and the waterblock. 1.5mm - 0.75mm = 0.75mm remaining, so you need a 1.0mm pad to fill that; 0.6mm won't even touch it.

EDIT: disregard this, they included a separate sheet of 1.0mm thick pads for the VRM that I forgot about.
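Even with the worry above turning out to be moot, the pad-sizing arithmetic generalizes. A minimal sketch using the numbers from this post; the 40% limit is the common rule-of-thumb compression ceiling for thermal pads, not a Phanteks spec:

```python
# Pad sizing from the numbers in the post: the pad must be thicker than
# the gap it bridges, but compressed by no more than ~40% (rule of thumb).
void_pcb_to_block = 1.5   # mm, PCB to waterblock
component_height = 0.75   # mm, VRM height
gap = void_pcb_to_block - component_height  # 0.75 mm left to fill

def pad_ok(pad_thickness, gap, max_compression=0.40):
    """Pad must reach the component without being over-compressed."""
    if pad_thickness <= gap:
        return False  # pad never touches the component
    compression = (pad_thickness - gap) / pad_thickness
    return compression <= max_compression

print(pad_ok(0.6, gap))  # False: 0.6 mm pad can't bridge a 0.75 mm gap
print(pad_ok(1.0, gap))  # True: compressed 25%, within the 40% limit
```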


----------



## newls1

dr/owned said:


> I'm pretty sure Phanteks isn't putting thick enough thermal pads on the VRM's. They use 0.6mm pads everywhere (single precut sheet) but by my math that's only good for the VRAM. VRM = 0.75mm tall, they left a void between the PCB and the waterblock of 1.5mm. 1.5mm - 0.75mm = 0.75mm so you need a 1.0mm pad to fill that. 0.6mm won't even touch it.


Interesting…. Is there a temp read out for vrm temps in hwinfo by chance?


----------



## dr/owned

newls1 said:


> Interesting…. Is there a temp read out for vrm temps in hwinfo by chance?


There's no temperature report for the VRM. This was just me taking obsessive measurements of everything and knowing things like the VRMs being 0.75mm tall and the VRAM 1.0mm tall.










I know the thermal pads they give are 0.6mm because the total thickness is 0.8mm with the plastic sheets on both sides, and when I measure the plastic sheets separately they're 0.2mm combined. It's also really aggressive trying to compress a 0.6mm thermal pad into a 0.3mm void for the VRAM; typically you want 40% compression at most.


----------



## newls1

I hope my VRMs are touching the pads, because that would be some BS, spending this amount of money and having them drop the ball on something so basic. I should have checked contact before screwing everything down, right… damn it. I'll be honest here, I'm not tearing down this loop right now to check either, she's running so good!


----------



## PhuCCo

KingEngineRevUp said:


> I got the second pump to test. Scored it for $100.
> 
> View attachment 2583338
> 
> 
> Not sure if it made a big difference, at 530W max power draw, delta was 24C
> 
> View attachment 2583339
> 
> 
> I did a 370W test because I think you and someone else did one? The delta was 16.5C
> 
> View attachment 2583340
> 
> 
> It's not fantastic, but it's not bad either. But for the price, yeah that's bad.


Thank you for posting your results. They match what I was getting with my FE block, so I think it is safe to say that's as good as the block will get. The performance isn't that bad now that we've seen what the other blocks can do, but the pricing really put me off when I could at worst match the performance with a Bykski. The EK block does look really nice imo, especially the way the backplate wraps around the PCB. Looks much better than the Bykski when vertically mounted.

Interesting that your captive backplate spacers were so loose, as mine were pretty tight around the screws and never tried to come off.


----------



## Panchovix

Finally got my Moddiy cable. 600W works with just 3x8 pins instead of 4x8 on my TUF, and the red LED no longer comes on when I shut down/suspend my PC.

When I get a chance I'll see if the new cable "helps" with OC. I doubt it, but hey, better to try than to do nothing lol.


----------



## yzonker

newls1 said:


> I hope my vrm are touching the pads, cause that would be some BS spending this amount of money and have them drop the ball on something so basic to figure out. I should have checked contact before screwing everything down right…. Damn it. I’ll be honest here, I’m not tearing down this loop right now to check either, she’s running so good!


For 30 series the VRM temp was included in the hotspot temp. Quite possible that 40 series is the same, but I haven't seen anything to prove that.


----------



## J7SC

newls1 said:


> What are your OC settings.... just wondering. 3100+ is really neat to see


...just MHz sliders with good ol' MSI AB (no curve) and leaving PL at stock. I have a screenie of that setting I'll post later as it is on the 'play' machine which I leave off during work time.



dr/owned said:


> No temperature report on the VRM. This was just me taking my autistic measurements of everything and knowing things like the VRM are 0.75mm tall, VRAM is 1.0mm tall.
> 
> View attachment 2583406
> 
> 
> I know the thermal pads they give are 0.6mm because the total thickness is 0.8mm with the plastic sheets on both sides, and when I measure the plastic sheets separately they're 0.2mm. It's also really aggressive trying to compress 0.6mm thermal pads into a 0.3mm void for the VRAM. Typically you want max 40% compression.


...this time around, more than one water-block supplier seems to have issues with their sourced pads. Mine didn't quite match either (not to mention that they weren't all the same thickness even though they are supposed to be). Didn't much matter because I used thermal putty for the VRAM and for the VRM caps, I added a bit of thermal paste on top and can see it squeeze out just a bit in a pattern that matches the actual cap position underneath.

Per my last post, water-cooling helped hugely (perhaps my card wasn't assembled as well as it should have been, re: factory pad placement, TIM, etc.). Once heat-cycled a few times, one of the harder tests IMO is 3DM TimeSpy Extreme: there is one spot where temps shoot up by 4-5 C according to the 3DM graphs, and that 'hump of the camel' matches the fps jumps... unfortunately, it usually downclocks by a bin right after, even with water-cooling.



yzonker said:


> For 30 series the VRM temp was included in the hotspot temp. Quite possible that 40 series is the same, but I haven't seen anything to prove that.


...at least for the 3090 Strix, VRM temps (2x) were included but I never saw them get out of the 30 C range when water-cooled no matter what...still, I would like to see those on the 4090 Giga somehow.


----------



## dr/owned

Correcting myself: when I woke up I had forgotten that they include a separate 1.0mm sheet of thicker pads for the front-side VRM and a couple of other components. *So the Phanteks block is OK again.*

Grrrr where the coffee at.


----------



## motivman

I ended up cancelling my EKWB waterblock order for my FE... between the high delta temps on the block, and how good the FE looks and how cool it runs, I might just go air cooling this gen. Also waiting to see what Heatkiller has up their sleeve, though that might be a while...






HEATKILLER V PRO for RTX 4090 FOUNDERS EDITION, 1,00 € (Coming Soon) - shop.watercool.de


----------



## KingEngineRevUp

PhuCCo said:


> Thank you for posting your results. They match what I was getting with my FE block, so I think it is safe to say that's as good as the block will get. The performance isn't that bad now that we've seen what the other blocks can do, but yeah the pricing really put me off when I could at worst match the performance with a Bykski. The EK block does look really nice imo, especially the way that the backplate wraps around the pcb. Looks much better than the Bykski when vertically mounted.
> 
> Interesting that your captive backplate spacers were so loose, as mine were pretty tight around the screws and never tried to come off.


The price is the issue. I guess the mystery bag made up for some of it, because I actually needed a new mousepad.

EKWB is like Corsair fans: the performance is mediocre to good, but you're paying for the aesthetics of the block, not necessarily the performance. One thing that concerns me with Bykski blocks is the IO bracket. They don't give you a new one, so on the FE you'd keep the big 3-slot bracket.

But for certain, blocking this generation is really just about reducing the size of the card and the noise. There's hardly any boost to gain, maybe 2 bins at most vs. a card benchmarking at 100% fan speed.

PR runs at 44C, whisper quiet. I'm pleased with that.


----------



## newls1

yzonker said:


> For 30 series the VRM temp was included in the hotspot temp. Quite possible that 40 series is the same, but I haven't seen anything to prove that.


If that's the case then I'm perfect... wish there was a way to confirm this.


----------



## newls1

dr/owned said:


> Correcting myself: when I woke up I forgot they included thicker pads as a separate sheet of 1.0mm for the front side VRM and a couple other components. *So the Phanteks block is OK again.*
> 
> Grrrr where the coffee at.


Yeah, they are dark gray in color whereas the others are bluish...

EDIT*** Here is a pic of what I used... the #6 spots use the thick gray pads.


----------



## Netherwind

I'll try another question, hoping I'll have better luck this time. Has anyone with a Gigabyte 4090 Gaming OC tried flashing the F2 BIOS? I tried but it didn't work. According to the readme, N4090GOA.f2 goes with the OC BIOS, but when I try to launch the program, the error message says "The BIOS mode does not match". However, N4090GOL.f2, which goes with the Silent BIOS, works (I haven't flashed it of course, since my card is set to the OC BIOS).

Does the readme have a typo, or are the BIOSes on the Gigabyte website mixed up?


----------



## Fr0stkeule

Does anyone have the 4090 GS BIOS? I get `phantomgs.rom.PDF I/O ERROR: Cannot open file: phantomgs.rom`.
Where am I making the mistake with --protectoff? I want to flash the silent BIOS.


----------



## newls1

fluid is finally all bled and shes performing wonderfully so far. I think all GPU's should be waterblocked!


----------



## dr/owned

Netherwind said:


> I'll try another question, hoping I'll have better this time. Anyone with a Gigabyte 4090 Gaming OC tried flashing to F2 BIOS? I tried myself but it didn't work. According to the readme, N4090GOA.f2 goes with the OC BIOS but when I try to launch the program, the error message says "The BIOS mode does not match". However, if I try N4090GOL.f2 which goes with the Silent BIOS, it works (I haven't done it yet of course since my card is set to OC BIOS).
> 
> Does the Readme file have a typo or are the BIOSes from Gigabyte website mixed up?


You got it backwards I think
L is OC
A is Silent:


This is 4090 OC Bios F2.


----------



## mirkendargen

dr/owned said:


> You got it backwards I think
> L is OC
> A is Silent:
> 
> View attachment 2583455
> 
> 
> This is 4090 OC Bios F2.


This, and I noticed no performance difference. I tried that Nvidia update tool today for the black screen issue, thinking it might also help with my monitor not waking from standby on anything but the launch driver. When I ran it, it said my BIOS was already updated, so I think that's all these BIOSes are.


----------



## zkareemz

*can anyone share Colorful iGame GeForce RTX 4090 Neptune OC-V bios?*


----------



## SilenMar

Let me help you dodge a bullet: don't buy the Gigabyte 4090 Xtreme Waterforce. It is made like crap.

I bought two 4090s; the other one is a Strix. Everything else was sold out. Both have been in use for two weeks. Here is my impression:

1. This version of the Gigabyte 4090 is built like cheap plastic.
(1) The shroud looks cheap.
(2) The copper plate has machining scratches, and it doesn't cover components such as the memory VRM the way the air-cooled card does.
(3) *Part of the north core VRM is only one-third covered. It looks like Gigabyte flipped that part of the PCB layout but forgot to tell whoever machined the copper plate. Components that are less than half covered like this run hotter and have a higher chance of degrading over time, to the point of burning out.*
(4) One of the thermal pads is off-center.
(5) The 360mm radiator is made of light aluminum instead of copper.
(6) The fans are not new; they seem to be refurbished from the previous 3090 Ti. I can still see debris left on the blades from being washed with electronic cleaning fluid.



2. It's not exactly quiet when the fans go to 1300rpm. The fans max out at 3000rpm, and they are only somewhat quiet at 900rpm, which is 30% of max. They also spin at 30% all the time and never stop. In comparison, the ASUS Strix stops its fans at idle, and it is quieter at the same load.

3. *This is critical: the stock Gigabyte fans spike to max like a jet every now and then under load.* Something triggers the annoying behavior, and the BIOS update didn't fix it. When you control the fans through software like MSI Afterburner, they spin at double the set speed. Multiple people have the same issue.

4. The power is limited to 500W; after the BIOS update, it is limited even further, down to 490W. No overclocking potential. Gigabyte's official website says it is the OC version, but the box makes no mention of OC. It is just a normal PCB despite the premium price.

5. The temperature easily reaches 58-63C at 490W. Any more than that and the fans spin louder.

6. *The likely culprit for the fans hitting max is a sudden temperature rise.* Some components, such as the partially covered VRM or the unevenly pasted memory, are not properly cooled; the memory, for example, can hit the 100C threshold repeatedly. No BIOS update can fix high temperatures like this.


7. The thermal pads on this GPU are also a cheap adaptation forced by an incompetent design:

(1) Though the original pads have good thermal conductivity, they are brittle and deform easily enough that you might need to replace them after every disassembly.

(2) The distance between VRM chips is 2.78mm, while the copper-plate clearance for each chip type is 3.00mm. The main power-delivery chips get a 1.5mm thermal pad, so the side chips require a 1.28mm (1.5mm + 2.78mm - 3.0mm) pad.



(3) *1.28mm is not a standard thermal pad thickness. This leaves a lot of room for variation and for overheating components. The pad is usually thinnest at the end, and the north VRM at the end is only partially covered; this is why a large portion of burned chips sit at the edge of a PCB. A lot can go wrong with a design like this. Who the hell thought this was a good idea?*

(4) The pad that comes off with the plate measures 1.20mm after disassembly, short of making contact. Using a 1.5mm thermal pad instead also puts less pressure on the memory and core.

(5) So you have to compress a 15W/mK 1.5mm thermal pad down to 1.28mm. You don't buy a premium card just to disassemble it multiple times to deal with hassles like this.
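The side-chip pad thickness in item (2) can be double-checked with the same arithmetic. All the dimensions below are the poster's own measurements, not official Gigabyte figures:

```python
# Reconstruction of the pad math from item (2) above.
main_pad = 1.5         # mm, pad thickness on the main power-delivery chips
chip_spacing = 2.78    # mm, "distance between VRM chips" per the post
plate_clearance = 3.00 # mm, copper-plate clearance for each chip type

# The side chips sit 0.22 mm further from the plate, so they need a
# correspondingly different pad: 1.5 + 2.78 - 3.00 = 1.28 mm.
side_pad = main_pad + chip_spacing - plate_clearance
print(f"Required side-chip pad: {side_pad:.2f} mm")  # a non-standard thickness
```

The result, 1.28mm, is exactly the oddball thickness complained about in item (3): no vendor sells pads in that size, which is why the poster ends up crushing a 1.5mm pad instead.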



8.* Overall, the cooling capacity of this cheaply made copper plate is limited. It is engineered so that the plate doesn't even cover all the VRMs, and its heat dissipation is under 500W. It also uniquely requires a 1.28mm thermal pad. Gigabyte's engineers didn't do a proper thermal calculation; they did an ill one, making the core temperature look good at the risk of burning components elsewhere. The combined result is fans that hit like a jet. It is also why this so-called water-cooled GPU, with inadequate cooling in disguise, has its power limit cut to 500W instead of the 600W of the air-cooled card.*

=======================

The conclusion:

The value of this card is bad. It is not comparable to the ASUS Strix in almost anything: build quality, noise level, OC performance. The Strix is built like a piece of art in comparison. A premium product is all about a user experience with as little annoyance as possible. The Gigabyte AIO 4090 delivers a crap-tier user experience, not even mentioning the long-run risk of burning components that hit 100C. Gigabyte didn't test everything.

*So don't buy this GPU. A thin, loose, cheaply machined piece of scratched copper won't cut it when other components need to be cooled properly.*


----------



## Netherwind

dr/owned said:


> You got it backwards I think
> L is OC
> A is Silent:
> 
> View attachment 2583455
> 
> 
> This is 4090 OC Bios F2.


Thank you!



mirkendargen said:


> This, and I noticed no performance difference. I tried that Nvidia update tool today for the black screen issue thinking it might also help with my issue with my monitor not waking up from standby in anything but the launch driver, and when I ran it it said my BIOS is already updated, so I think that's all these BIOSes are.


I was hoping they had adjusted the awful fan curve. The fans go from 0% straight to 61/66% once you hit 50C. They should ramp up gradually instead, if you ask me.
(Don't mind the RPM in the picture, I set a manual speed.)


----------



## mirkendargen

Netherwind said:


> Thank you!
> 
> 
> I was hoping they adjusted the awful fan curve. Fans go from 0% to 61/66% once you hit 50C. Should ramp up a bit instead if you ask me.
> (don't mind the RPM in the picture, I set a manual speed)


Oh before I was on water I just did a custom curve in AB.


----------



## KingEngineRevUp

SilenMar said:


> Gigabyte 4090 Extreme Waterforce


Wow, thanks for letting us know. What are the temperatures if you do a Port Royal stress test to loop 10 or beyond? That should heat-soak the water.


----------



## SilenMar

KingEngineRevUp said:


> Wow thanks for letting us know. What are the temperatures if you do a Port Royal Stress test up to loop 10 or above? That should stagnate the water.


The GPU core temperature ranges from 58C to 62C. The hot spot temperature is about 10C higher than the core, from 68C to 74C.
But the memory junction temperature is usually at 80C to 88+C. Some areas are not properly cooled, with only a loose layer of copper over them. Once the temperature there hits 88C, the fans spin at max speed, 2500-3000rpm, which wakes me up in the middle of the night.
Somebody needs to stop thinking a 4-year warranty on a cheaply made card is a good deal. It is not. It will keep bothering you with things like this.


----------



## KingEngineRevUp

SilenMar said:


> The GPU core temperature ranges from 58C to 62C. The Hot Sport Temperature is 10C higher than GPU core from 68C to 74C.
> But the Memory Junction Temperature is usually at 80C to 88+C. Some areas are not properly cooled with only a layer of loose cooper. Once the temperature there hits 88C, the fan will spin at the max speed at 2500-3000rpm which wakes me up in the middle of the night.
> Somebody needs to stop thinking a 4-year warranty with a cheap made card is a good deal. It is not. It will bother you with things like these.


Can you disassemble the card and torque the screws around the memory down more? I would also check the thermal pads; there's a chance they're only touching half of each memory chip, like someone assembled it carelessly.


----------



## newls1

KingEngineRevUp said:


> Can you disassemble the card and torque the screws down around the memory more? I would actually check the thermal pads. there's a chance the thermal pads might only be touching half of the memory, like someone assembled it carelessly.


Something is certainly wrong there. No way in hell should temps be that high.


----------



## SilenMar

KingEngineRevUp said:


> Can you disassemble the card and torque the screws down around the memory more? I would actually check the thermal pads. there's a chance the thermal pads might only be touching half of the memory, like someone assembled it carelessly.


I did open it up and torque all nine screws, plus reapplied the thermal paste. The memory junction temperature is still around 80C at 490W, and it will reach 88C regardless, because the surface contact just isn't optimal with this cheap AIO plate. The plate is cheaply CNCed.


----------



## KingEngineRevUp

newls1 said:


> something is certainly wrong there. no way in hell temps should be that high.


Yeah this kind of **** happens, be warned. It's very graphic... Well for a computer geek it is. Gamers discover missing or misaligned thermal pads on new graphics cards - VideoCardz.com


----------



## KingEngineRevUp

SilenMar said:


> I did open it up and torque all the 9 screws plus reapplying thermal paste. It still has the junction memory temperature around 80C at 490W. The temperature will reach 88C regardless because the surface contact is not optimal with a cheap AIO plate. The plate is just CNCed rather cheap.


That's interesting; the MSI Suprim X has a very similar cold-plate design, but its memory stays relatively cool thanks to the blower fan it has.

I wonder why Gigabyte skipped the blower.


----------



## SilenMar

newls1 said:


> something is certainly wrong there. no way in hell temps should be that high.


Have you ever seen the memory junction temps on the 30 series? They reached 100C as well, but it is worse on the 4090 Waterforce.
How much heat can you dissipate under a layer of copper without airflow?


----------



## KingEngineRevUp

SilenMar said:


> Have you ever seen the junction memory on the 30 series? It reached 100C as well. But this is worse on 4090 Waterforce.
> How much heat you can dissipate under a layer of copper without airflow?


What are you thinking of doing? Returning it and getting another AIB? Or maybe water block it yourself?


----------



## SilenMar

KingEngineRevUp said:


> I wonder why gigabyte skipped the blower.


It would cost more and make less profit.



KingEngineRevUp said:


> What are you thinking of doing? Returning it and getting another AIB? Or maybe water block it yourself?


A BIOS update might make the fans ignore the memory junction temperature so they won't spin to max, or I could just plug the fans into a different control board.

But the high temperatures would still be there.

I'm going to flash a 600W BIOS and let that area hit 100C under more load; I'm fairly sure the components will blow up within half a year.
By then I'll already have a 4090 Ti. I'll RMA this 4090 for a new one, then sell it immediately.


----------



## yzonker

SilenMar said:


> Let me help you dodge a bullet.
> 
> *Don't buy Gigabyte 4090 Extreme Waterforce *
> 
> This is one of the cards I can buy. Another one is Strix. Others are all sold out. They have been used for two weeks. Here is my impression.
> 
> 1. Gigabyte is built like plastic. The shroud looks cheap. The density of PCB is rather loose. The 360mm radiator is in fact made of light aluminum, not copper. The card looks rather cheap in comparison. The fans are not new. They seem to be refurbished from previous 3090Ti. I can still see the debris left on the blades washed by electronic cleaning fluid.
> 
> 2. It's not exactly quiet when the fans go to 1300rpm. The fans are 3000rpm. They are only slightly quiet at 900rpm which is 30% of the max speed. And the fans keep spinning at 30% all the time. They don't stop. In comparison, ASUS Strix let the fan stop when idle. Strix is also quieter when at the same load.
> 
> 3. The default fans of Gigabyte hit to max like a jet every now and then under load. Something triggered the annoying behavior. BIOS update didn't fix it. When you control the fans through the software like MSI afterburner. The fans actually go to the double speed of the set parameter.
> 
> 4. The power is limited to 500W. After the BIOS update, the power is limited even down to 490W. No overclocking protentional. The website of the official Gigabyte says it is OC version. But the package of the box has no mention of the OC version. I guess it is just a normal PCB despite the premium price.
> 
> 5. The temperature reaches to 58c-63C easily at 490W. Anymore than that the fan will spin louder.
> 
> The conclusion:
> 
> The value of this card is bad. It is not comparable to the ASUS Strix in terms of almost everything such as build quality, noise level, OC performance.
> 
> Strix is built like a piece of art in comparison. The premium product is all about the user experience with as little annoyance as possible. The Gigabyte AIO 4090 has the bottom-tier user experience. Gigabyte didn't test everything.
> 
> Edit:
> 6. Ironically, the culprit of the fans hitting the max is due to some components in certain area are not properly cooled. They can heat up 100C threshold multiple times. No more BIOS updates can fix high temperature like this.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *So don't buy this GPU. A thin piece of loose bare copper won't cut it when other components need to be cooled down properly.*


That's odd. You're showing 46C on the core at the same time as the mem hits 100C. Are you mining? 

If not then I wonder if the card is defective and the mem reading is incorrect. Does the memory temp erratically jump up to 100C? Or steadily increase under load?


----------



## KingEngineRevUp

SilenMar said:


> It will cost more, makes less profit.
> 
> 
> A BIOS update might let the fan ignore the junction memory temperature so the fans won't spin to the max. Or just replace/plug the fan into a different control board.
> 
> But The high temperature is still there.
> 
> I'm going to flash a 600W BIOS. Let that area hit up to 100C with more load. I'm rather sure the components will likely blow up within half a year.
> By then, I already have a 4090Ti. I RMA this 4090 with a new one then sell it immediately after that.


It's a $1900 card; the Suprim X is $1750. They're making a profit, they're just being greedy.


----------



## SilenMar

yzonker said:


> That's odd. You're showing 46C on the core at the same time as the mem hits 100C. Are you mining?
> 
> If not then I wonder if the card is defective and the mem reading is incorrect. Does the memory temp erratically jump up to 100C? Or steadily increase under load?


The memory junction temperature at just 450W is still 80C, even with the copper plate properly seated and assuming every memory chip makes uniform contact with it.








There is a thermal throttle at 110C on the internal memory temperature. By the time you hit it, it is probably too late.


----------



## neteng101

Anyone noticing 526.98 weirdness? My OC doesn't survive a reboot now?


----------



## J7SC

KingEngineRevUp said:


> It's a $1900 card, the surpim X is $1750. They're making a profit they're just being greedy.





SilenMar said:


> The memory junction temperature at just 450W is still at 80C even if the copper plate is properly placed assuming every memory chip is uniformly contacted by the plate.
> View attachment 2583500
> 
> There is thermal throttle at 110C for the internal memory temperature. By that time it is probably too late.


The regular Gigabyte-Gaming-OC has very good VRAM temps, speaking not only about my own experience but online tests (ie. Hardware Unboxed). That said, even the MSI Suprim Liquid AIO seems to have higher VRAM temps than its regular air-cooled cousin. Presumably, the AIO setups require a very different cold-plate design by manufacturers...The pics you showed earlier suggest just a weaker clamping force / easier deformation. I posted > these temps yesterday (incl. VRAM) for my Gaming-OC, both in air-cooled and subsequent water-blocked trim.

I presume you purchased the AIO model so that you don't have to build a new / integrate into an existing custom loop - otherwise, getting an affordable water-block (such as from Bykski) might be another option for you. A less drastic option would be to use thermal putty for the VRAM and the existing cooler.


----------



## KingEngineRevUp

J7SC said:


> The regular Gigabyte-Gaming-OC has very good VRAM temps, speaking not only about my own experience but online tests (ie. Hardware Unboxed). That said, even the MSI Suprim Liquid AIO seems to have higher VRAM temps than its regular air-cooled cousin. Presumably, the AIO setups require a very different cold-plate design by manufacturers...The pics you showed earlier suggest just a weaker clamping force / easier deformation. I posted > these temps yesterday (incl. VRAM) for my Gaming-OC, both in air-cooled and subsequent water-blocked trim.
> 
> I presume you purchased the AIO model so that you don't have to build a new / integrate into an existing custom loop - otherwise, getting an affordable water-block (such as from Bykski) might be another option for you. A less drastic option would be to use thermal putty for the VRAM and the existing cooler.


The Gigabyte one doesn't have the traditional hybrid blower fan that cools the modules; it just has a plastic shroud covering the entire PCB.

They probably thought the AIO would do all the work, but it clearly isn't enough if the memory is hitting 110C.


----------



## J7SC

KingEngineRevUp said:


> The gigabyte one doesn't have the traditional hybrid blower fan that cools the modules. It just has a plastic shroud that covers the entire PCB.
> 
> They probably thought the AIO would do all of the work but it clearly isn't if memory is hitting 110C.


I am currently looking for some online reviews of the Aorus AIO to see if the reported VRAM temps are out of the ordinary for that model (meaning it is some sort of quality problem with the specific sample), or whether that is supposed to be 'normal'.


----------



## xrb936

Guys, I ordered a Gigabyte 4090 Gaming OC, but I still want to wait for the Strix OC, since the Strix OC usually gets a better-binned GPU. Will there be any big difference this year?


----------



## motivman

xrb936 said:


> Guys I ordered one Gigabyte 4090 Gaming OC, but I still want to wait for the Strix OC, since usually the Strix OC gets better GPU. Will there be any big difference this year?


NO


----------



## KingEngineRevUp

J7SC said:


> I am currently looking for some online reviews of the Aorus AIO to see if the reported VRAM temps are out of the ordinary for that model (meaning it is some sort of quality problem with the specific sample), or whether that is supposed to be 'normal'.


The results make sense to me. Almost every air cooler for the 4090 has a vapor chamber over the memory modules or a dedicated heatsink, with a fluid medium (air, in this case) flowing past to carry away the heat.

The Gigabyte Waterforce has its cold plate integrated into the AIO, but the thermal resistance between the memory and the water is too high, so it acts more like a passive cooler.

I bet if he took the shroud off and pointed a fan at it, the temperatures would go down.


----------



## UdoG

Has anyone installed a bios with a higher power limit on the Zotac Amp Extreme Airo?


----------



## mirkendargen

Didn't like every 3090 in stock form get 90C+ VRAM temps on the backside, and the vast majority of people just lived with it and didn't care? Is it actually any kind of problem, or are we just being overclock.net and saying "90C = bad!"?


----------



## J7SC

xrb936 said:


> Guys I ordered one Gigabyte 4090 Gaming OC, but I still want to wait for the Strix OC, since usually the Strix OC gets better GPU. Will there be any big difference this year?


I can't very well generalize with just one sample, but I have been very happy with my Gigabyte Gaming-OC's performance, and judging by most other owners' comments here, it is a very good 'buy'. It also seems that the spread in GPU performance of 4090s in general is fairly tight, at least with the early examples...not entirely sure about specific VRAM, though. All that said, it is hard to go wrong with a Strix - all depends on your budget as well.


----------



## xrb936

motivman said:


> NO





J7SC said:


> I can't very well generalize with just one sample, but I have been very happy with my Gigabyte Gaming-OC's performance, and judging by most other owners' comments here, it is a very good 'buy'. It also seems that the spread in GPU performance of 4090s in general is fairly tight, at least with the early examples...not entirely sure about specific VRAM, though. All that said, it is hard to go wrong with a Strix - all depends on your budget as well.


Thank you, guys. Budget is never the issue, just very hard to find the stock lol


----------



## dr/owned

mirkendargen said:


> Didn't like every 3090 in stock form get 90C+ VRAM temps on the backside, and the vast majority of people just lived with it and didn't care? Is it actually any kind of problem or are we just being overclock.net and saying "90c = bad!".


At something like 105C the card hits thermal throttling and pulls clocks really hard. I'm not sure most people ever hit that on the back-side VRAM though, since the upper 12-24GB of modules were often kept at idle.


----------



## KingEngineRevUp

mirkendargen said:


> Didn't like every 3090 in stock form get 90C+ VRAM temps on the backside, and the vast majority of people just lived with it and didn't care? Is it actually any kind of problem or are we just being overclock.net and saying "90c = bad!".


Go a few pages back. We're talking about a user's card whose fans spin at 100% because its memory is going over 100C.


----------



## SilenMar

J7SC said:


> The regular Gigabyte-Gaming-OC has very good VRAM temps, speaking not only about my own experience but online tests (ie. Hardware Unboxed). That said, even the MSI Suprim Liquid AIO seems to have higher VRAM temps than its regular air-cooled cousin. Presumably, the AIO setups require a very different cold-plate design by manufacturers...The pics you showed earlier suggest just a weaker clamping force / easier deformation. I posted > these temps yesterday (incl. VRAM) for my Gaming-OC, both in air-cooled and subsequent water-blocked trim.
> 
> I presume you purchased the AIO model so that you don't have to build a new / integrate into an existing custom loop - otherwise, getting an affordable water-block (such as from Bykski) might be another option for you. A less drastic option would be to use thermal putty for the VRAM and the existing cooler.


A cheap thin layer of copper won't cut it. It is just bad at cooling.









I even replaced the thermal pads with more premium ones. Nothing has changed. 









500W cannot be cooled with a design like this, while the Strix runs miles better.


----------



## kairi_zeroblade

SilenMar said:


> A cheap thin layer of copper won't cut it. It is just bad at cooling.
> View attachment 2583534
> 
> 
> I even replaced the thermal pads with more premium ones. Nothing has changed.
> View attachment 2583535
> 
> 
> 500W cannot be cooled with a design like this while the Strix runs like thousands of miles better.


that's "worth" 2000$+ already?? (gigabutt's making me laugh)


----------



## SilenMar

kairi_zeroblade said:


> that's "worth" 2000$+ already?? (gigabutt's making me laugh)


The core temperature actually goes a bit higher with better thermal pads. It's obvious the air-cooled 600W GPU is superior to the 500W AIO now.


----------



## kairi_zeroblade

SilenMar said:


> The core temperature actually goes a bit higher with better thermal pads. It's obvious the air cooled 600W GPU is superior than 500W AIO now


with only that small chunk of copper in there I doubt you could call it very effective vs the humongous air-cooled variants (3 fans and a thicc heatsink with pipes still winning)


----------



## SilenMar

That's why all the AIO GPUs have power targets limited to only 520W. Beyond that, the AIO can't cool any more.


----------



## kairi_zeroblade

SilenMar said:


> That's why all the AIO GPU has limited power target up to only 520W. The AIO cannot be cooled anymore.


seems that's part of NVIDIA's plans, a watercooled segment for clickbait 2000$. Nice one!!


----------



## SilenMar

kairi_zeroblade said:


> seems that's part of NVIDIA's plans, a watercooled segment for clickbait 2000$. Nice one!!


What are you talking about? Nvidia's FE goes up to 660W. It's a nice card.


----------



## doom3crazy

Panchovix said:


> Finally got my Moddiy cable, 600W works with just 3x8 pins instead of 4x8 on my TUF, and also I don't get the red led on when I shutdown/suspend my PC.
> 
> Gonna see when I can if another cable "helps" with OC; I doubt it, but hey, better trying than do nothing lol.


ive been using the corsair two 8 pin to 12vhpwr cable and it's been great. Least amount of cable clutter possible heh


----------



## kairi_zeroblade

doom3crazy said:


> ive been using the corsair two 8 pin to 12vhpwr cable and it's been great. Least amount of cable clutter possible heh


How much did you get it for? Was it free?


----------



## doom3crazy

kairi_zeroblade said:


> how much you got it? is it for free?


600W PCIe 5.0 12VHPWR Type-4 PSU Power Cable (corsair.com)


----------



## doom3crazy

I'm still looking for the MSI Suprim X Gaming BIOS (air, most recent BIOS update). If anyone has this card and could upload a ROM for it, I would like to flash it to my Gaming Trio. It would appear the ASUS cards do not play nicely with the MSI cards as far as BIOS flashes go; one user reported DP didn't work after flashing his Gaming Trio with an ASUS BIOS.

EDIT: I found the BIOS. If anyone needs it let me know


----------



## Sajin1337

SilenMar said:


> Let me help you dodge a bullet.
> 
> *Don't buy Gigabyte 4090 Extreme Waterforce *
> 
> This is one of the cards I can buy. Another one is Strix. Others are all sold out. They have been used for two weeks. Here is my impression.
> 
> 1. Gigabyte is built like plastic. The shroud looks cheap. The density of PCB is rather loose. The 360mm radiator is in fact made of light aluminum, not copper. The card looks rather cheap in comparison. The fans are not new. They seem to be refurbished from previous 3090Ti. I can still see the debris left on the blades washed by electronic cleaning fluid.
> 
> 2. It's not exactly quiet when the fans go to 1300rpm. The fans are 3000rpm. They are only slightly quiet at 900rpm which is 30% of the max speed. And the fans keep spinning at 30% all the time. They don't stop. In comparison, ASUS Strix let the fan stop when idle. Strix is also quieter when at the same load.
> 
> 3. The default fans of Gigabyte hit to max like a jet every now and then under load. Something triggered the annoying behavior. BIOS update didn't fix it. When you control the fans through the software like MSI afterburner. The fans actually go to the double speed of the set parameter.
> 
> 4. The power is limited to 500W. After the BIOS update, the power is limited even down to 490W. No overclocking protentional. The website of the official Gigabyte says it is OC version. But the package of the box has no mention of the OC version. I guess it is just a normal PCB despite the premium price.
> 
> 5. The temperature reaches to 58c-63C easily at 490W. Anymore than that the fan will spin louder.
> 
> The conclusion:
> 
> The value of this card is bad. It is not comparable to the ASUS Strix in terms of almost everything such as build quality, noise level, OC performance.
> 
> Strix is built like a piece of art in comparison. The premium product is all about the user experience with as little annoyance as possible. The Gigabyte AIO 4090 has the bottom-tier user experience. Gigabyte didn't test everything.
> 
> Edit:
> 6. Ironically, the culprit of the fans hitting the max is due to some components in certain area are not properly cooled. They can heat up 100C threshold multiple times. No more BIOS updates can fix high temperature like this.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *So don't buy this GPU. A thin piece of loose bare copper won't cut it when other components need to be cooled down properly.*


I heard from a reliable source that the memory junction temps are junk data, so they mean nothing, and can't be relied on...



https://forums.evga.com/FindPost/3586866


----------



## AvengedRobix

UdoG said:


> Has anyone installed a bios with a higher power limit on the Zotac Amp Extreme Airo?


Yes... I tried all the BIOSes, but only for benching. In games it isn't worth it.


----------



## yzonker

Sajin1337 said:


> I heard from a reliable source that the memory junction temps are junk data, so they mean nothing, and can't be relied on...
> 
> 
> 
> https://forums.evga.com/FindPost/3586866


Except that's wrong. The junction temp varies directly with how well the mem chips are cooled. And I've seen examples, including this one (and my own 3090 when it was air cooled), where the card will ramp the fans hard when the junction temp goes over 100C. The cards are obviously using that temp reading as part of the fan control.
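If the firmware really is keying off the junction reading, the behavior people describe would look something like this (a made-up sketch; the base curve and the 100C override threshold are illustrative, not taken from any actual vendor BIOS):

```python
# Illustration only: a fan controller of the kind described above, where
# the memory junction reading can override the normal core-temp curve.
# All thresholds here are invented for the sketch.

def fan_percent(core_c: float, mem_junction_c: float) -> float:
    # Normal curve: scale from 30% at 40C core up to 100% at 80C core.
    base = 30 + max(0.0, core_c - 40) * (70 / 40)
    # Junction override: slam the fans once VRAM junction passes 100C,
    # regardless of how cool the core is.
    if mem_junction_c >= 100:
        return 100.0
    return min(base, 100.0)

print(fan_percent(55, 80))   # follows the core curve
print(fan_percent(55, 104))  # junction override takes over
```

This would explain why a card with a cool core can still ramp to full fan speed: the junction sensor wins whenever it crosses the threshold.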


----------



## Netherwind

mirkendargen said:


> Oh before I was on water I just did a custom curve in AB.


That's what I'm using now as I grew tired of these fan issues. Did you also notice that setting a fan % in MSI AB makes Fan 1 and 2 spin at different RPM?



xrb936 said:


> Guys I ordered one Gigabyte 4090 Gaming OC, but I still want to wait for the Strix OC, since usually the Strix OC gets better GPU. Will there be any big difference this year?


I would say that the Strix cooler is the best in terms of cooling and fans are among the quietest. If you're prepared to pay premium for that, only you can decide.



J7SC said:


> I can't very well generalize with just one sample, but I have been very happy with my Gigabyte Gaming-OC's performance, and judging by most other owners' comments here, it is a very good 'buy'. It also seems that the spread in GPU performance of 4090s in general is fairly tight, at least with the early examples...not entirely sure about specific VRAM, though. All that said, it is hard to go wrong with a Strix - all depends on your budget as well.


I'm very interested in your opinion about the card as I've had mine for a week or two and have some rather negative opinions about it. First I owned a TUF but I sent it back due to extreme coil whine.

The good thing about the Gigabyte is that it doesn't have any coil whine, or at least very little. The GPU bracket solution also seemed like a good idea, but after using it for some time I notice that at specific fan RPMs the card vibrates a little, which translates into a humming that spreads to the case itself. Not sure if you've experienced it yourself? Next would be the fans and the fan profiles - I think they are pretty bad. The fans themselves are quiet up to ~1200RPM, but above that they sound pretty bad. When I increase fan speed to 90% they start making a scratching sound, though since I never use such speeds I don't consider it an issue. The fan profiles are also crap, as they spin up to 60% (1300RPM) as soon as either 50C or 60C GPU temp is reached (depending on BIOS). So if you play a lighter game, the fans won't spin until 60C (on the Silent BIOS), where they ramp up audibly and spin until the temps drop below 60C, at which point they stop altogether. That of course lets the temps climb back to 60C, where the fans start again, and so on. My old TUF didn't behave like this at all.


----------



## Panchovix

doom3crazy said:


> ive been using the corsair two 8 pin to 12vhpwr cable and it's been great. Least amount of cable clutter possible heh


I did test with 2 cables and it seems to work too, but I was kinda scared haha: 525W over 2 cables is 262.5W each, while 525W over 3 cables is 175W each, so I went for the safe side.

Even though it's been confirmed that any cable can burn, I still feel way more relaxed with this cable.
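For what it's worth, that per-connector arithmetic generalizes easily. A quick sketch (assuming the load splits evenly across the PSU-side connectors, which it never does perfectly in practice):

```python
# Sanity check of the per-connector load described above. Assumes a
# 12VHPWR harness that splits the total evenly across the PSU-side
# 8-pin connectors (real-world current sharing is never perfectly even).

def load_per_connector(total_watts: float, connectors: int) -> float:
    """Watts drawn from each PSU-side 8-pin, assuming an even split."""
    return total_watts / connectors

for n in (2, 3):
    per_conn = load_per_connector(525, n)
    amps = per_conn / 12  # all of this power rides on the 12V rail
    print(f"{n} cables: {per_conn:.1f} W per connector ({amps:.1f} A)")
```

The 3-cable split keeps each 8-pin comfortably under typical connector ratings, which is the "safe side" logic above.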


----------



## Azazil1190

Just received my Corsair cable for the TUF in my second system.
Nice clean-looking cable btw! And fewer cables on the PSU means more space in the case.
Still waiting on the CableMod one I ordered on 29 October for the Giga.


----------



## Sheyster

doom3crazy said:


> 600W PCIe 5.0 12VHPWR Type-4 PSU Power Cable (corsair.com)


I still can't believe EVGA doesn't have a cable like this available, despite their falling out with nVidia.


----------



## Sheyster

doom3crazy said:


> EDIT: I found the bios. If anyone needs it let me know


If it's the newer one please post it up. I have the older one.

I'm currently on the V2 ASUS Strix BIOS that Kedarwolf posted up (the 120% PL/600w OC version) as a daily driver for gaming. I'm just running it at default settings (500w 100% PL). So far so good!


----------



## KingEngineRevUp

SilenMar said:


> That's why all the AIO GPU has limited power target up to only 520W. The AIO cannot be cooled anymore.


An AIO would have worked if you combined the Suprim X blower with the Gigabyte 360mm (or 280mm) radiator, just like EVGA used to do with their 3090 Kingpin.

Sadly we don't have EVGA around anymore, but you're damn right they would have done a better job since they've been doing hybrids for ages.


----------



## Sheyster

SilenMar said:


> What are you talking about? Nvidia has *FE goes up to 660W*. It's a nice card.


600W max PL...


----------



## keikei

Hey guys, I managed to snag an msi gaming trio & got a ATX 3.0 PCIe 5.0 600W Triple 8 Pin to 12VHPWR 16 Pin Power Cable from moddiy. Any advice for the installation?


----------



## yzonker

Sheyster said:


> I still can't believe EVGA doesn't have a cable like this available, despite their falling out with nVidia.


I think Jacob said on Twitter that they are shooting for next month.


----------



## J7SC

SilenMar said:


> A cheap thin layer of copper won't cut it. It is just bad at cooling.
> View attachment 2583534
> 
> 
> I even replaced the thermal pads with more premium ones. Nothing has changed.
> View attachment 2583535
> 
> 
> 500W cannot be cooled with a design like this while the Strix runs like thousands of miles better.


...below is the Gigabyte 4090 Gaming-OC (then air-cooled, now water-cooled). The cold-plate obviously differs from your AIO setup, but apparently so does the left side of the VRM (?) on the PCB, unless Gigabyte made a running change on the PCBs, per the lower-right pic...you would think they would use the same PCB. The Giga Gaming-OC does seem to use the same PCB as the Aorus Master (DerBauer had an in-depth look, also showing the Master's PCB).

...adjusting for the fact that I'm a great fan of thermal putty anyway (ie. TG 10) and used it for both the 6900XT and 3090 Strix below (these are work+play combos), I would seriously consider thermal putty for the AIO setup, especially given the different cold-plate. I would also double-check the original thermal pad thickness on the chokes (not that chokes necessarily need cooling, unlike the caps). That actually could affect VRAM temps if there are any net height differentials.










Netherwind said:


> That's what I'm using now as I grew tired of these fan issues. Did you also notice that setting a fan % in MSI AB makes Fan 1 and 2 spin at different RPM?
> 
> 
> I would say that the Strix cooler is the best in terms of cooling and fans are among the quietest. If you're prepared to pay premium for that, only you can decide.
> 
> 
> I'm very interested in your opinion about the card as I've had mine for a week or two and have some rather negative opinions about it. First I owned a TUF but I sent it back due to extreme coil whine.
> 
> The good thing about the Gigabyte would be that it doesn't have any coil whine, or at least very little. The GPU bracket solution also seemed like a good idea but after using it for some time I notice that at specific fan RPM, the card vibrates a little which translates into a humming which is spread to the case itself. Not sure if you've experience it yourself? Next would be the fans and the fan profiles - I think they are pretty bad. If we start with the fan themselves, they are quiet up to ~1200RPM but after that they sound pretty bad. When I increase fan speed up to 90% they start making a scratching sound but since I never use such speeds I don't believe it's an issue. The fan profiles are also crap as they spin up to 60% (1300RPM) as soon as either 50C or 60C GPU temp is met (depending on BIOS). So if one plays a ligher game the fans will not spin until 60C (if Silent BIOS is enabled) where they ramp up, which is audible, and spin until the temps decrease below 60C where they stop spinning altogether. This of course makes the temps go up again to 60C where the fans start spinning again and so forth. My old TUF didn't make this behavior at all.


I didn't notice any vibrations at all, though I had the card in a temporary, rubber-insulated mount while in air-cooled mode...btw, with air cooling, the 3 fans have 2 different speeds and readouts (the inner one also spins in the opposite direction to dampen noise and vibration). The only problem I encountered while air-cooled was that above 1.1v / ~485W, the hotspot got a bit too toasty for my liking even with the fans at max. Other than that, I am not really an expert on air-cooled 4090s, as the card was always destined for a 1320x63 loop, replacing the 3090 Strix that is moving to another workstation.


----------



## Sheyster

keikei said:


> Hey guys, I managed to snag an msi gaming trio & got a ATX 3.0 PCIe 5.0 600W Triple 8 Pin to 12VHPWR 16 Pin Power Cable from moddiy. Any advice for the installation?








12VHPWR Cable Guide – CableMod







cablemod.com





Just follow the guidelines above with the cable.


----------



## Zero989

Day 502 of no leaked Neptune OC V BIOS or XOC 1000W BIOS


----------



## KingEngineRevUp

Alphacool knows what's up


----------



## dk_mic

also received the corsair 12vhpwr cable today.. surely someone chewed on that one

































Noticed slightly less droop on the 12V rail: min/max with the adapter on a Superposition run was 11.785 - 12.106;
with this cable it was 11.799 - 12.180. Is that just variance or is there a real impact? PSU is an HX1500i.
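For comparison's sake, those minimum readings work out to droop percentages like this (just restating the numbers above; a difference of ~0.01V is well within typical sensor resolution, so it may be pure variance):

```python
# Express the measured 12V minimums as droop below the 12.0 V nominal.
# Readings are the min values quoted above (adapter vs. Corsair cable).

NOMINAL = 12.0

def droop_pct(v_min: float) -> float:
    """Percentage the rail sagged below nominal at its lowest reading."""
    return (NOMINAL - v_min) / NOMINAL * 100

adapter = droop_pct(11.785)
cable = droop_pct(11.799)
print(f"adapter: {adapter:.2f}%  cable: {cable:.2f}%  delta: {adapter - cable:.3f} pp")
```

Both readings are under 2% droop, well inside the usual ATX tolerance, so either connection looks electrically fine.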


----------



## Sheyster

dk_mic said:


> also received the corsair 12vhpwr cable today.. surely someone chewed on that one


It's not pretty but hopefully it works well.


----------



## bmagnien

TechPowerUp GPU-Z v2.51.0 Released


TechPowerUp today released the latest version of TechPowerUp GPU-Z, the popular graphics sub-system information, diagnostic, and monitoring utility. Version 2.51.0 adds full support for the new NVIDIA GeForce RTX 4080 graphics card, including support for BIOS extraction from RTX 4090 and RTX...




www.techpowerup.com





Version 2.51.0 adds full support for the new NVIDIA GeForce RTX 4080 graphics card, *including support for BIOS extraction from RTX 4090 and RTX 4080*


----------



## Nizzen

Zero989 said:


> Day 502 of no leaked Neptune OC V BIOS or XOC 1000W BIOS


😇


----------



## bmagnien

Nizzen said:


> 😇
> View attachment 2583601


You gonna share that or?


----------



## Zero989

Nizzen said:


> 😇
> View attachment 2583601


Are you allowed to share


----------



## dk_mic

Sheyster said:


> It's not pretty but hopefully it works well.


yeah i guess it will. also have a cablemod on the way, but just an adapter, not a full cable.
can't wait to waterblock this thing, not happy with temps. also have to do something against the coilwhine _eyes at glue gun_
have an ekwb msi block coming end of november/december


----------



## J7SC

Nizzen said:


> 😇
> View attachment 2583601


1.) ...You're likely going to get a PM tsunami
2.)_ 'I love the smell of napalm burnt silicon in the morning'_


----------



## yzonker

bmagnien said:


> TechPowerUp GPU-Z v2.51.0 Released
> 
> 
> TechPowerUp today released the latest version of TechPowerUp GPU-Z, the popular graphics sub-system information, diagnostic, and monitoring utility. Version 2.51.0 adds full support for the new NVIDIA GeForce RTX 4080 graphics card, including support for BIOS extraction from RTX 4090 and RTX...
> 
> 
> 
> 
> www.techpowerup.com
> 
> 
> 
> 
> 
> Version 2.51.0 adds full support for the new NVIDIA GeForce RTX 4080 graphics card, *including support for BIOS extraction from RTX 4090 and RTX 4080*


Finally! Now we can just u/l bios images to the database hopefully.


----------



## Sheyster

Zero989 said:


> Are you allowed to share





bmagnien said:


> You gonna share that or?


Yes please!


----------



## Zero989

J7SC said:


> 2.)_ 'I love the smell of napalm burnt silicon in the morning'_


Less competition for me. 🤔

I also see it's for ASUS Strix.


----------



## J7SC

Zero989 said:


> Less competition for me. 🤔
> 
> I also see it's for ASUS Strix.


...probably the only XOC vbios which might fit other cards, with extreme caution on the PL (the Galax XOC has 2x 12VHPWR). 

Wouldn't it be funny though if Vince/KingPin released an 'unofficial' XOC BIOS for their not-a-4090 engineering samples that worked with other vendors' models - all of a sudden, the 3DM top spots would be full of 'EVGA' tags even though they never officially sold an RTX 4000 card...


----------



## Zero989

J7SC said:


> ...probably the only one XOC vbios which might fit other cards, w/extreme caution on the PL (the Galax XOC has 2x 12VHPWR).
> 
> Wouldn't it be funny though if Vince/KingPin would release an 'unofficial' XOC for their not-a-4090 engineering samples which could work w/ other vendors' models - all of a sudden, 3DM top spots would be full of 'EVGA' tags even though they never officially sold a RTX4K...


Next Gen Graphics beating 4090s what could it be 🤔

I want memory voltage and timings access more than a 1000W BIOS though.


----------



## dr/owned

I just wish they would stop with these stupid OC/Silent switches and give me an OC/LN2 switch with the 1000W baked in. Trip a fuse or something when it's used to void the warranty...don't care. "Here's our 2000W VRM design, check it out!" [completely worthless with a 520W BIOS limit]

Although NV probably wouldn't approve a design that shipped with 1000W.


----------



## theilya

I am sure this was asked before, but I'm having trouble finding it with the thread search.
Anyhow, I have the Zotac AMP Extreme, which seems to be limited to 110%...

Is there a BIOS available to bump it up to 133%, and/or how do I boost the voltage to 1.1v?


----------



## SilenMar

Sajin1337 said:


> I heard from a reliable source that the memory junction temps are junk data, so they mean nothing, and can't be relied on...
> 
> 
> 
> https://forums.evga.com/FindPost/3586866


It's definitely not junk data. After I put better thermal pads on the memory, the GPU core and hotspot temperatures rise about 2C while the memory junction temp drops about 1C, and it takes longer for the memory junction to heat up. The heat dissipation capacity of the AIO is only around 500W rather than 600W: if you cool the memory better, the core gets heated more. 

If you flash a 600W BIOS, the AIO card needs to spin its fans a lot faster and louder than the air-cooled card to hold the same temperature. 

Here is Gigabyte's genius solution to "solve" the temps instead of putting real effort into the actual cooling:
they simply use worse thermal pads on the memory and VRM, then report the hotspot sensor closest to the die. The core and that hotspot look fine while everything else heats up.


----------



## tryout1

So I can call myself a proud 4090 owner since Tuesday - got my Gainward Phantom 4090 for an absolutely nice, unscalped price of 1929€. Only today did I have some time to play around with it, and boy, if I didn't lose the silicon lottery this time I don't know what - probably one of the worst, if not THE worst, 4090s in terms of OC I've seen yet. Oh well, the card is extremely silent with no coil whine at all, and OC is just a nice bonus on an already overpowered card, especially in my case.

This result is the max I can squeeze out of it. I flashed the Palit GameRock OC BIOS onto it, put in another 12VHPWR cable to unlock 500W, and put in some numbers:

NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5900X,ASUSTeK COMPUTER INC. ROG CROSSHAIR VII HERO (3dmark.com)

+1500 on mem seems to be the average overclock anyway, but my core totally sucks. I tried 1.1v at about 2980MHz core in CP2077 and couldn't even finish alt+tabbing back in before my drivers crashed lol


----------



## SilenMar

J7SC said:


> The cold-plate obviously differs from your AIO setup, but apparently so does the left side of the VRM (?) on the PCB, unless Gigabyte made a running change on the PCBs, per lower-right pic


The PCB is different. The Gaming OC has 20 phases; the Waterforce below has 24 phases for the core. It doesn't matter if they can't be cooled.
The AIO plate doesn't even attempt to cool the 4-phase memory VRM at all - it's just left exposed to nothing. The north part of the core VRM is also only partially covered.


















Gigabyte's board design and thermal design teams are clearly not the same department, and they don't communicate with each other. Crap like this should never have happened.


----------



## cheddardonkey

tryout1 said:


> So i can call myself a proud member of a 4090 owner now since tuesday, got my Gainward Phantom 4090 for an absolute nice and unscalped price of 1929€ but only today i had some time to play around with it, but boy if i didn't lose the silicon lottery this time i dunno what, probably one of the worst of actually THE worst 4090 in terms of OC i saw yet but well the card is extremely silent and no coil whine at all and oc is just a nice bonus from an already overpowered card, especially for my case.
> 
> This result is the max i can squeeze out of it, flashed the Palit Gamerock OC bios on it, put another 12vhwpr cable in to unlock 500w and put in some numbers:
> 
> NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5900X,ASUSTeK COMPUTER INC. ROG CROSSHAIR VII HERO (3dmark.com)
> 
> +1500 on mem seems to be the average overclock anyways but my core totally sucks, tried 1.1 with about 2980mhz on core in CP2077, i could even finish alt+tab back in my drivers crashed lol


Given that your 3DMark score is above average, I wouldn't say you lost anything. My mem doesn't reach 1500 without crashing; the max I can do is about 1425 on the mem, with the core maxing out at 3030MHz. The performance difference is negligible from what I've seen, and mem OC is far more beneficial than GPU clock OC.


----------



## KedarWolf

Logically, at least on Corsair PSUs, it doesn't matter whether you run two or three cables from the PSU: the current through each individual wire is the same, because the number of wires in the harness doesn't change - the same wires are just spread across three PSU outputs instead of two.

So it doesn't reduce the current in the individual wires; it only reduces the total current pulled from each individual 8-pin connector.
And I know each Corsair Type 4 8-pin connector can deliver up to 288W, so two of them give a maximum of 576W, plus up to 75W from the PCIe slot itself.

This is why two 8-pin connectors are just fine: each wire carries the same current whether you use 2-to-PSU or 3-to-PSU connectors, since the total number of wires running from the PSU to the card's power connector is the same.

If we actually get the 1000W BIOS, three into the PSU is better as it divides the 1000W between three PSU connectors, but each wire will still be pulling something like 40% more current whether we use 3 or 2 into the PSU, since the total number of wires from the PSU connectors to the card's power plug is unchanged.

And yes, I am just thinking logically and won't be butt hurt if someone corrects me for being wrong.
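The reasoning sketches out like this (the wire count is the commonly cited 12VHPWR layout - six 12V conductors on the GPU side - taken here as an assumption, not a spec reading):

```python
# Sketch of the reasoning above: the GPU-side 12VHPWR plug has a fixed
# number of 12V conductors, so per-wire current there depends only on
# total power, not on how many PSU-side 8-pins feed the harness. What
# the PSU-side connector count changes is the load on each 8-pin.

GPU_SIDE_12V_WIRES = 6  # assumed 12VHPWR layout

def per_wire_amps(total_watts: float) -> float:
    """Current in each 12V conductor at the GPU-side plug."""
    return total_watts / 12 / GPU_SIDE_12V_WIRES

def per_psu_connector_watts(total_watts: float, psu_connectors: int) -> float:
    """Load on each PSU-side 8-pin, assuming an even split."""
    return total_watts / psu_connectors

for watts in (600, 1000):
    print(f"{watts} W -> {per_wire_amps(watts):.1f} A per 12V wire at the GPU plug")
    for n in (2, 3):
        print(f"  {n} PSU connectors: {per_psu_connector_watts(watts, n):.0f} W each")
```

Running it shows the per-wire current at the GPU plug is identical for 2-in and 3-in harnesses at a given wattage; only the per-8-pin load differs, which matches the argument above.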


----------



## cheddardonkey

...


----------



## cheddardonkey

SilenMar said:


> Let me help you dodge a bullet.
> 
> *Don't buy Gigabyte 4090 Extreme Waterforce *
> 
> This is one of the cards I can buy. Another one is Strix. Others are all sold out. They have been used for two weeks. Here is my impression.
> 
> 1. Gigabyte is built like plastic. The shroud looks cheap. The density of PCB is rather loose. The 360mm radiator is in fact made of light aluminum, not copper. The card looks rather cheap in comparison. The fans are not new. They seem to be refurbished from previous 3090Ti. I can still see the debris left on the blades washed by electronic cleaning fluid.
> 
> 2. It's not exactly quiet when the fans go to 1300rpm. The fans are 3000rpm. They are only slightly quiet at 900rpm which is 30% of the max speed. And the fans keep spinning at 30% all the time. They don't stop. In comparison, ASUS Strix let the fan stop when idle. Strix is also quieter when at the same load.
> 
> 3. The default fans of Gigabyte hit to max like a jet every now and then under load. Something triggered the annoying behavior. BIOS update didn't fix it. When you control the fans through the software like MSI afterburner. The fans actually go to the double speed of the set parameter.
> 
> 4. The power is limited to 500W. After the BIOS update, the power is limited even down to 490W. No overclocking protentional. The website of the official Gigabyte says it is OC version. But the package of the box has no mention of the OC version. I guess it is just a normal PCB despite the premium price.
> 
> 5. The temperature reaches to 58c-63C easily at 490W. Anymore than that the fan will spin louder.
> 
> The conclusion:
> 
> The value of this card is bad. It is not comparable to the ASUS Strix in terms of almost everything such as build quality, noise level, OC performance.
> 
> Strix is built like a piece of art in comparison. The premium product is all about the user experience with as little annoyance as possible. The Gigabyte AIO 4090 has the bottom-tier user experience. Gigabyte didn't test everything.
> 
> Edit:
> 6. Ironically, the culprit of the fans hitting the max is due to some components in certain area are not properly cooled. They can heat up 100C threshold multiple times. No more BIOS updates can fix high temperature like this.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *So don't buy this GPU. A thin piece of loose bare copper won't cut it when other components need to be cooled down properly.*


Whoa... RMA your card, there is a problem with it. VRAM gets nowhere close to that on my Waterforce, and I'm not unhappy with it; I don't have the fan or temp issues you are experiencing. I have also tested several BIOSes on this card and have pulled 600W loads (with the BIOSes that support it), NEVER getting past 78C on VRAM, and that was in the most extreme, unrealistic FurMark case. Generally under heavy gaming the GPU hovers between 55-61C, hotspot around 70C, and VRAM around 74C - again in intense gaming sessions - and the fans stay at low RPM, never spinning up the way you described. Hopefully they get you sorted.


----------



## motivman

Cablemod cable finally came in... PC looks so much cleaner now. Now I just need a waterblock....


----------



## tryout1

cheddardonkey said:


> GIven that your 3D mark score is above average. I wouldnt say you lost anything.. my mem doesnt reach 1500 without crashing.. the max I can do is about 1425 on the mem with the clock maxing out at 3030mhz.. performance difference is negligable from what I've seen and mem oc is far more beneficial than gpu clock oc


Mhm, I'm waiting for Alphacool to release the waterblock for the Gainward; maybe it helps a bit. Atm I'm just trying some CP2077 at 1.05v, where I can only max out at about 2850MHz, and that's with a +130 core offset. I mean, I will 100% undervolt this card anyway, but ngl I'm still curious what it can do. At max fans it sits at about 59°C at 2880MHz, so I hope I can get a speed bin or two out of it with watercooling.


----------



## SilenMar

cheddardonkey said:


> Whoa... RMA your card, there is a problem with it. VRAM gets nowhere close to that on my waterforce and I'm not unhappy with it, I dont have the fan issues or temp issues you are experiencing. . I have tested several bios on this card also and have pulled 600w loads (with the bios that support it), NEVER reaching past 78 on VRAM and that was in the most extreme unrealistic furmark case. Generally under heaving gaming, GPU hovers between 55-61. Hot spot around 70 and VRAM around 74, again in intense gaming sessions, fans stay low rpm, never spinning up the way you described. Hopefully they get you sorted..


It's a cheaply made card with a layer of copper that cannot cool everything. You just don't get a temperature readout for the VRM the way you do for the VRAM/memory. 
It cannot cool the VRM, memory, and core at the same time.


----------



## AvengedRobix

EK opened preorders for the Zotac WB... available from January... but way too expensive


----------



## cheddardonkey

SilenMar said:


> The PCB is different. Gaming OC has 20 phases. Waterforce below has 24 phases to the core. It doesn't matter if it cannot be cooled.
> The AIO plate doesn't even dare to cool the 4-phanse VRMs of the memory at all. Memory VRMs are just left open to nothing.
> View attachment 2583626
> 
> View attachment 2583627
> 
> 
> The Gigabyte guys in the board design and in the thermal design are not the same department. They don't communicate to each other. Crap like this should've never happened.


I was toying with the idea of pulling mine apart to look at the component coverage, but nothing besides curiosity has pushed me to. Your posts are poking my curiosity, and I feel bad that your card is having problems. That block also doesn't look exactly the same as what's posted on the website - or perhaps it's flipped 180 degrees in your picture?


----------



## Mucho

theilya said:


> I am sure this was asked before, but I am having issues finding search thread location.
> Anyhow, I have the Zotac AMP Extreme which seems to be limited to 110%...
> 
> Is there a bios available to bump it up to 133% and / or how do I boost voltage to 1.1v?


I have a Trinity running the Gigabyte OC BIOS; with that you can bump it up to 133%.


----------



## GRABibus

To give all the necessary juice to my lovely Gigabyte Gaming OC, I ordered a ROG MAXIMUS Z790 HERO + 13900K + G.SKILL 32GB 7600MHz CL36 today.
New build coming in December.


----------



## mirkendargen

dk_mic said:


> also received the corsair 12vhpwr cable today.. surely someone chewed on that one
> 
> View attachment 2583593
> View attachment 2583594
> View attachment 2583595
> View attachment 2583596
> View attachment 2583597
> 
> 
> noticed slightly less droop on the 12V rails, min max with the adapter on a superposition run was 11.785 - 12.106
> with that cable it was 11.799 - 12.180. Is that just variance or is there an impact? PSU is HX1500i


I also see slightly less 12V droop on my Corsair cable vs. my Moddiy cable. I never tried the supplied adapter to compare with it. It's such a tiny difference that I don't worry about it, though.

And my Corsair cable also had the nicks in the insulation; it seemed like the way they zip-tied it up cut in a little bit.




KedarWolf said:


> Logically, at least on Corsair PSU, it doesn't matter if you have two or three from the PSU, it's still the same amount of current to each individual wire, it does not change the number of wires in the cables themselves, just the number of wires is spread out from the three outputs instead of two.
> 
> So it doesn't reduce the amount of current of the individual wires, it just reduces the amount of current pulled from each individual 8-pin connector in total.
> And I know each Corsair Type 4 8-pin connector can pull a total 288W, so for two, a maximum of 576W, and the PCI-e slot itself provides up to 75W.
> 
> This is why 2 8-pin connectors are just fine and each wire itself it actually pulls the same amount of current no matter if you use 2 to PSU or 3 to PSU connectors. It doesn't change the actual number of wires going from the PSU to the power connector itself.
> 
> If we actually get the 1000W BIOS, three into PSU is better as it divides the 1000W between 3 PSU connectors, but it won't change the fact each wire will be pulling like 40% more current in each wire no matter if we use 3 into PSU or 2 into PSU as the total number of wires running from the PSU connectors to the power plug in the video card is the same.
> 
> And yes, I am just thinking logically and won't be butt hurt if someone corrects me for being wrong.


Nope, this is all correct, and it's what I've been saying for a while now. People are too hung up on the number of connectors and aren't thinking about what 2 vs. 3 vs. 4 connectors actually means electrically when it's all the same 6 +12V / 6 ground wires on the other end.
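For anyone who wants to sanity-check that arithmetic, here's a quick sketch (assuming an ideal 12V rail and perfectly even current sharing across the six +12V wires, which real cables won't quite have):

```python
# Per-wire current in the 12VHPWR plug is fixed by total power, not by how
# many PSU-side 8-pins feed it; extra PSU connectors only spread the load
# on the PSU side. Idealized: 12.0 V rail, even sharing across 6 wires.

def per_wire_amps(total_watts, volts=12.0, wires=6):
    """Current in each +12V wire of the 12VHPWR connector."""
    return total_watts / volts / wires

def per_connector_watts(total_watts, psu_connectors):
    """Load seen by each PSU-side 8-pin, assuming an even split."""
    return total_watts / psu_connectors

for load in (450, 600, 1000):
    print(f"{load} W: {per_wire_amps(load):.1f} A per wire, "
          f"2-plug: {per_connector_watts(load, 2):.0f} W each, "
          f"3-plug: {per_connector_watts(load, 3):.0f} W each")
```

At 600W that works out to roughly 8.3A per wire whether two or three PSU connectors feed the cable; only the per-8-pin draw changes.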


----------



## SilenMar

cheddardonkey said:


> I was toying around with the idea of pulling mine apart to have a look at component coverage but nothing besides curiosity has pushed me to. your posts are poking my curiousity and I feel bad that your card is having problems. That block also does not look exactly the same as what is posted on the website or perhaps its flipped 180 degrees in your picture?
> 
> View attachment 2583634


It's not just me; others have the same thing happening on this Waterforce 4090. The website shows a slightly different plate. The actual plate doesn't cover everything. The thermal pads require a thickness of 1.28mm, which is abnormal.









Don't expect the QC to be any good. Look at how the factory thermal paste on the right side is printed outside of the intended area.


----------



## cheddardonkey

SilenMar said:


> It is not just me have the same thing happening on this Waterforce 4090. The website has a slightly different plate.
> 
> Don't expect the QC is anywhere good. Look at how the original right side thermal paste is printed outside of the supposed area.
> View attachment 2583641


----------



## doom3crazy

Sheyster said:


> If it's the newer one please post it up. I have the older one.
> 
> I'm currently on the V2 ASUS Strix BIOS that Kedarwolf posted up (the 120% PL/600w OC version) as a daily driver for gaming. I'm just running it at default settings (500w 100% PL). So far so good!


are you using an msi card!? I was thinking the asus bios wouldn't work on these. Someone said DP stopped working.


----------



## J7SC

SilenMar said:


> The PCB is different. Gaming OC has 20 phases. Waterforce below has 24 phases to the core. It doesn't matter if it cannot be cooled.
> The AIO plate doesn't even dare to cool the 4-phanse VRMs of the memory at all. Memory VRMs are just left open to nothing.
> View attachment 2583626
> 
> View attachment 2583627
> 
> 
> The Gigabyte guys in the board design and in the thermal design are not the same department. They don't communicate to each other. Crap like this should've never happened.





SilenMar said:


> It is not just me have the same thing happening on this Waterforce 4090. The website has a slightly different plate.
> 
> Don't expect the QC is anywhere good. Look at how the original right side thermal paste is printed outside of the supposed area.
> View attachment 2583641


...I can appreciate your frustration - for what it is worth, the 'basic design and chip' of my Gaming-OC were/are great - but there were some issues w/ assembly quality. Nothing that took very long to fix, and at the end of the day, I paid US$1,619 for it - I knew I would water-cool it with a loop ready and waiting anyway...but still, that shouldn't have really happened, much less so on the more expensive variants you showed.

...I have several other Gigabyte GPUs (2x 2080 Ti Waterforce WB, 1x G-OC 6900XT, other) without a single issue and great quality - but this time around, the quality really seems to have been affected (GPU price crash ? Covid / post-Covid China lockdowns and supply chains ?).


----------



## SilenMar

J7SC said:


> ...I can appreciate your frustration - for what it is worth, the 'basic design and chip' of my Gaming-OC were/are great - but there were some issues w/ assembly quality. Nothing that took very long to fix, and at the end of the day, I paid US$1,619 for it - I knew I would water-cool it with a loop ready and waiting anyway...but still, that shouldn't have really happened, much less so on the more expensive variants you showed.
> 
> ...I have several other Gigabyte GPUs (2x 2080 Ti Waterforce WB, 1x G-OC 6900XT, other) without a single issue and great quality - but this time around, the quality really seems to have been affected (GPU price crash ? Covid / post-Covid China lockdowns and supply chains ?).


Not exactly. I have another Strix 4090 which is miles better than this crap. Gigabyte is just inferior. However, I can still sell it for over $2000 at this particular time.


----------



## jootn2kx

Sheyster said:


> If it's the newer one please post it up. I have the older one.
> 
> I'm currently on the V2 ASUS Strix BIOS that Kedarwolf posted up (the 120% PL/600w OC version) as a daily driver for gaming. I'm just running it at default settings (500w 100% PL). So far so good!


Nice, how does it compare to the Gigabyte OC one?


----------



## ttnuagmada

what kind of water-cooled temps are people getting?


----------



## SilenMar

ttnuagmada said:


> what kind of water-cooled temps are people getting?


AIO is generally high 60s on the core and high 80s on the memory at 500W.


----------



## Nico67

Still looking for a Strix 4090 waterblock, and there are looking to be quite a few options. However, I'm starting to see some questionable or problematic design choices.

1/ Short, unrestricted one-sided return - Alphacool and Optimus
Water will take the least restrictive path, so there's low flow around the long path, as was seen in der8auer's Inno3D waterblock review. Bykski takes the reliable old-school choice of one path splitting and rejoining; EK at least tries to balance path lengths.

2/ Block-end IO connector - Aquacomputer (although it has a side option also), Watercool and Bitspower
RAM clearance issues if you pipe from the CPU side, especially if you have a RAM fan. Blocks are shorter and the ports generally end up around the RAM slots. Phanteks at least went with IO ports on the outer edge; they just need to sell internationally from their store 

3/ Steel jet plate - lots
This is probably subjective, but I think it probably doesn't help with nickel plating wear / galvanic corrosion issues. Starting to see a lot more plexi/plastic jet plates used, EK for example.

4/ Block finish machining flaws - EK, Phanteks... probably more.
This again is fairly subjective, and is probably only important for the core surface where paste is used. Also it largely depends on your reason for watercooling in the first place. If it's just for aesthetics and noise, then it's not as big a concern.

5/ Nickel plating issues - EK...
I have really only had experience with EK having this issue, and it's been getting worse generationally; maybe the plexi jet plates will fix that?

Sure makes for a hard choice. I just don't trust EK quality anymore and Bykski still use steel jet plates; everything else is too hard to get or not released yet.


----------



## KingEngineRevUp

Nico67 said:


> Still looking for a Strix 4090 waterblock, and there are looking to be quite a few options. However I am starting to see some questionable or problematic design choices.
> 
> 1/ short unrestricted one sided return - Alphacool and Optimus
> Water will take the least restrictive path, so low pressure around long path, as was seen in Debauer's Inno3D waterblock review. Bykski takes the reliable old school choice of one path split and rejoining, EK at least tries to balance path lengths.
> 
> 2/ Block end IO connector - Aquacomputer (although has side option also), Watercool and Bitspower
> Ram clearance issues if you pipe from CPU side, especially if you have a ram fan. Blocks are shorter and generally ports are around the ram slots. Phanteks at least went with IO ports on the outer edge, they just need to sell internationally from there store
> 
> 3/ Steel jet plate - lots
> This is probably sujective, but I think it probably doesn't help with Nickel plating wear/ galvanic corrosion issues. Starting to see alot more plexi/ plastic jet plates used, EK for example.
> 
> 4/ Block finish machining flaws - EK, Phanteks ... probably more.
> This again is fairly subjective, and is probably only important for the core surface wear paste is used. Also it largely depends on your reason for watercooling in the first place. If its just for aesthetics and noise, then its not as bigger concern.
> 
> 5/ Nickel Plating issues - EK...
> I really have only had experience with EK having this issue, and its been getting worse generationally, and maybe the plexi jet plates will fix that?
> 
> Sure makes for a hard choice, I just don't trust EK quality anymore and Bykski still use steel jet plates, everything else is to hard to get or not released yet


If you're going to block this generation and probably every generation here on, it'll be for aesthetics and low noise. Don't expect much performance gain. So get the prettiest block that gives you a raging hard on when you look at it.


----------



## dr/owned

Nico67 said:


> 3/ Steel jet plate - lots
> This is probably sujective, but I think it probably doesn't help with Nickel plating wear/ galvanic corrosion issues. Starting to see alot more plexi/ plastic jet plates used, EK for example.


They're stainless steel jet plates, so they're as compatible as anything else in the loop; the pump is stainless steel, for example. The plastic ones are probably for aesthetics and/or lower cost.


----------



## dr/owned

I'll be trying the 4090 block assembly tonight, then. Gonna give these pads a try since they're fairly cheap:









Still on the fence about whether I want to do liquid metal or not. Liquid metal is such a pain to get off, and I'd have to Kapton-tape around the die. I'm leaning towards "no" just to be a bit lazy. It was great / is still working on the 3090 TUF I did at launch. The EVGA 3090's hotspot wasn't so great, but I used Kritical thermal pads that are waaaay too hard.


----------



## J7SC

ttnuagmada said:


> what kind of water-cooled temps are people getting?


...air-cooled, water-cooled Giga-G-OC , gaming oc settings.


----------



## PhuCCo

dr/owned said:


> I'll be trying the 4090 Block assembly then tonight. Gonna give these pads a try since they're fairly cheap:
> View attachment 2583671
> 
> 
> Still on the fence if I want to do liquid metal or not. Liquid metal such a pain to get off and I'd have to kapton tape around the die. I'm leaning towards "no" just to be a bit lazy. It was great / still working on the 3090 TUF I did at launch. The EVGA 3090 wasn't so great hotspot but I used Kritical thermal pads that are waaaay too hard.


I tried liquid metal and imo it just isn't worth it for the blocks that are available ATM. The interface material doesn't seem to be the limiting factor at least in my experience. KP extreme to liquid metal did absolutely nothing for my EK FE block


----------



## KingEngineRevUp

dr/owned said:


> I'll be trying the 4090 Block assembly then tonight. Gonna give these pads a try since they're fairly cheap:
> View attachment 2583671
> 
> 
> Still on the fence if I want to do liquid metal or not. Liquid metal such a pain to get off and I'd have to kapton tape around the die. I'm leaning towards "no" just to be a bit lazy. It was great / still working on the 3090 TUF I did at launch. The EVGA 3090 wasn't so great hotspot but I used Kritical thermal pads that are waaaay too hard.


Did you lose the stock pads or something? We have a number of posts, not just here but in other places, where aftermarket pads that were too hard or not the right size ruined core temperatures.


----------



## Sheyster

doom3crazy said:


> are you using an msi card!? I was thinking the asus bios wouldn't work on these. Someone said DP stopped working.


Not MSI, Giga-G-OC... Since the ASUS card has the 2xHDMI outs, any card with 3xDP will probably lose one of the DP outputs.


----------



## PhuCCo

Looks like Bykski (Granzon) has a FE block for the same price as EK.. it looks interesting. Who's gonna take one for the team? 🤣


https://a.aliexpress.com/_mN85tFi


----------



## Sheyster

jootn2kx said:


> Nice how does it compare to the gigabyte OC one?


It's got a slightly higher OC and a 500w default power limit vs. 450w for the Gigabyte BIOS. Both are 600w max PL. There actually isn't much difference.


----------



## ttnuagmada

J7SC said:


> ...air-cooled, water-cooled Giga-G-OC , gaming oc settings.
> View attachment 2583679


What block do you have and what kind of setup? looks pretty beefy.


----------



## KedarWolf

KingEngineRevUp said:


> Did you lose the stock pads or something? We have a number of post not just here, but other places where using after market pads that are too hard or not the right size ruining core temperatures..


I strongly recommend replacing the stock pads on blocks with the right size of GELID Extreme pads.

On an Optimus block, I'd keep the full-sized backplate stock pad though, and just replace the front water block pads.

They have decent W/mK, and they're very soft and malleable, so you get great core contact and great results with them.


----------



## BigMack70

Well... shoot. After a few days of zero black screen crashes, several today. Randomly from desktop and at menus in Halo MCC. Hours and hours of gaming with no crashes and then randomly it starts happening. 

I guess I need to decide whether to replace components or not. I still think it's not a GPU issue; there are zero temperature problems and the crashes aren't correlated with GPU load. I think it's motherboard or software/driver related.


----------



## cheddardonkey

SilenMar said:


> The memory junction temperature at just 450W is still at 80C even if the copper plate is properly placed assuming every memory chip is uniformly contacted by the plate.
> View attachment 2583500
> 
> There is thermal throttle at 110C for the internal memory temperature. By that time it is probably too late.


I still think you have a unique problem, or I just got lucky somehow. VRAM junction doesn't rise above 78 for me, and that's in the most extreme conditions.


SilenMar said:


> Let me help you dodge a bullet.
> 
> *Don't buy Gigabyte 4090 Extreme Waterforce *
> 
> This is one of the cards I can buy. Another one is Strix. Others are all sold out. They have been used for two weeks. Here is my impression.
> 
> 1. Gigabyte is built like plastic. The shroud looks cheap. The density of PCB is rather loose. The 360mm radiator is in fact made of light aluminum, not copper. The card looks rather cheap in comparison. The fans are not new. They seem to be refurbished from previous 3090Ti. I can still see the debris left on the blades washed by electronic cleaning fluid.
> 
> 2. It's not exactly quiet when the fans go to 1300rpm. The fans are 3000rpm. They are only slightly quiet at 900rpm which is 30% of the max speed. And the fans keep spinning at 30% all the time. They don't stop. In comparison, ASUS Strix let the fan stop when idle. Strix is also quieter when at the same load.
> 
> 3. The default fans of Gigabyte hit to max like a jet every now and then under load. Something triggered the annoying behavior. BIOS update didn't fix it. When you control the fans through the software like MSI afterburner. The fans actually go to the double speed of the set parameter.
> 
> 4. The power is limited to 500W. After the BIOS update, the power is limited even down to 490W. No overclocking protentional. The website of the official Gigabyte says it is OC version. But the package of the box has no mention of the OC version. I guess it is just a normal PCB despite the premium price.
> 
> 5. The temperature reaches to 58c-63C easily at 490W. Anymore than that the fan will spin louder.
> 
> The conclusion:
> 
> The value of this card is bad. It is not comparable to the ASUS Strix in terms of almost everything such as build quality, noise level, OC performance.
> 
> Strix is built like a piece of art in comparison. The premium product is all about the user experience with as little annoyance as possible. The Gigabyte AIO 4090 has the bottom-tier user experience. Gigabyte didn't test everything.
> 
> Edit:
> 6. Ironically, the culprit of the fans hitting the max is due to some components in certain area are not properly cooled. They can heat up 100C threshold multiple times. No more BIOS updates can fix high temperature like this.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *So don't buy this GPU. A thin piece of loose bare copper won't cut it when other components need to be cooled down properly.*


I still think you got a bad card, man. You seem stuck on the design being the problem, but my stats aren't anything like yours. I think it's more of a manufacturing quality issue than a design issue.


----------



## KingEngineRevUp

PhuCCo said:


> Looks like Bykski (Granzon) has a FE block for the same price as EK.. it looks interesting. Who's gonna take one for the team? 🤣
> 
> 
> https://a.aliexpress.com/_mN85tFi


Hey that block actually looks nice. But it's the cost of an EKWB one.


----------



## arvinz

KingEngineRevUp said:


> Hey that block actually looks nice. But it's the cost of an EKWB one.


Looks really good actually. I'd get this before the EK to be honest. I'm just unsure how the inlet/outlet ports being on the right side will work with my loop setup.

I'd love to see a review of this.


----------



## J7SC

ttnuagmada said:


> What block do you have and what kind of setup? *looks pretty beefy*.


...definitely not vegan  ...I used the existing loop for the X570 / 3090 Strix (which is now awaiting its new home in another workstation here). The custom loop has 1320x63 rads (triple core), 2x D5s, push-pull Arctic P12s, and the 4090 GPU block is a Bykski. Wiring and tube clean-up planned for this weekend as I did an 'open loop surgery' (= didn't drain the loop, just what was between a secondary reservoir and the 3090 Strix).


----------



## SilenMar

cheddardonkey said:


> I still think you have a unique problem or I just got lucky somehow.. vram junction doesnt rise above 78 for me and that is in the most extreme conditions.
> 
> 
> I still think you got a bad card man.. You seem stuck on the design being the problem but my stats arent anything like yours. I think its more a manufacturing quality issue more than a design issue.
> 
> 
> View attachment 2583695


It's more like your card doesn't even run at max power. The max power is 500W, and the RAM uses only 10GB.
Set the memory to at least 3,000MHz and run 30 minutes of Resident Evil 2 or 3 at 4K with 130% resolution scale to reach max power, even if you already use the updated 490W BIOS.


----------



## KingEngineRevUp

KedarWolf said:


> I strongly recommend replacing stock pads on blocks with the right size GELID Extreme pads.
> 
> On an Optimus block, I'd keep the full-sized entire backplate stock pad though, and just replace the front water block pads.
> 
> They have decent wm/k and are very soft and malleable and you get great core contact and great results with them.


But for what reason? Why does the memory have to be chilled so much? A few users here have said cooling the memory too much actually hurts performance, since it needs some warmth to run optimally.

So what's the point of getting memory temps lower if it can hurt performance? Like on the 3090 Ti.


----------



## motivman

I love the way my build looks right now... not sure if I even want a waterblock. If you guys had a choice for the FE, would you get the EKWB waterblock or the Corsair block? Anyone have experience with Corsair? Their FE block seems to be in stock and cheaper than EKWB's.

*EDIT** seems like the corsair block has sold out 



https://www.corsair.com/us/en/Categories/Products/Custom-Cooling/Blocks/GPU-Blocks/Hydro-X-Series-XG7-RGB-40-SERIES-GPU-Water-Block/p/CX-9020019-WW


----------



## J7SC

...fyi, in case that hasn't been posted yet here on the 0.05% cablegate, > here and...


----------



## motivman

J7SC said:


> ...fyi, in case that hasn't been posted yet here on the 0.05% cablegate, > here and...


In other words, NOOBS that cannot plug in a simple cable ALL THE WAY, SMH


----------



## J7SC

It's weekend time, and to wind down from a busy week, I recommend something like this (love that channel), especially if you have a nice big OLED to go with your 4090... or any other good HDR / Dolby-capable card.


----------



## Arizor

Yeah, as I and others already predicted, it's a combination of user error and poor design; the fact that Steve has to demonstrate and remind users to "push EXTRA HARD at the last bit" is a design oversight.

They need to push that redesign through ASAP to include the sensor that won't let the card start on improper contact.


----------



## J7SC

Arizor said:


> Yeah as I and others already predicted, it's a combination of user error and poor design, the fact Steve has to demonstrate and remind users to "push EXTRA HARD at the last bit" is a design oversight.
> 
> They need to push that redesign through ASAP to include the sensor that won't start on improper contact.


...there might also be some production tolerances at play (quality), and a better safety 'latch' would help. FYI, a simple trick is to shine a flashlight from the back of the connector and see if any light makes it through...


----------



## KingEngineRevUp

motivman said:


> I love the way my build looks right now... not sure if I even want a waterblock. If you guys had a choice, for the FE, will you get the EKWB waterblock, or the corsair block? Anyone have experience with corsair? Their FE block seems to be in stock and cheaper than EKWB.
> 
> *EDIT** seems like the corsair block has sold out
> 
> 
> 
> https://www.corsair.com/us/en/Categories/Products/Custom-Cooling/Blocks/GPU-Blocks/Hydro-X-Series-XG7-RGB-40-SERIES-GPU-Water-Block/p/CX-9020019-WW
> 
> 
> 
> View attachment 2583711


I was thinking of the Corsair block but for my preference, I don't like the way it looks. If you get it, let us know how it is!


----------



## dr/owned

Port Royal on the Phanteks block. Inlet temperature is hard locked at 28C and doesn't increase over time:









Arctic TP3 pads are kinda interesting. Sticky and feel more like a putty than a pad. Very soft as advertised.


----------



## KingEngineRevUp

We got some optimus results.



https://www.reddit.com/gallery/yz3ftj



300-400W gives a delta of 9-12°C; 530W gives a delta of 16°C. So not much better than Bykski and EK, but still leading.
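To compare these numbers across blocks tested at different wattages, one quick sanity check is to normalize the core-to-water delta by power. A rough sketch (the wattages and deltas are the approximate figures quoted in this thread, not lab measurements):

```python
# Approximate thermal resistance (core minus water, divided by power) lets
# blocks tested at different power draws be compared on one scale.
# Figures are the rough numbers reported in this thread.
measurements = {
    "Optimus @ ~350W": (350, 10.5),   # middle of the 9-12C delta quoted
    "Optimus @ 530W":  (530, 16.0),
    "Phanteks @ 530W": (530, 30.0),   # the suspect mounts reported here
}

resistance_c_per_w = {
    name: delta / watts for name, (watts, delta) in measurements.items()
}

for name, r in resistance_c_per_w.items():
    print(f"{name}: {r * 1000:.0f} mC/W")
```

By that metric the reported Phanteks mounts come out nearly double the Optimus resistance at the same wattage, which is why a bad mount (rather than water temperature) is the usual suspect.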


----------



## Arizor

Love the look of optimus, performance is just the cherry on top.

Not that I'm bothering to water-cool this gen, I think; now that I've repasted and fixed the screw tightness, I'm surprised to see it go past 60C in games with everything maxed. Resale value should hold better this way too when I do grab that Ti next year (I would say no, but we all know in our heart of hearts we'll probably end up grabbing one 😂).



KingEngineRevUp said:


> We got some optimus results.
> 
> 
> 
> https://www.reddit.com/gallery/yz3ftj
> 
> 
> 
> 300-400W with Delta 9-12. 530W Delta 16. So not much greater than Bykski and EK but still leading.


----------



## mirkendargen

dr/owned said:


> I'll be trying the 4090 Block assembly then tonight. Gonna give these pads a try since they're fairly cheap:
> View attachment 2583671
> 
> 
> Still on the fence if I want to do liquid metal or not. Liquid metal such a pain to get off and I'd have to kapton tape around the die. I'm leaning towards "no" just to be a bit lazy. It was great / still working on the 3090 TUF I did at launch. The EVGA 3090 wasn't so great hotspot but I used Kritical thermal pads that are waaaay too hard.


Those pads in 1.5mm form worked great for me on a Bykski block.

Am I understanding right that you're getting a 32C delta on the Phanteks block though? That must be a bad mount or something, it's at least 50% worse than the other blocks.


----------



## dr/owned

mirkendargen said:


> Those pads in 1.5mm form worked great for me on a Bykski block.
> 
> Am I understanding right that you're getting a 32C delta on the Phanteks block though? That must be a bad mount or something, it's at least 50% worse than the other blocks.


Yeah I'm not sure what's up with that. I'll probably try remounting or maybe the lack of a real jet plate is a problem. 37C delta at 650W.


----------



## cheddardonkey

SilenMar said:


> It's more the like your card doesn't even run at the max power. The max power is 500W. The RAM uses only 10GB.
> Set the memory at least 3,000Mhz, run 30 minutes Resident Evil 2 or 3 at 4K with 130% resolution scale to get the max power even you already use the updated 490W BIOS.


The card runs at whatever is asked of it, up to its max, which yes is 500W with the factory BIOS; below, it's hitting 497W in this example with the memory at 3GHz. I don't play Resident Evil, but I'll try some other comparable game scaling.


----------



## Madness11

Guys, where can I find a 1000W BIOS for the 4090? ))


----------



## Xian65

dr/owned said:


> Port Royal on the Phanteks block. Inlet temperature is hard locked at 28C and doesn't increase over time:
> View attachment 2583740
> 
> 
> Arctic TP3 pads are kinda interesting. Sticky and feel more like a putty than a pad. Very soft as advertised.
> 
> View attachment 2583754





mirkendargen said:


> Those pads in 1.5mm form worked great for me on a Bykski block.
> 
> Am I understanding right that you're getting a 32C delta on the Phanteks block though? That must be a bad mount or something, it's at least 50% worse than the other blocks.





dr/owned said:


> Yeah I'm not sure what's up with that. I'll probably try remounting or maybe the lack of a real jet plate is a problem. 37C delta at 650W.


I've mounted the Phanteks block onto my Giga OC and am also getting a 30°C delta at approx 530W. Initially I thought it was a bad mount, but seeing your results has me concerned, especially since I cancelled the EK block and went for this one for the port positions, despite it still being $450 AUD.

I'll be pissed if it's just a crap design for that price.


----------



## dr/owned

Xian65 said:


> I’ve mounted the Phanteks block onto my Giga OC and am also getting the 30°C delta at approx 530W. Initially I thought it was a bad mount but seeing your results has me concerned, especially since I cancelled the EK block and went for this one for the port positions, despite it still being $450 AUD..
> 
> I’ll be pissed if it’s just a crap design for that price.


I'll slam together a remount later today, hopefully.

The math says their block may be bowing around the die: the pocket is a 2.5mm void, while the metal frame around the die was 2.65mm tall on my 3090 (I'm assuming the 4090 has the same package dimensions) and the die itself sits at 2.9mm. So that's 0.4mm of interference fit on the die, with the frame also contacting the block. Bykski was 0.25mm, with no interference with the metal frame.
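Here's that back-of-envelope math as a sketch, assuming the 3090 package heights quoted above carry over to the 4090:

```python
# How far the die and the surrounding metal frame each protrude past the
# block's machined recess. A positive frame protrusion means the block can
# land on the frame and bow away from the die instead of seating flat on it.
# Heights are the 3090 package figures quoted above (assumed for the 4090).

def interference(die_mm, frame_mm, pocket_mm):
    """Return (die protrusion, frame protrusion) past the block recess."""
    return die_mm - pocket_mm, frame_mm - pocket_mm

die, frame = 2.90, 2.65                                  # package heights
phanteks_die, phanteks_frame = interference(die, frame, 2.50)
bykski_die, bykski_frame = interference(die, frame, 2.65)

print(f"Phanteks: die {phanteks_die:.2f} mm, frame {phanteks_frame:+.2f} mm")
print(f"Bykski:   die {bykski_die:.2f} mm, frame {bykski_frame:+.2f} mm")
```

Under those assumed dimensions the Phanteks recess leaves the frame standing 0.15mm proud, while the Bykski recess exactly clears it, which would explain the mount-quality difference.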


----------



## Nizzen

Madness11 said:


> Guys where i can find 1000w bios? For 4090 ))


The bios finds you 

Asus 4090 strix with connected EVC2, and it's done...


----------



## doom3crazy

Sheyster said:


> Not MSI, Giga-G-OC... Since the ASUS card has the 2xHDMI outs, any card with 3xDP will probably lose one of the DP outputs.


are you saying I should be able to flash my msi card with an asus bios? I will just have to play with the DP ports until I find one that works?


----------



## th3illusiveman

I got a Cablemod 12VHPWR cable, but they include a stupid shroud which really doesn't seem very compatible with my FE card. I could never be sure the cable was 100% seated, with the shroud seemingly hitting the FE cooler and obstructing the top view, so I just removed it. Even if it doesn't look as good, I was able to make sure the cable was 100% inserted, which is all that matters. We'll see how it goes, but going from the horrendous adapter to one cable really cleans things up.


----------



## tryout1

So, I played around a bit and found the following to be stable on my E-tier chip:
0.9V @ 2565MHz core
0.96V @ 2715MHz core
1.05V @ 2835MHz core

I even tightened the screws a bit (they seemed a tiny bit loose), which lowered temps; with the 0.9V undervolt I was sitting around 52°C, and hotspot temps were 57°C in CP2077. Still a bit envious that everybody else hits that mental 3GHz barrier, but it is what it is.


----------



## Sheyster

th3illusiveman said:


> I got a Cablemod 12VHPWR cable but they include a stupid shroud which really doesn't seem very compatible with my FE card. I could never be sure the cable was 100 seated with the shroud seemingly hitting the FE cooler and obstructing the top view so i just removed it. Even if it doesn't look as good, i was able to make sure the cable was 100% inserted which is all that matters. We'll see how it goes but going from the horrendous adapter to one cable really cleans things up.


I removed the shroud on mine as well. It moves down over the connector very easily.



doom3crazy said:


> are you saying I should be able to flash my msi card with an asus bios? I will just have to play with the DP ports until I find one that works?


Yes, only one of them should be disabled.


----------



## Xian65

dr/owned said:


> ill slam together a remount later today hopefully.
> 
> the math says their block may be bowing around the die. 2.5mm void and the metal frame of the die was 2.65mm tall on my 3090 (I’m assuming 4090 the same package dimensions). The die itself sits at 2.9mm. So that’s 0.4mm of interference fit. Bykski was .25mm, no interference with the metal frame.


Just did my own repaste as it was really bothering me. When I pulled it apart, I noticed that the pad for the rearmost coils was sort of squished sideways, so I don't know if that was affecting it. Tightened everything down and repasted with Kryonaut Extreme (vs. the regular Kryonaut originally), and now it's showing more like a 24°C delta between water and core, but that's straight after putting it in, without letting the bubbles dissipate or heat-cycling etc. Hopefully it will get better, but at least it's a far better start than before.


----------



## Benni231990

GPU-Z now has an update, so we can all upload our 4090 BIOSes.

Does anybody have the 630 W Neptune OC BIOS? If so, can you upload it here or to GPU-Z?


----------



## alasdairvfr

J7SC said:


> ...I can appreciate your frustration - for what it is worth, the 'basic design and chip' of my Gaming-OC were/are great - but there were some issues w/ assembly quality. Nothing that took very long to fix, and at the end of the day, I paid US$1,619 for it - I knew I would water-cool it with a loop ready and waiting anyway...but still, that shouldn't have really happened, much less so on the more expensive variants you showed.





J7SC said:


> ...air-cooled, water-cooled Giga-G-OC , gaming oc settings.
> View attachment 2583679


Your stock hotspot temps on that card are (were) much higher than mine; my hotspot peaks around 80°C with ambient temps between 22-24°C. You are right about the assembly issues: the thermal pads/paste are sometimes uneven or not correctly applied.


----------



## alasdairvfr

Does anyone here play Uncharted: Legacy of Thieves Collection? It runs great for me but I'm getting some odd flashing, like a single frame of white (or rarely red) semi-frequently. Tried stock clocks, DLSS on/off doesn't seem to affect it. It doesn't seem to happen in other games but I've been mostly playing Uncharted as of late.


----------



## dante`afk

Why is everyone so hungry for a 1000 W BIOS? It doesn't do anything for you without a voltmod; it's effectively useless.



dr/owned said:


> Port Royal on the Phanteks block. Inlet temperature is hard locked at 28C and doesn't increase over time:
> View attachment 2583740
> 
> 
> Arctic TP3 pads are kinda interesting. Sticky and feel more like a putty than a pad. Very soft as advertised.


That's water? Even air has better temps.



alasdairvfr said:


> Does anyone here play Uncharted: Legacy of Thieves Collection? It runs great for me but I'm getting some odd flashing, like a single frame of white (or rarely red) semi-frequently. Tried stock clocks, DLSS on/off doesn't seem to affect it. It doesn't seem to happen in other games but I've been mostly playing Uncharted as of late.


no issues here


----------



## Panchovix

tryout1 said:


> So, i played around a bit and found on my E tier chip following to be stable,
> 0.9v 2565mhz core
> 0.96v 2715mhz core
> 1.05v 2835mhz core
> 
> I even tightened the screws a bit, they seemed to be a tiny bit loose which lowered temps, with UV 0.9v i was sitting around 52 °C now and Hotspot temps were 57 °C in CP2077. Still bit envious that everybody gets that mental barrier 3ghz mark but it is what it is.


Don't worry, I probably got a D tier chip but with an F tier memory lol (I can do 3030 MHz in games but at 1.1 V so not worth it, 2940 MHz at 1.05 V otherwise). Gonna try tightening the screws since I feel I have kinda high temps with my TUF. 
It is what it is, and I can't return it since I'm here in Chile haha


----------



## PhuCCo

motivman said:


> I love the way my build looks right now... not sure if I even want a waterblock. If you guys had a choice, for the FE, will you get the EKWB waterblock, or the corsair block? Anyone have experience with corsair? Their FE block seems to be in stock and cheaper than EKWB.
> 
> *EDIT** seems like the corsair block has sold out
> 
> 
> 
> https://www.corsair.com/us/en/Categories/Products/Custom-Cooling/Blocks/GPU-Blocks/Hydro-X-Series-XG7-RGB-40-SERIES-GPU-Water-Block/p/CX-9020019-WW
> 
> 
> 
> View attachment 2583711


I have a Corsair XG7 on a 3090FE and the delta temps aren't great. I was getting like a 20C delta between water and core temps at 400W, and that is on Ampere. For comparison, my 3090FE Alphacool block is getting a 10C delta at 400W. It doesn't seem like they changed much since then in terms of flow path and density so I'm leaning toward meh performance.
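A rough way to compare those numbers across blocks and power levels is to normalize the core-to-water delta by board power, giving an effective thermal resistance in °C/W. A minimal sketch using only the figures quoted above; the 550 W projection is a hypothetical illustration, since real results shift with flow rate and mounting pressure:

```python
# Back-of-envelope comparison of water blocks via effective thermal
# resistance (core-to-water delta divided by board power). Figures are
# the rough numbers from the post above, not fresh measurements.

def thermal_resistance(delta_c: float, power_w: float) -> float:
    """Return effective core-to-water thermal resistance in C/W."""
    return delta_c / power_w

blocks = {
    "Corsair XG7 (3090 FE)": thermal_resistance(20, 400),  # 0.050 C/W
    "Alphacool (3090 FE)":   thermal_resistance(10, 400),  # 0.025 C/W
}

for name, r in blocks.items():
    # Project the delta you'd expect at 4090-class power draw (~550 W)
    print(f"{name}: {r:.3f} C/W -> ~{r * 550:.0f} C delta at 550 W")
```

Under that back-of-envelope math, the XG7 would be roughly twice the delta of the Alphacool at any given wattage, which is why the 40-series XG7 looks like a "meh" bet if the internals really are unchanged.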


----------



## dante`afk




----------



## Zero989

dante`afk said:


> why are all so hungry for a 1000w bios? it doesnt do anything for you without voltmod, absolute obsolete.


Not true. I can't even hit the 600W limit, and throttle at 540W in Cyberpunk 2077 4K.


----------



## Benni231990

Zero989 said:


> Not true. I can't even hit the 600W limit, and throttle at 540W in Cyberpunk 2077 4K.



Why do your cards all throttle? My Suprim X Air with the 600 W Liquid BIOS never throttles; my highest peak was 567 W in Port Royal.


----------



## Zero989

Benni231990 said:


> Why your cards all Throttle my Suprim X Air with 600watt Liquid Bios never throttle my highest peak was 567 in port Royal


Port Royal is not Cyberpunk 4K w/ RT lol


----------



## Benni231990

I play eFootball 2023 at 4K 144 FPS drawing 560-565 W xD, no throttle.

So CP2077 should be easy.


----------



## yzonker

Occasionally the Frame Chaser dude does something interesting. I'll save you from needing to watch the video: the interesting part is the large variance he found in power draw between cards, even the exact same model. It shows how much these chips can vary in how much they "leak", same as what der8auer showed with CPUs a while ago. 

I thought of this after reading some of the recent posts. For example, my chip is apparently very efficient, with less leakage than most. Even at 4K max settings, DLSS off, my card barely hits 500 W in CP2077 at the voltage limit. Port Royal never even hits 500 W. 

Just food for thought. 



Spoiler: Frame Chaser vid


----------



## Zero989

yzonker said:


> Occasionally the Frame Chaser dude does something interesting. I'll save you form needing to watch the video. The interesting part is the large varience he found in power draw between the cards. Even the same exact model. Shows how these chips can vary a lot in how much they "leak". Same as what DerBauer showed with CPUs a while ago.
> 
> I thought of this after reading some of the recent posts. For example my chip is apparently very efficient with less leakage than most. Even at 4k max settings, DLSS off, my card barely hits 500w in CP at the voltage limit. Port Royal never even hits 500w.
> 
> Just food for thought.
> 
> 
> 
> Spoiler: Frame Chaser vid


Are you able to bench CP2077 4K + Psycho RT, no DLSS, above 3100 MHz all the way through?

I saw that video, but who cares, when you could end up passing on a 4090 that does 2000+ more on the memory overclock. Also, not showing the V/F curves is lame.


----------



## yzonker

Zero989 said:


> Are you able to bench CB2077 4K + Psycho RT no DLSS above 3100Mhz all the way through?
> 
> I saw that video but who cares when you could end up passing up on a 4090 doing 2000+ more on the memory overclock. Also not showing the VF curves is lame.


I doubt it. I suspect I would be in the 3050-3100 range. That's the downside to a low leak chip I think. Core doesn't clock quite as high. 

FWIW, I tried to duplicate his test with SotTR and got 3105-3120. Kinda a weak test though admittedly.

Well like I said, the interesting part is the power draw. I wasn't trying to say everything he did was good because it isn't.


----------



## Zero989

yzonker said:


> I doubt it. I suspect I would be in the 3050-3100 range. That's the downside to a low leak chip I think. Core doesn't clock quite as high.
> 
> FWIW, I tried to duplicate his test with SotTR and got 3105-3120. Kinda a weak test though admittedly.
> 
> Well like I said, the interesting part is the power draw. I wasn't trying to say everything he did was good because it isn't.


I thought low leakage was best for air/liquid, and high leakage was for LN2? This is literally what GPU-Z used to indicate with its ASIC quality reads, as far back as GTX Maxwell.

I'm so confused lol. Anyway, my point with the BIOS is that I throttle at 540 W in CP2077 4K, down to 3060 MHz, and this is BS. Lower leakage would be fun to play with, probably a god of undervolting.

Port Royal I go up to as high as 530 W, I think? That ASUS Strix BIOS lets me go to 572 W. The Neptune OC-V BIOS should resolve all my issues.


----------



## Sheyster

yzonker said:


> Occasionally the Frame Chaser dude does something interesting. I'll save you form needing to watch the video. The interesting part is the large varience he found in power draw between the cards. Even the same exact model. Shows how these chips can vary a lot in how much they "leak". Same as what DerBauer showed with CPUs a while ago.
> 
> I thought of this after reading some of the recent posts. For example my chip is apparently very efficient with less leakage than most. Even at 4k max settings, DLSS off, my card barely hits 500w in CP at the voltage limit. Port Royal never even hits 500w.
> 
> Just food for thought.
> 
> 
> 
> Spoiler: Frame Chaser vid


Yup.. this has been known for a long time. Back in the "ASIC" rating days of GPU-Z, low ASIC chips were very leaky but were often the best overclockers with high voltage and exotic cooling (chillers and LN2).


----------



## motivman

yzonker said:


> Occasionally the Frame Chaser dude does something interesting. I'll save you form needing to watch the video. The interesting part is the large varience he found in power draw between the cards. Even the same exact model. Shows how these chips can vary a lot in how much they "leak". Same as what DerBauer showed with CPUs a while ago.
> 
> I thought of this after reading some of the recent posts. For example my chip is apparently very efficient with less leakage than most. Even at 4k max settings, DLSS off, my card barely hits 500w in CP at the voltage limit. Port Royal never even hits 500w.
> 
> Just food for thought.
> 
> 
> 
> Spoiler: Frame Chaser vid


Yeah, I have noticed the same thing. I have gone through four 4090s now, and the worst offender for power draw was my 4090 Gaming OC; it was drawing almost 570 W in Port Royal, while my current FE draws at most 520 W. That card also had, for some reason, a big gap between core clock and effective clock: at a 3000 MHz core clock it would show something like a 2900 MHz effective clock, while on my FE the difference is at most 30-40 MHz. The memory on the Gaming OC overclocked like a beast though, +1800 stable in Afterburner, but at the end of the day my FE with a +1600 memory overclock was faster because its effective clock was higher. Felt bad sending the Gaming OC back. I have a fifth 4090 coming next week (a Gaming OC bought directly from the Gigabyte US store), so let's play the silicon lottery one more time.


----------



## KingEngineRevUp

PhuCCo said:


> Looks like Bykski (Granzon) has a FE block for the same price as EK.. it looks interesting. Who's gonna take one for the team? 🤣
> 
> 
> https://a.aliexpress.com/_mN85tFi


This block does look pretty cool. I guess this is considered Bykski's premium line.

1. Comes with a new single-slot IO bracket
2. The backplate rear is CNC-machined with some type of fin arrangement, but it's ruined because the cover ends up covering the fins

Hopefully someone is brave enough to try it.


----------



## KingEngineRevUp

tryout1 said:


> So, i played around a bit and found on my E tier chip following to be stable,
> 0.9v 2565mhz core
> 0.96v 2715mhz core
> 1.05v 2835mhz core
> 
> I even tightened the screws a bit, they seemed to be a tiny bit loose which lowered temps, with UV 0.9v i was sitting around 52 °C now and Hotspot temps were 57 °C in CP2077. Still bit envious that everybody gets that mental barrier 3ghz mark but it is what it is.


Sounds like your core is on par with mine. My stable, no-crashing offset is +165. It's more important to overclock the memory if anything, as the trend shows memory gives the most gains, probably because at stock the memory is bottlenecking the core.


----------



## qfg77

ShadowYuna said:


> My Bykski 4090 Gaming OC Block has arrived. Pretty good performance.
> View attachment 2581565
> 
> 
> View attachment 2581566
> 
> 
> +250 core / 1400 Memory Timespy Extreme
> View attachment 2581567
> 
> 
> Core Temp never exceed 55 degree. Very happy with the block also my Gaming OC does not have coil whine so this is my best satisfaction.







Does your Bykski block make an annoying sound when fully loaded? Bizarrely, my Phanteks block on the 4090 Gigabyte OC makes a loud whining noise under full load. It was so bad I had to return the block.


----------



## Alemancio

tryout1 said:


> So, i played around a bit and found on my E tier chip following to be stable,
> 0.9v 2565mhz core
> 0.96v 2715mhz core
> 1.05v 2835mhz core





Panchovix said:


> Don't worry, I probably got a D tier chip but with an F tier memory lol


How do you calculate these tiers?


----------



## yzonker

qfg77 said:


> Does your Bykski block make an annoying sound when fully loaded? Bizarrely, my Phanteks 4090 Gigabyte OC makes a loud whining noise when fully loaded. It was so bad I had to return the block.


You mean coil whine from the card? That's probably going to happen with any block is my guess.


----------



## yzonker

Zero989 said:


> I thought low leakage was best for air/liquid? And high leaked was for LN2? This is literally what GPU-Z used to imprint with their ASIC quality reads, as far back as GTX Maxwell.
> 
> I'm so confused lol. Anyway, my point with the BIOS is that I throttle at 540W in CB2077 4K, Down to 3060Mhz, and this is BS. Lower leakage would be fun to play with, probably a god of undervolting.
> 
> Port Royal I go up to as high as 530W I think? That ASUS Strix BIOS let's me to go 572W. Neptune OC V BIOS should resolve all my issues.


Low leakage is nice for gaming if your room heats up like mine does. Probably biggest benefit.


----------



## SilenMar

yzonker said:


> Except that's wrong. The junction temp varies directly with how well the mem chips are cooled. And I've seen examples, including this one (and my own 3090 when it was air cooled), where the card will ramp the fans hard when the junction temp goes over 100C. The cards are obviously using that temp reading as part of the fan control.


This is exactly why the parameter matters enough for software such as HWiNFO64 to show it. 
I just did a test: without a thermal pad, the fans simply ramp to max speed when the memory reaches 100°C.
It's amazing to see people talk and talk without testing anything, instead repeating some crap information from a general support forum.


----------



## mirkendargen

qfg77 said:


> Does your Bykski block make an annoying sound when fully loaded? Bizarrely, my Phanteks 4090 Gigabyte OC makes a loud whining noise when fully loaded. It was so bad I had to return the block.


Blocks don't make noises, lol. You're hearing coil whine that you didn't hear before either because the fan noise covered it, or it had thermal pads somewhere that dampened the vibrations.


----------



## mirkendargen

KingEngineRevUp said:


> This block does look pretty cool. I guess this is considered Bykski premium line.
> 
> 1. Comes with new single IO bracket
> 2. Backplate rear is CNC to have some type of fin arrangement, but it's ruined because the cover ends up covering the fins
> 
> Hopefully someone is brave enough to try it.
> View attachment 2583823


It's interesting they made the fin area a lot bigger than the Bykski block, seeing how small it was compared to the other brand blocks was the first thing I noticed (although it doesn't seem to be hampering the performance). And including a single slot bracket is nice for improving physical compatibility...

But *** is with the styling? It literally looks like a boring quad m.2 riser card or something...


----------



## dante`afk

Zero989 said:


> Not true. I can't even hit the 600W limit, and throttle at 540W in Cyberpunk 2077 4K.


Read again what you wrote.

If you can't even hit 600 W, what is a 1000 W BIOS going to do for you without a voltmod?

Watch der8auer's video on the 1000 W XOC BIOS.


----------



## changboy

alasdairvfr said:


> Does anyone here play Uncharted: Legacy of Thieves Collection? It runs great for me but I'm getting some odd flashing, like a single frame of white (or rarely red) semi-frequently. Tried stock clocks, DLSS on/off doesn't seem to affect it. It doesn't seem to happen in other games but I've been mostly playing Uncharted as of late.


I don't have that issue; the game looks nice, no white or red frames. I finished the game. Good game.


----------



## lawson67

Zero989 said:


> Are you able to bench CB2077 4K + Psycho RT no DLSS above 3100Mhz all the way through?
> 
> I saw that video but who cares when you could end up passing up on a 4090 doing 2000+ more on the memory overclock. Also not showing the VF curves is lame.


Reading this, I also wonder what members' cards read at stock in GPU-Z, with no overclocking. I found that my fastest, highest-clocking RX 6900 XT also had a high stock frequency. Here is my Zotac AMP Extreme at stock; with this card I can hit +270 in Timespy and my VRAM scales all the way to +2000 MHz. What stock clocks do other members have? Mine come in higher than Zotac's slight factory overclock.


----------



## doom3crazy

Sheyster said:


> I removed the shroud on mine as well. It moves down over the connector very easily.
> 
> 
> 
> Yes, only one of them should be disabled.


So it worked. However, I noticed GPU-Z was reporting the link as PCI Express x16 1.1, and when I clicked to do the render test it stayed at 1.1, whereas with the default BIOS or the Suprim X Air BIOS it pops up to 4.0. Does this mean the ASUS BIOS isn't working properly, or is GPU-Z just not displaying it correctly? Is there another surefire way to verify whether it's actually operating at PCIe 4.0?

EDIT: Never mind, I think it's running normally. It appears to downclock the link when not under full load (which I don't think I had ever noticed before).


----------



## Krzych04650

dante`afk said:


> read again what you wrote.
> 
> if you cant even hit 600w, what is 1000w going to do for you without voltmod?
> 
> watch derb8auer vid on 1000w xbox bios.


He cannot hit 600W because those 600W BIOSes are not actually 600W, or software power readings are wrong. I am seeing the same thing; the card starts power throttling at around 540-560W depending on the application and that 600W value is never reached, not because the card doesn't need this much, but because it starts power throttling well before actually hitting 600W.

Throttling is only slight and only in heaviest games, so 630W Neptune BIOS would just about solve it, though something like 675W that would take full advantage of 600W connector + 75W PCI-E slot would be most optimal.
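For reference, the budget being described works out like this. A trivial sketch; the throttle point is taken as the midpoint of the 540-560 W range reported above, so it is illustrative rather than measured:

```python
# Rough power-budget arithmetic from the post above: what a BIOS limit
# could theoretically allow versus where cards reportedly throttle.
# All numbers are spec ceilings or forum-reported values.

CONNECTOR_W = 600          # 12VHPWR connector rating
PCIE_SLOT_W = 75           # PCI-E slot supplemental power
theoretical_max = CONNECTOR_W + PCIE_SLOT_W   # 675 W total board budget

reported_throttle = 550    # midpoint of the reported 540-560 W range
neptune_bios = 630         # Neptune OC BIOS power limit

print(f"Theoretical board budget: {theoretical_max} W")
print(f"Headroom left at the observed throttle point: "
      f"{theoretical_max - reported_throttle} W")
print(f"Extra headroom a 630 W Neptune BIOS would give: "
      f"{neptune_bios - reported_throttle} W")
```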


----------



## lawson67

doom3crazy said:


> So it worked. However, I noticed in gpuz it was reporting running at pci express x16 1,1 and then when I clicked to do the render test, it was staying at 1.1 vs with the default bios or the suprim x air bios, it pops up to 4.0 Does this mean the asus bios isn't working properly? or is gpuz just not displaying it correctly? Is there another sure fire way to verify whether its actually operating at pci 4.0 ?


Mine is showing x16 1.1 also, but I am sure in the last version it was x16 4.0. Maybe it's a bug?

Edit: it changes to x16 4.0 when I run Heaven, so it must downclock the link when not under load.


----------



## yzonker

Interesting. The 600 W MSI BIOS seems to be a lot faster in Speed Way. As soon as I flashed it on, my score jumped a bunch; the first run was higher than anything I had run before.

(SystemInfo wasn't on, I was just testing)

Unknown GPU video card benchmark result - Unknown, (3dmark.com)

So then I put more effort in and finally hit this:

11242

NVIDIA GeForce RTX 4090 video card benchmark result - Intel Core i9-13900K Processor, Micro-Star International Co., Ltd. MPG Z690 EDGE WIFI DDR4 (MS-7D31) (3dmark.com)

BUT, as best I can tell, PR sucks on that BIOS, and I couldn't get anywhere near my high score.

@Zero989 Did you find this to be true as well? I noticed your high Speedway score is on MSI, but your PR score is on Asus.


----------



## Zero989

yzonker said:


> @Zero989 Did you find this to be true as well? I noticed your high Speedway score is on MSI, but your PR score is on Asus.


I posted about this but people don't seem to care I guess? 

I said ASUS BIOS for Port Royal and MSI BIOS for Speedway. ASUS BIOS got me around 121xx in Speedway.


----------



## yzonker

Zero989 said:


> I posted about this but people don't seem to care I guess?
> 
> I said ASUS bios for Port Royal and MSI BIOS for Speedway. ASUS BIOS got me around 121xx in Speedway.


Damn, sorry I just missed that post. This thread has been moving fast. Tough to read them all. Glad I could re-invent the wheel. Thanks for confirming.


----------



## Zero989

yzonker said:


> Damn, sorry I just missed that post. This thread has been moving fast. Tough to read them all. Glad I could re-invent the wheel. Thanks for confirming.


My daily ritual: check the COLORFUL iGame GeForce RTX 4090 Neptune OC-V product page, then Tech Support -> BIOS.

I don't see the XOC BIOS leaking just yet. The Neptune OC-V BIOS will likely get uploaded to TechPowerUp.


----------



## yzonker

Zero989 said:


> My daily ritual is: COLORFUL-iGame GeForce RTX 4090 Neptune OC-V
> 
> Then tech support --> BIOS
> 
> I don't see the XOC BIOS leaking just yet. Neptune OC V BIOS will likely get uploaded by TechPowerUp.


Do you see very significant run-to-run variance? Mine seems to get a lot less consistent as I drop my water temp below 20°C, and then I see serious regression below 10°C. I was actually testing BIOSes to see if that changed anything and stumbled onto the MSI BIOS for Speed Way.


----------



## Zero989

yzonker said:


> Do you see very significant run to run varience? Seems like mine gets a lot less consistent as I drop my water temp below 20C. And then I see serious regression when I go below 10C. I was actually testing bios to see if that changed any and stumbled on to the MSI bios for Speedway.


Maybe up to 20 points. I can also see artifacts sometimes in Speed Way at 1650+ on the VRAM; they look like Christmas lights, just green and red glowing from the wall. Should also test with HAGS on/off. I never redid my pads for the VRAM on the MSI BIOS since it's so picky with temperature; it seems best to let it run on the warmer side, which is probably why I don't get regression of any kind. Sometimes when I set the VRAM too high and try to load Speed Way I just get a black screen and have to hard reboot. That's my experience thus far lol.


----------



## yzonker

Zero989 said:


> Maybe up to 20 points. I can also see artifacts sometimes in Speedway @ 1650+ VRAM. They look like Christmas lights, just green and red glowing from the wall. Should also test with HAGS on/off. I never redid my pads for VRAM on MSI due to it being so picky with temperature, seems like it's best to let it run on the warmer side. That is probably why I don't get regression of any kind. Sometimes when I set VRAM too high and try to load Speedway I just get a black screen and have to hardware reboot. That's my experience thus far lol.


I've been watching every run and haven't seen anything, but small dots might get missed. 

The MSI BIOS isn't all of it. It looks like that BIOS forced me to move one monitor to a different HDMI port, and leaving the 2nd monitor unplugged helps too, despite the 2nd monitor being powered off AND disabled in NVCP.

So it's 2 things that bumped my score.


----------



## yzonker

Zero989 said:


> Maybe up to 20 points. I can also see artifacts sometimes in Speedway @ 1650+ VRAM. They look like Christmas lights, just green and red glowing from the wall. Should also test with HAGS on/off. I never redid my pads for VRAM on MSI due to it being so picky with temperature, seems like it's best to let it run on the warmer side. That is probably why I don't get regression of any kind. Sometimes when I set VRAM too high and try to load Speedway I just get a black screen and have to hardware reboot. That's my experience thus far lol.


Also, I had previously tested HAGS off in TS and saw a big drop in score. Just tried PR and SW and saw the same.


----------



## Zero989

yzonker said:


> Also, I had previously tested HAGS off in TS and saw a big drop in score. Just tried PR and SW and saw the same.


Good to know. I wonder if that XOC BIOS will result in lower scores vs. Liquid Suprim X 600W since it's by ASUS. 

FFS need a BIOS editor and memory timings access + voltage controls.


----------



## Panchovix

Alemancio said:


> How do you calculate these tiers?


Just random, I think? I mean, A/S is the best and F is the worst.
My core is below average (2940 MHz at 1.05 V) so I ranked it D, and my VRAM can only do +1100, so I ranked it F.


----------



## yzonker

Zero989 said:


> Good to know. I wonder if that XOC BIOS will result in lower scores vs. Liquid Suprim X 600W since it's by ASUS.
> 
> FFS need a BIOS editor and memory timings access + voltage controls.


Yeah, could be it would be worse; hard to say. I'm a little surprised there is any difference, but there definitely is. I just ran this with the Strix V2 BIOS that @KedarWolf shared, with less core clock than the fastest run, so pretty close really. It was mostly the monitor. 









Strix V2:
I scored 11 192 in Speed Way
Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
www.3dmark.com

MSI:
I scored 11 242 in Speed Way
Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
www.3dmark.com


----------



## yzonker

Sorry for the flurry of posts. Last one. Managed to finally edge past my previous PR run. Really just wanted to back it up to help show the original was legit.









I scored 29 028 in Port Royal
Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
www.3dmark.com


----------



## jlodvo

Hello, I need advice on which of these 3 is the better choice. I'm going to add a waterblock and plan to get the TUF non-OC; is there a huge difference in performance? The price difference between the Strix and the TUF non-OC is kinda huge.

RTX 4090 Asus TUF (non-OC) = 2000usd
RTX 4090 Asus TUF OC = 2350usd
RTX 4090 Asus Strix OC = 2600usd

Also, does a 4090 waterblock with active backplate cooling, like the EKWB one, benefit the 4090?

And which waterblock would you recommend for design and performance? Thanks


----------



## Zero989

yzonker said:


> Yea could be it would be worse. Hard to say. I'm a little surprised there is any difference, but there definitely is. I just ran this though with the Strix V2 bios that @KedarWolf shared. Less core clock too than the fastest run. So pretty close really. Was mostly the monitor.
> 
> Strix V2: I scored 11 192 in Speed Way (www.3dmark.com)
> 
> MSI: I scored 11 242 in Speed Way (www.3dmark.com)


Run it again, but use 1.5v, try for 6.1Ghz 2 cores, 6Ghz 6 cores and 5.3 ring. It should help a tiny bit. I have yet to run it with the past 3 drivers so I'll try it too.

With ASUS vbios I never got close to 11200.


----------



## yzonker

jlodvo said:


> Hello need advise which is the better choice of this 3, going to add a waterblock, plan to get the tuf non oc, is thier a huge diff on performance? price diff is kinda huge for strix and tuf non oc
> 
> RTX 4090 Asus TUF (non-OC) = 2000usd
> RTX 4090 Asus TUF OC = 2350usd
> RTX 4090 Asus Strix OC = 2600usd
> 
> and does a 4090 gpu waterblock with active backplate cooling like the ekwb benifit the 4090?
> 
> also which waterblock has the best design and performance would you recommend thanks


I don't think there is any difference and mostly just down to the lottery. And the base TUF can be flashed with the Strix bios.

Active backplate is a complete waste of money on the 4090 IMO. The memory doesn't even like being really cool.


----------



## motivman

jlodvo said:


> Hello need advise which is the better choice of this 3, going to add a waterblock, plan to get the tuf non oc, is thier a huge diff on performance? price diff is kinda huge for strix and tuf non oc
> 
> RTX 4090 Asus TUF (non-OC) = 2000usd
> RTX 4090 Asus TUF OC = 2350usd
> RTX 4090 Asus Strix OC = 2600usd
> 
> and does a 4090 gpu waterblock with active backplate cooling like the ekwb benifit the 4090?
> 
> also which waterblock has the best design and performance would you recommend thanks


RTX 4090 Asus TUF (non-OC) = 2000usd


----------



## yzonker

Zero989 said:


> Run it again, but use 1.5v, try for 6.1Ghz 2 cores, 6Ghz 6 cores and 5.3 ring. It should help a tiny bit. I have yet to run it with the past 3 drivers so I'll try it too.
> 
> With ASUS vbios I never got close to 11200.


I did a bunch of testing. HT on/off, all core overclocks, e cores off, etc.. and found best performance at least in PR was with the CPU running in adaptive mode. 

Weirdly takes a small hit (50 pts) if I deviate from that with any of the things I tried that I listed above. Admittedly I only tested PR.


----------



## th3illusiveman

So with my new cable Afterburner is showing a 133% power limit and the voltage can now be adjusted. How high is safe for a quick bench run? FE model. it ranges from 0mv to 100mv. Of course i will always keep an eye on temps during the run. Just trying to crack 3Ghz for the heck of it.

I'm having flashbacks of when i toasted my old GTX 460 when i was new to PC building chasing a high bench score and turning up the dials to max without thinking lol.


----------



## Arizor

Temperatures being OK of course, you can max out both the power limit and voltage without worry for benchmarks, and probably most other things.

That said, as you probably know, there are massively diminishing returns in games; I'd recommend an undervolt for anything outside of benching.



th3illusiveman said:


> So with my new cable Afterburner is showing a 133% power limit and the voltage can now be adjusted. How high is safe for a quick bench run? FE model. it ranges from 0mv to 100mv. Of course i will always keep an eye on temps during the run. Just trying to crack 3Ghz for the heck of it.
> 
> I'm having flashbacks of when i toasted my old GTX 460 when i was new to PC building chasing a high bench score and turning up the dials to max without thinking lol.


----------



## Xian65

qfg77 said:


> Does your Bykski block make an annoying sound when fully loaded? Bizarrely, my Phanteks 4090 Gigabyte OC makes a loud whining noise when fully loaded. It was so bad I had to return the block.


Might be a generalisation but I think most blocks expose additional coil whine. The Phanteks block I have certainly does and almost every EK block I’ve had through the old 780 to the 3090 has done so as well. Ironically the only one I couldn’t hear was the Gigabyte 1080Ti Waterblock edition I had years ago.


----------



## KingEngineRevUp

th3illusiveman said:


> So with my new cable Afterburner is showing a 133% power limit and the voltage can now be adjusted. How high is safe for a quick bench run? FE model. it ranges from 0mv to 100mv. Of course i will always keep an eye on temps during the run. Just trying to crack 3Ghz for the heck of it.
> 
> I'm having flashbacks of when i toasted my old GTX 460 when i was new to PC building chasing a high bench score and turning up the dials to max without thinking lol.


I second undervolting the card.

I'm doing a comfortable 0.950 V @ 2715 MHz and +1500 on memory, achieved with a +165 offset and then flattening the curve.
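That "offset then flatten" procedure in Afterburner's curve editor can be sketched in code: shift the whole V/F curve up by the offset, then cap every point at or above the target voltage to the frequency at that voltage. The stock curve points below are invented for illustration (a real curve has many more points read from the card):

```python
# Toy sketch of the Afterburner-style "offset then flatten" undervolt.
# The stock V/F points are made up; only the procedure is the point.

def offset_then_flatten(curve, offset_mhz, target_v):
    """Shift the whole curve up by offset_mhz, then flatten it so no
    point at/above target_v exceeds the frequency at target_v."""
    shifted = {v: f + offset_mhz for v, f in curve.items()}
    cap = shifted[target_v]            # frequency at the target voltage
    return {v: min(f, cap) if v >= target_v else f
            for v, f in shifted.items()}

stock = {0.90: 2400, 0.95: 2550, 1.00: 2700, 1.05: 2800, 1.10: 2850}
uv = offset_then_flatten(stock, offset_mhz=165, target_v=0.95)
print(uv)  # every point at/above 0.95 V is capped to 2715 MHz
```

With these made-up stock points, a +165 offset lands the 0.95 V point at 2715 MHz, matching the settings quoted above; the flattening step is what stops the card from requesting more voltage under load.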


----------



## Sheyster

doom3crazy said:


> EDIT: Never mind. I think it's running normal. It appears to down clock when not using full power or whatever(which I don't think I have ever realized)


If you select "Prefer Maximum Performance" in NVCP > Manage 3D settings > Power Management Mode, it won't downclock. You might need to reboot before you see this behavior.


----------



## Nico67

KingEngineRevUp said:


> This block does look pretty cool. I guess this is considered Bykski premium line.
> 
> 1. Comes with new single IO bracket
> 2. Backplate rear is CNC'd to have some type of fin arrangement, but it's ruined because the cover ends up covering the fins
> 
> Hopefully someone is brave enough to try it.
> View attachment 2583823


Now that one has a sensible outlet design near the outer edge of the card 


mirkendargen said:


> It's interesting they made the fin area a lot bigger than the Bykski block, seeing how small it was compared to the other brand blocks was the first thing I noticed (although it doesn't seem to be hampering the performance). And including a single slot bracket is nice for improving physical compatibility...


A lot more, and thinner, fins at 0.3mm. Guess this one is aimed at performance rather than aesthetics.


----------



## B1gD4ddy

Does anyone know the power limit of the Trinity OC? And the Ventus? And the SG 1-Click OC?


----------



## th3illusiveman

Arizor said:


> temperatures being ok of course, you can max out both PL and volt without worrying for benchmarks, and probably most other things.
> 
> that said as you probably know, massively diminishing returns in games, I’d recommend an undervolt for anything outside of benching.





KingEngineRevUp said:


> I second undervolting the card.
> 
> I'm doing a comfortable 0.950 @ 2715 MHz and +1500 on memory. Achieved with a +165 and then flattening.


My intent is to 100% run a UV profile for the card's life; this is just for benching and epeen lol. Just wanna see that sweet 3GHz at least once.

EDIT: After giving it the beans, it looks like it tops out at +230MHz, which is about 3GHz with ~80mV (max 65C temp and 73C hotspot). Funny thing is power didn't really go above 500W in PR. Seems like a mediocre core, but it doesn't matter since it will be UV anyways.


----------



## J7SC

Finally got around to a bit of benching fun w/the 5950X (PBO only, not fixed OC). The Gigabyte G-OC with the basic Bykski block easily holds > 3100 (depending on bench, 3135+). What is weird though is the re-think I have to do on VRAM...it seems to want at least 40C _minimum_ to get to ~ +1500, but at 43C general GPU temp, there is a step down on the core. A really narrow range to operate in...took me a few hours to figure that one out, as my system gets down to below 40C at 20C ambient. Also, leaving HDR on doesn't seem to hurt a thing.

I haven't subbed all of it yet as I plan a few more back-up runs w/slightly higher temps for the VRAM...needless to add that I won't sub the Port Royal 30,337 and 29,898 scores (greyed out below) as there were, ahem, some artefacts, though 29K++ is doable.


----------



## jootn2kx

After updating GPU-Z I can see memory temperature is now showing up, but not yet in MSI Afterburner.
Any workaround to get memory temperature to work in RivaTuner Statistics Server?


----------



## Nizzen

For people who think they have a good score in Port Royal....

"3D benching as we know it is broken. Plain and simple. What are we going to do about it? RTX-4090's are great cards but they also bug very easily. We need to be better users. If you see your score jump 1k point after raising your mem clock 15mhz that is not normal and not a "lucky" run. When your entire screen is tan from an artifact spanning your entire monitor and your score increases 10% it is not a "lucky run." In my opinion after a first warning we should issue a vacation for the user.

After 2 minutes of research it is quite easy to see a bugged run. "

Source: community.hwbot.org
Around 28,000 points, with the normal scaling, seems to be the ceiling for cards on air...
Edit: around 29k with the newest driver....


----------



## KedarWolf

You can get decent scores without bugs too.

This was a totally legit run on my 7950X, and a run on the older 5950X I had before wasn't much lower.


I scored 29 200 in Port Royal (AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11) - www.3dmark.com

I scored 28 983 in Port Royal (AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com

But yes, bugged runs suck.


----------



## Nizzen

KedarWolf said:


> You can get decent scores not bugged too.
> 
> This was a totally legit run on my 7950x and another run on my older 5950x I had before wasn't much less.
> 
> I scored 29 200 in Port Royal (AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11) - www.3dmark.com
> 
> I scored 28 983 in Port Royal (AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com
> 
> But yes, bugged runs suck.


Show me the scaling with the memory OC: +500, +1000, +1500.
I've got over 31k in Port Royal, but the scaling is out of this world.

Where are people getting an extra 1000 points out of nowhere? That is the point...

Score scaling with core and memory is easy to show, if people want to. People don't want to do that, because they want their bugged runs to be legit.
On hwbot this is a big problem: the Hall of Fame is full of bugged runs, and it's almost impossible to moderate, as others are saying.


----------



## tryout1

Alemancio said:


> How do you calculate these tiers?


Sorry for the late reply, but I didn't calculate anything, to be honest. I just went with personal feeling and with what I've seen these cards do so far. A friend of mine who owns a Zotac Trinity gets almost the same average CP2077 core, around ~2950MHz at 1.1V. I'm aware my core gets about 40MHz more in pure rasterization, but for me stable means stable, not just having the biggest clocks.

The weird part is that as soon as I set 1.1V (100% on the voltage slider in Afterburner), the gap between my clock and effective clock starts to widen. As most people here, including me, have noticed, it normally stays in the 30-40MHz range, but as soon as I apply 1.1V it stretches to 80-100MHz, even at stock clocks (no offset, just 100% on the voltage slider).


----------



## ShadowYuna

qfg77 said:


> Does your Bykski block make an annoying sound when fully loaded? Bizarrely, my Phanteks 4090 Gigabyte OC makes a loud whining noise when fully loaded. It was so bad I had to return the block.


No, my Bykski block is as quiet as before. There is coil whine if I put my ear next to the card under full load, but with the case closed I can't hear it. It was the same with the Bykski WB as well.


----------



## Edge0fsanity

Has anyone made their own 12VHPWR cable yet? I have everything I need to make one but was unsure of a couple of things.
1. I can't seem to find much on how the sense pins need to be wired. It seems both of them just need a ground from the PSU?
2. Two or three plugs to the PSU? I have a 1600W Super Flower SE. It seems two would work just fine given they're all pulling on the same rail. If we get a 1000W BIOS I will run that.


----------



## Benni231990

Nizzen said:


> Around 28000 points, and the normal scaling to be gone with cards on air...


Yes, that's my opinion too. My 28,600 is absolutely correct; when I see others here with lower core and RAM clocks but over 29k, it can't be right.


----------



## yzonker

This must be what I'm running into with my chiller. Yesterday I found there is a big drop in score/efficiency when I hit around 10C water, but it seems to vary from bench to bench as to when it hits. In 3DMark, PR seems to take the hit earliest.


Giant 3D elephant in the room no one wants to talk about... (community.hwbot.org)

This makes it really tricky for me to get the best score as I have to get the water temp just right.



Nizzen said:


> Show med the scaling with the memory OC. +500 +1000 +1500
> I've got over 31k in port Royal, but the scaling is out of the world.
> 
> Where are people getting extra 1000 points out of nowhere. This is the point...
> 
> Score scaling with core and memory is easy to show, if people want. People don't want to do that, because they want their bugged run to be legit.
> On hwbot, this is a big problem. Hall of Fame is full of bugged runs, and almost impossible to moderate like others are saying.


We have several people here now who have posted scores around 29k. I'm convinced those are legit after all the runs I've made as well. I can barely break 29k, but high 28s are very easy. If I have time today, I'll do your mem scaling test. Scores probably won't be quite 29k due to the difficulty in getting there, but I should easily be able to do 28.7k to 28.9k with some test runs.

And I've already been there in regards to plotting fps from the runs. The 3DMark files saved on your local drive are just zip files with a csv in there. 

The graph below is my original 29011 run in PR plotted with a 28.8k run I did later on. I was looking at this trying to determine why I couldn't get back to 29k. My first thought was possibly one small section was artifacted and boosted my score, but you can see the plots lay on top of each other reasonably well. The sample rate from the 3DMark file is only 1 Hz, so in areas where fps changes quickly, they don't line up too well, but in more constant sections you can see where the 28.8k run is lagging slightly.

In the end, the difference is mostly the regression I'm seeing by going too low on water temp.

Red is 28.8k, blue is 29k.
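For anyone wanting to do the same comparison, the approach above can be sketched in a few lines of Python. The layout (a zip archive containing a CSV with a per-second fps column) follows the description in the post; the exact CSV member name and column header inside a 3DMark result file vary, so both are treated as assumptions here rather than documented internals.

```python
import csv
import io
import zipfile

def load_fps(path, fps_column="fps"):
    """Extract per-sample fps values from a locally saved 3DMark result.

    The result file is treated as a plain zip containing a CSV (per the
    post above); the CSV filename and column header are assumptions, so
    we take the first .csv member and a caller-supplied column name.
    """
    with zipfile.ZipFile(path) as zf:
        # Grab the first CSV member, whatever it is named.
        csv_name = next(n for n in zf.namelist() if n.lower().endswith(".csv"))
        with zf.open(csv_name) as f:
            reader = csv.DictReader(io.TextIOWrapper(f, encoding="utf-8"))
            return [float(row[fps_column]) for row in reader if row.get(fps_column)]
```

Overlaying two runs is then just two `plt.plot(load_fps(...))` calls in matplotlib, which is essentially the red/blue comparison shown here.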


----------



## Nizzen

yzonker said:


> This must be what I'm running in to with my chiller. Yesterday I found that there is a big drop in score/efficiency when I hit around 10C water. But it does seem to vary from bench to bench as to when this hits. in 3DMark, PR seems to take the hit the earliest.
> 
> Giant 3D elephant in the room no one wants to talk about... (community.hwbot.org)
> 
> This makes it really tricky for me to get the best score as I have to get the water temp just right.
> 
> 
> 
> We have several people here now that have posted scores around 29k. I'm convinced those are legit after all the runs I've made as well. I can barely break 29k, but high 28's are very easy. If I have time today, I'll do your mem scaling test. Scores won't be quite 29k probably due to the difficulty in getting there, but I should easily be about to do 28.7k or 28.9k with some test runs.
> 
> And I've already been there in regards to plotting fps from the runs. The 3DMark files saved on your local drive are just zip files with a csv in there.
> 
> The graph below is my original 29011 run in PR plotted with a 28.8k run I did later on. I was looking at this trying to determine why I couldn't get back to 29k. My first thought was possibly one small section was artifacted and boosted my score, but you can see the plots lay on top of each other reasonably well. The sample rate from the 3DMark file is only 1 Hz, so in areas where fps changes quickly, they don't line up too well, but in more constant sections you can see where the 28.8k run is lagging slightly.
> 
> In the end, the difference is mostly the regression I'm seeing by going too low on water temp.
> 
> Red is 28.8k, blue is 29k.
> 
> View attachment 2583994


Please try 3000MHz core with +0, +500, +1000, and +1500 on memory.


----------



## Tideman

Panchovix said:


> Just random I think? I mean, A/S is the best and F is the worst.
> My core is below average (2940MHz at 1.05V) so I ranked it D, and my VRAM can only do +1100, so I ranked it F.


Then mine must be an F- at only +700..


----------



## Nizzen

Same ~3045 core on every run
27584p +0 mem
27989p +500mem
28353p +1000mem
28639p +1500mem
28790p +1700mem

This is the scaling for me with the 4090 Strix OC.
A bit better with the new driver, but you get the point of the scaling.
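This kind of scaling check can be automated: compute points gained per memory step and flag any step whose gain is far above the trend (the "1k points from +15MHz" case called out earlier in the thread). A quick sketch using the run data above; the 1.4x-over-median threshold is an arbitrary choice for illustration, not an established rule.

```python
# Port Royal score vs. memory offset (the 4090 Strix OC numbers above).
runs = [(0, 27584), (500, 27989), (1000, 28353), (1500, 28639), (1700, 28790)]

def flag_suspect_steps(runs, tolerance=1.4):
    """Return memory-offset steps whose points-per-MHz gain exceeds the
    median gain by `tolerance` (an arbitrary threshold for this sketch).
    A bugged run shows up as an absurd gain over a tiny offset change."""
    gains = []
    for (m0, s0), (m1, s1) in zip(runs, runs[1:]):
        gains.append(((m0, m1), (s1 - s0) / (m1 - m0)))  # points per +1 mem
    rates = sorted(rate for _, rate in gains)
    median = rates[len(rates) // 2]
    return [step for step, rate in gains if rate > tolerance * median]

print(flag_suspect_steps(runs))  # smooth scaling -> prints []
```

On the numbers above every step gains roughly 0.6-0.8 points per MHz, so nothing is flagged; a run that jumped 1000 points over a +15MHz step would stand out immediately.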


----------



## LordGurciullo

This will either help people do this or hopefully give you a laugh.
Thanks for watching! Stay Zorz out there!


----------



## Zero989

526.98 no bueno in Overwatch. I get weird driver crashes and crashes that claim my driver has been hijacked.


Overwatch has crashed in the graphics driver: "Also getting same issue with 13900K and RTX 4090. Tried multiple Nvidia drivers." (us.forums.blizzard.com)


----------



## yzonker

Nizzen said:


> Same ~3045 core on every run
> 27584p +0 mem
> 27989p +500mem
> 28353p +1000mem
> 28639p +1500mem
> 28790p +1700mem
> 
> This is the scaling for me with 4090 strix OC.
> A bit better with now driver, but you guys get the point of this scaling
> 
> 
> 
> View attachment 2584003
> 
> 
> View attachment 2584002
> 
> View attachment 2584001
> 
> View attachment 2584000


I guess I don't understand. That looks perfectly reasonable and you're approaching 29k.


----------



## Nizzen

yzonker said:


> I guess I don't understand. That looks perfectly reasonable and you're approaching 29k.


Looks like the "lucky" runs are 29k+.
As long as the scaling is fairly constant, the score is OK.


----------



## jootn2kx

Highest I can get with my fully stable gaming profile, Gainward non-GS model.
Mostly I'm around 3045 core and +2000 memory.
Not bad for a cheap card, I guess lol


----------



## Nizzen

KedarWolf said:


> You can get decent scores not bugged too.
> 
> This was a totally legit run on my 7950x and another run on my older 5950x I had before wasn't much less.
> 
> I scored 29 200 in Port Royal (AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11) - www.3dmark.com
> 
> I scored 28 983 in Port Royal (AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10) - www.3dmark.com
> 
> But yes, bugged runs suck.


Ln2 vs Air:
Almost same score...

Result: www.3dmark.com


----------



## GRABibus

Nizzen said:


> For people that think they have a good score in Port royal....
> 
> "3D benching as we know it is broken. Plain and simple. What are we going to do about it? RTX-4090's are great cards but they also bug very easily. We need to be better users. If you see your score jump 1k point after raising your mem clock 15mhz that is not normal and not a "lucky" run. When your entire screen is tan from an artifact spanning your entire monitor and your score increases 10% it is not a "lucky run." In my opinion after a first warning we should issue a vacation for the user.
> 
> After 2 minutes of research it is quite easy to see a bugged run. "
> 
> Giant 3D elephant in the room no one wants to talk about... (community.hwbot.org)
> 
> 
> Around 28000 points, and the normal scaling to be gone with cards on air...
> Edit: Around 29k with the newest driver....


OK.
And for those who think they have a bad score?


----------



## GRABibus

Nizzen said:


> Looks like "lucky" runs is 29k+
> As long as the scaling is pretty constant, the score is ok.


I can reproduce 29K+ (29,000-29,100) in PR on air with my Gigabyte Gaming OC and 5950X on the latest drivers, every time I bench (no lucky runs).
No artefacts, +1650MHz on memory (from +1700MHz my score starts to decrease).


I scored 29 077 in Port Royal (AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11) - www.3dmark.com





Regular runs here: 3DMark.com search (www.3dmark.com)


As you can see, my card boosts high only with +225MHz on the core (+240MHz crashes most of the time).
This is a good sample with a high boost clock out of the box.


----------



## GRABibus

J7SC said:


> Finally got around to a bit of benching fun w/the 5950X (PBO only, not fixed OC). The Gigabyte G-OC with the basic Bykski block easily holds > 3100 (depending on bench, 3135+). What is weird though is the re-think I have to do on VRAM...it seems to want to have at least 40C _minimum_ to get to ~ +1500, but at 43 C general GPU temp, there is a step down on the core...a really narrow range to operate in...took me a few hours to figure that one out as my system gets down to below 40C at 20C ambient. Also, leaving HDR on seems to not hurt a thing.
> 
> I haven't subbed all of it yet as I plan on a few more back-up runs w/slightly higher temps for the VRAM...needless to add that I won't sub the Port Royal 30,337 and the 29,898 scores (greyed out below) as there were, ahem, some artefacts , though 29K++ is doable.
> View attachment 2583956


Links for PR?


----------



## yzonker

Nizzen said:


> Ln2 vs Air:
> Almost same score...
> 
> Result: www.3dmark.com


That's not surprising when I consider:

1. My card takes a 200-300 pt hit in PR when my water temp drops below 10C. I'm assuming the LN2 run sees the same or more.

2. Core clock on the 4090 does not scale well. There's only a 150MHz difference in core between those two runs, so after taking a hit from the low temps, a lot of that gain is lost.

The reality is that cold benching the 4090 just isn't as effective as it used to be.


----------



## lawson67

Zotac released a new BIOS, so I flashed it, and now my card stutters like a ***** in some games. To flash back the old BIOS, do I have to go to Device Manager and uninstall the card before using NVFlash? Or is there a certain point where I have to disable the card in Device Manager, i.e. after backing up the old BIOS and before flashing the new one? I've watched loads of videos on NVFlash and they all do different things. I've been team red for years until the 4090, so I can flash any AMD card, but I don't want to get it wrong with my 4090. Any help would be appreciated.

Also, why were all the new 4090 BIOSes released? What do they fix? Zotac gives me no information at all, so I'm wondering if I even need the update.
Thanks for any help.


----------



## dante`afk

Mem doesn't like cold at all, for me at least.

I have to go down about 300MHz on memory with cold water.


----------



## KingEngineRevUp

J7SC said:


> Finally got around to a bit of benching fun w/the 5950X (PBO only, not fixed OC). The Gigabyte G-OC with the basic Bykski block easily holds > 3100 (depending on bench, 3135+). What is weird though is the re-think I have to do on VRAM...it seems to want to have at least 40C _minimum_ to get to ~ +1500, but at 43 C general GPU temp, there is a step down on the core...a really narrow range to operate in...took me a few hours to figure that one out as my system gets down to below 40C at 20C ambient. Also, leaving HDR on seems to not hurt a thing.
> 
> I haven't subbed all of it yet as I plan on a few more back-up runs w/slightly higher temps for the VRAM...needless to add that I won't sub the Port Royal 30,337 and the 29,898 scores (greyed out below) as there were, ahem, some artefacts , though 29K++ is doable.
> View attachment 2583956


Yes, use RivaTuner and HWiNFO64 instead. You can also display your water temperature sensors and other things.

IMO it's better, but it takes more work to customize.


----------



## yzonker

lawson67 said:


> Zodiac released a new bios so i flashed it now my card stutters like a *** in some games, to flash back the old bios do i have to i have to go to device manager and uninstall the card before using NVflash? or is there a certain point i have to disable the card in device manager (IE) after backing up the old bios? and before you flash the new bios?, ive watched loads of videos on NVflash all do different things, ive been team red for years untill 4090 so can flash any AMD card but don't want to get it wrong with my 4090 any help would be apricated
> 
> Also why were all the new 4090 bios released what does it fix? zotac give me no information at all, so wondering if i even need the new bios update?
> thanks for any help


NVFlash disables the card before the "--protectoff" and the actual flash command, but I always manually disable it just before the flash command anyway. But since it re-enables the card after the "--protectoff" command, there's no point in doing it manually sooner. And no, no reason to update if you are not having any issues. I think they are mostly patching them for compatibility (black screening, etc...).


----------



## KingEngineRevUp

So what does 3dmark have to do? Do they need to just wipe all the 4090 scores from the top leaderboards for a clean slate?


----------



## Nizzen

KingEngineRevUp said:


> So what does 3dmark have to do? Do they need to just wipe all the 4090 scores from the top leaderboards for a clean slate?


No one knows yet. They are talking about forcing ECC on to run benchmarks.


----------



## Benni231990

Nizzen said:


> They are talking about forcing ecc=on to run benchmarks.


With ECC on, say goodbye to 29-30k.

We will see 27k if you're lucky in the memory silicon lottery.


----------



## KingEngineRevUp

Benni231990 said:


> with ECC on say goodbye to 29-30k
> 
> we will see 27 when you have luck in the Memory silicon lottery


But what if that's what the cards are realistically running at? If we run game benchmarks with ECC on and off, what if it doesn't impact actual games?

If that's true, then that's the right way for us to do it.


----------



## Benni231990

I have great news: I "maybe" have a source who will upload the Neptune 630W BIOS.




KingEngineRevUp said:


> But what if that's what the cards are realistically running at? Like if we do game benchmark with ECC on and off, what if it doesn't impact actual games?


In my opinion, maybe +1400-1500 is realistic; everything above that is silicon lottery. At that point ECC kicks in and you lose performance if you have bad RAM.


----------



## KingEngineRevUp

Benni231990 said:


> I have great news i have "Maybe" a Source how will upload the Neptune 630watt bios
> 
> 
> 
> In my opinion i think maybe +1400-1500 is realistic everthing above is silicon lottery on this point the EEC will start and you get perfomance lost when you have bad ram


Any links to discussions?


----------



## Panchovix

Tideman said:


> Then mine must be an F- at only +700..


Wow, we got really unlucky with the VRAM. And now imagine with ECC enabled, probably near-stock VRAM levels lol


----------



## narrn2761

Generally, what is the highest and safest 24/7 memory OC one can expect on a TUF OC?

Currently I am at +1200 memory. Wondering how much I can go higher for daily gaming use?


----------



## Tideman

Panchovix said:


> Wow, we got really unlucky there with the VRAM  and now imagine with ECC enabled, probably near stock VRAM levels lol


Yeah I'm pretty shocked actually. I'm not a bencher but with vram this bad, I'm bound to be taking a hit in games as well..


----------



## keikei

Hey guys, would y'all recommend upgrading from a 3700X if I game at 4K/144Hz, or should I just hold off? I hear this card gets bottlenecked even at that res to an extent.


----------



## GRABibus

keikei said:


> Hey guys, would ya'll recommend me upgrade a 3700X if I game @4k/144hz or just hold off? I hear this card even bottlenecks @ that res to an extent.


You will bottleneck your 4090 severely even at 4K.
An upgrade to 5800X3D could be a good option.


----------



## shadow85

Just tried +1400 mem on my TUF OC; it instantly artifacted and crashed in-game.
I'm on the stock BIOS. Am I supposed to flash a different BIOS to get +1400 mem?


----------



## pat182

STRIX pretty much stock, just maxed the sliders and +100 core without any fan or temperature optimization. Looks like I'm scoring more than average. Pretty happy.


----------



## Panchovix

Tideman said:


> Yeah I'm pretty shocked actually. I'm not a bencher but with vram this bad, I'm bound to be taking a hit in games as well..


At least you're not a bencher; I love to bench but my VRAM kills any hope lol. And yeah, +500 gives the biggest jump; the jump from +500 to +1000 is a little smaller.




keikei said:


> Hey guys, would ya'll recommend me upgrade a 3700X if I game @4k/144hz or just hold off? I hear this card even bottlenecks @ that res to an extent.


Whatever doesn't crash or give artifacts. I run +1000MHz (only because my max is +1100MHz); on my old 3080 I ran +1500 for about two years and had no issues.


----------



## gaddster

alasdairvfr said:


> Does anyone here play Uncharted: Legacy of Thieves Collection? It runs great for me but I'm getting some odd flashing, like a single frame of white (or rarely red) semi-frequently. Tried stock clocks, DLSS on/off doesn't seem to affect it. It doesn't seem to happen in other games but I've been mostly playing Uncharted as of late.


I do, and after changing my display from 12bpc to 10bpc in the NV Control Panel it stopped completely.


----------



## gaddster

SilenMar said:


> It's definitely not junk data. After I put better thermal pads on the memory, the GPU core and Hotspot temperature rises about 2C but the Memory Junction temps gets 1C lower. It also takes longer time for the Memory Junction to rise up. The heat dissipate capacity of the AIO is just as low as 500W instead of 600W. If you cool the memory, the core will be heated more.
> 
> If you flash a 600W BIOS, the AIO card needs to have the fans to spin a lot faster and a lot louder than the air cooled card in order to keep the same temperature.
> 
> Here is the genius solution of Gigabyte to "solve" the temp instead of putting better effort at the actual cooling:
> Gigabyte simply uses a worse thermal pads on the memory and on VRAM, then show the sensor for the hot spot that is closest to the die. The temperatures will appear fine for the core and that hot spot but everything else is heating up.



So it looks like Gigabyte learned nothing after also using junk thermal pads on their 3090s. I had the misfortune of owning one; I fixed it by replacing the pads and got huge reductions in temps. When I took it apart they had left so much greasy residue behind, really low quality. Very much the same here: the core was OK but memory temps were over 100C, etc.


----------



## AdamK47

I see this problem way too often here. People post clock speeds in terms of additional boost offsets set through MSI Afterburner or similar tools.

Stating you can get +200 core is meaningless if you don't mention what card and what BIOS you use, since cards have a wide range of default boost clocks. Even then, most people here don't know off-hand what base boost each card/BIOS ships with. For example, the Gaming X Trio starts at 2610MHz, the Asus TUF at 2595MHz, and the Suprim Liquid X at 2640MHz, and those are just the OC cards. Cards at the reference 2520MHz are of course going to do +200, since by default they are set so low.

The same goes for power percentages. Some BIOSes start at 450W and some at 480W; stating an additional percentage is pointless without knowing the starting point.

/rant
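The point above can be made concrete with a tiny helper: an offset only means something relative to the BIOS's default boost clock. The table values come straight from the post (the `absolute_boost` helper itself is just an illustration, not anyone's tool):

```python
# Default boost clocks (MHz) by card/BIOS, from the figures quoted above.
DEFAULT_BOOST = {
    "reference": 2520,
    "Gaming X Trio": 2610,
    "Asus TUF": 2595,
    "Suprim Liquid X": 2640,
}

def absolute_boost(card, offset_mhz):
    """Translate an Afterburner-style '+N' core offset into the
    absolute boost clock it actually requests on a given card."""
    return DEFAULT_BOOST[card] + offset_mhz

# The same "+200" lands in very different places depending on the card:
for card in DEFAULT_BOOST:
    print(f"{card}: +200 -> {absolute_boost(card, 200)} MHz")
```

A "+200" on a reference card requests 2720MHz, while the same offset on a Suprim Liquid X requests 2840MHz, which is exactly why quoting offsets without the card/BIOS is meaningless.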


----------



## Sheyster

Nizzen said:


> People don't want to do that, because they want their bugged run to be legit.
> *On hwbot, this is a big problem. Hall of Fame is full of bugged runs, and almost impossible to moderate like others are saying.*


If something is hackable, people will exploit it, plain and simple. This happens with anything competitive, it's just human nature. Always has been and always will be.

I know this sounds pessimistic AF, but I guess I'm just getting old and losing faith in humanity.


----------



## Sheyster

narrn2761 said:


> Generally, what is the highest and safest 24/7 memory OC one can expect on a TUF OC?
> 
> Currently I am at +1200 memory. Wondering how much I can go higher for daily gaming use?


You don't need to go higher for gaming. Memory scaling (FPS) after +1200 is very small.

My gaming OC is 3000 core with +1000 memory, and I usually don't even bother with most games. The 4090 is a beast at stock clocks!


----------



## ianann

Hi, first post and a bit late to the party but here are some loose measurements I got from my ek-waterblocked 4090 FE. 

Just installed my FE Block (non-active) yesterday and did a quick 20-minute Port Royale Stresstest.
After that, HWinfo reported as follows:


GPU Temperature max: 46.1°C
GPU Memory Junction Temperature max: 44.0°C
GPU Hot Spot Temperature max: 54.1°C
GPU Rail Power max: ~562W
Water temperature max: ~30°C
Flow rate about 160 l/h (for acoustic reasons; if I go "all in" at about 330 l/h, temps drop a bit more)

When I run OCCT Stresstest, Temps are 'skyrocketing' at

GPU Temperature max: 53.8°C
GPU Memory Junction Temperature max: 46.0°C
GPU Hot Spot Temperature max: 68.0°C
GPU Rail Power max: ~620W
Still okay I guess as I most likely won't ever torture my card with the latter more than for just stability tests. 
What's obvious is, that the core really pushes out a lot of heat when under heavy load. Good to see though, it's still manageable.

Idling, the GPU hovers around the water temperature while the hotspot is about 7K hotter. Interesting: Memory Junction shows a lower temperature than water, which is most likely down to coarse measurement at lower temps, I guess?

Overall I haven't been too happy with EKWB quality in recent years: parts got ridiculously pricey while quality decreased substantially. It was the only block available in Europe last week, so I went with it and purchased directly from their website. Quality-wise it's fine though: great looking, nice 1-slot bracket; some Phillips screw heads look crazily warped but worked as intended. Looks like they're cutting price/quality even on the screws.

Btw: the "new" and fixed blocks have a "II" engraved on the cooler. The first batch of blocks was a little off, hence they put some 0.8mm pads and an instruction sheet in the box.

I went with Alphacool Apex on the die and Alphacool Rise Ultra Soft 7W/mK on the other components. Purchased some Arctic TP-3 for some future testing though.

Still waiting for my native 12VHPWR cable to arrive someday soon.

Cheers.


----------



## PhuCCo

ianann said:


> Hi, first post and a bit late to the party but here are some loose measurements I got from my ek-waterblocked 4090 FE.
> 
> Just installed my FE Block (non-active) yesterday and did a quick 20-minute Port Royal stress test.
> After that, HWinfo reported as follows:
> 
> 
> GPU Temperature max: 46,1°C
> GPU Memory Junction Temperature max: 44,0°C
> GPU Hot Spot Temperature max: 54,1°C
> GPU Rail Power max: ~562W
> Water temperature max: ~30°C
> Flow Rate about 160 l/h (for acoustic reasons, if I go "all in" with about 330 l/h temps drop a bit more)
> 
> When I run OCCT Stresstest, Temps are 'skyrocketing' at
> 
> GPU Temperature max: 53,8°C
> GPU Memory Junction Temperature max: 46,0°C
> GPU Hot Spot Temperature max: 68,0°C
> GPU Rail Power max: ~620W
> Still okay I guess as I most likely won't ever torture my card with the latter more than for just stability tests.
> What's obvious is, that the core really pushes out a lot of heat when under heavy load. Good to see though, it's still manageable.
> 
> Idling GPU hovers around the water temperature while Hotspot is about 7K hotter. Interesting: Memory Junction shows a lower temperature than water, which is most likely due to coarse measurings at lower temps I guess?
> 
> Overall I wasn't too happy with the ekwb quality in the last years, parts got ridiculously pricey while quality decreased substantially. It was the only available in europe in the last week so I went with it and purchased directly from their website. Quality-wise it's fine though, great looking, nice 1-slot-bracket, some philips-screwheads look crazily warped but worked as intended. Looks like they're even cutting price/quality on screws here.
> 
> Btw: The "new" and fixed blocks have a "II" engraved on the cooler. The first batch of blocks were a little off hence they put some 0,8mm pads and a manual sheet in the box.
> 
> I went with alphacool Apex on the Die and alphacool Rise Ultra Soft 7W/mK on the other components. Purchased some Arctic TP-3 for some future testing though.
> 
> Still waiting for my native 12VHPWR cable to arrive someday soon.
> 
> Cheers.


So the proper EK block doesn't improve deltas... you're at about a 16C delta at 400W, which matches the rest of us who have tested the first revision. Thanks for posting results.


----------



## J7SC

Sheyster said:


> If something is hackable, people will exploit it, plain and simple. This happens with anything competitive, it's just human nature. Always has been and always will be.
> 
> I know this sounds pessimistic AF, but I guess I'm just getting old and losing faith in humanity.





Nizzen said:


> Show med the scaling with the memory OC. +500 +1000 +1500
> I've got over 31k in port Royal, but the scaling is out of the world.
> 
> Where are people getting extra 1000 points out of nowhere. This is the point...
> 
> Score scaling with core and memory is easy to show, if people want. People don't want to do that, because they want their bugged run to be legit.
> On hwbot, this is a big problem. *Hall of Fame is full of bugged runs, and almost impossible to moderate like others are saying.*


Now hold on just a damn minute; I am laughing out loud at some of the hypocrisy of this. As a former HWbot elite (sub-zero) member years back...


Spoiler
 ...the issue at HWbot is that bugged runs by newbies with their 4090 upset the fierce internal rivalry by top overclockers that incidentally also have corporate support at stake.

While I have been out of the sub-zero game for a long time, I have even recently seen top sub-zero overclockers sub highly artefact-laden results per their own vids. Also, there at least used to be 'special' versions of NVInspector not available for general public that clearly changed the workload (and visual appearance) of a bench so much that it was hard to recognize...long before Port Royal ever appeared on the scene. Prior, folks used to complain about 'ES' hardware being allowed in, or NVCP's performance settings vs AMD, or..., or...

To reiterate my earlier posts, I am not condoning subbing bugged runs - otherwise I would, for example, have PR runs at well over 30.3k at 3DM. But having to listen to current HWBot members and others whining about the newbies with their RTX 4090s upsetting their applecart is a bit much to take... I am not in the habit of posting old but private PMs, but I still have some from currently active top-10 HWBot OCers asking me to make certain runs with a given hardware combo so that it would block one of their competitors' point progress in a for-money competitive event. It's a fiercely competitive environment.

I have said before that the policing (and fixing) of bugged-run issues can only be carried out by 3DMark staff themselves... after all:
a.) it is their software,
b.) it gives out the 'valid' checkmark (or not),
c.) they collect far more minute information per individual run than even the user sees / has access to, and
d.) they have ALL that information for ALL users.

This vigilante stuff by individuals whose nose is out of joint because of alleged 'HWBot interference' is hilarious. I would think that 3DM staff is already working on a newer Port Royal and/or Systeminfo version as we speak. 

Since I have several legit 29K++ runs not yet subbed (it is a good idea to keep back-ups for a rainy day), they may go stale if I don't sub them in time when a new 3DM version comes out... so be it; the primary purpose of my 4090 is to power and fill the 4K120 OLED, and besides, if everyone else has to run the new version, it is a level playing field again (for a while at least).

In the meantime, it is up to the user to be honest and not sub runs which had noticeable artefacts - even if they're sure it is not a level playing field because others do sub them. Finally, the 4090s really do behave very differently from even the 3090s and 2080 Tis re. temps, especially around VRAM vs. core. So LN2 and such may get you higher core speed, but it screws with your VRAM settings.


----------



## ianann

It seriously is a shame that both Watercool and Aqua Computer still look to be in an engineering state, leaving only EKWB, Bykski and Alphacool to release at least some blocks. If not made in Germany, at least made in Europe with EKWB. Tzzzz ...

Anyone already used those new Arctic TP-3 Pads?


----------



## Nizzen

J7SC said:


> Now hold on just a damn minute; I am laughing out loud at some of the hypocrisy of this. As a former HWbot elite (sub-zero) member years back...
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2584044
> 
> 
> 
> ...the issue at HWbot is that bugged runs by newbies with their 4090 upset the fierce internal rivalry by top overclockers that incidentally also have corporate support at stake.
> 
> While I have been out of the sub-zero game for a long time, I have even recently seen top sub-zero overclockers sub highly artefact-laden results per their own vids. Also, there at least used to be 'special' versions of NVInspector not available for general public that clearly changed the workload (and visual appearance) of a bench so much that it was hard to recognize...long before Port Royal ever appeared on the scene. Prior, folks used to complain about 'ES' hardware being allowed in, or NVCP's performance settings vs AMD, or..., or...
> 
> To reiterate my earlier posts, I am not condoning subbing bugged runs - otherwise I would, for example, have PR runs at well over 30.3k at 3DM. But having to listen to current HWBot members and others whining about the newbies with their RTX 4090s upsetting their applecart is a bit much to take... I am not in the habit of posting old but private PMs, but I still have some from currently active top-10 HWBot OCers asking me to make certain runs with a given hardware combo so that it would block one of their competitors' point progress in a for-money competitive event. It's a fiercely competitive environment.
> 
> I have said before that the policing (and fixing) of bugged run issues can only be carried out by 3DMark staff themselves....after all, their system:
> a.) is their software
> b.) gives out the 'valid' checkmark (or not)
> c.) they collect far more minute information per individual run than even the user sees / has access to, and
> d.) have ALL that information for ALL users.
> 
> This vigilante stuff by individuals whose nose is out of joint because of alleged 'HWBot interference' is hilarious. I would think that 3DM staff is already working on a newer Port Royal and/or Systeminfo version as we speak.
> 
> Since I have several legit 29K++ runs not subbed yet (it is a good idea to keep back-ups for a rainy day), they may go stale if I don't sub them in time when a new 3DM version comes out...so be it; the primary purpose of my 4090 is to power and fill the 4K120 OLED, and besides, if everyone else has to run the new version, it is level-playing field again (for a while at least).
> 
> In the meantime, it is up to the user to be honest and not sub runs which had noticeable artefacts - even if they're sure it is not a level playing field because others do sub them. Finally, the 4090s really do behave very differently from even the 3090s and 2080 Tis re. temps, especially around VRAM vs. core. So LN2 and such may get you higher core speed, but it screws with your VRAM settings.


From *Leeghoofd* 
"However what I find more worrying is that some of our "experienced" clockers posted their bugged results at the Hall of Fame.... they should lead by example, not like this... E-peen over common sense"


----------



## KingEngineRevUp

ianann said:


> Hi, first post and a bit late to the party but here are some loose measurements I got from my ek-waterblocked 4090 FE.
> 
> Just installed my FE Block (non-active) yesterday and did a quick 20-minute Port Royal stress test.
> After that, HWinfo reported as follows:
> 
> 
> GPU Temperature max: 46,1°C
> GPU Memory Junction Temperature max: 44,0°C
> GPU Hot Spot Temperature max: 54,1°C
> GPU Rail Power max: ~562W
> Water temperature max: ~30°C
> Flow Rate about 160 l/h (for acoustic reasons, if I go "all in" with about 330 l/h temps drop a bit more)
> 
> When I run OCCT Stresstest, Temps are 'skyrocketing' at
> 
> GPU Temperature max: 53,8°C
> GPU Memory Junction Temperature max: 46,0°C
> GPU Hot Spot Temperature max: 68,0°C
> GPU Rail Power max: ~620W
> Still okay I guess as I most likely won't ever torture my card with the latter more than for just stability tests.
> What's obvious is, that the core really pushes out a lot of heat when under heavy load. Good to see though, it's still manageable.
> 
> Idling GPU hovers around the water temperature while Hotspot is about 7K hotter. Interesting: Memory Junction shows a lower temperature than water, which is most likely due to coarse measurings at lower temps I guess?
> 
> Overall I wasn't too happy with the ekwb quality in the last years, parts got ridiculously pricey while quality decreased substantially. It was the only available in europe in the last week so I went with it and purchased directly from their website. Quality-wise it's fine though, great looking, nice 1-slot-bracket, some philips-screwheads look crazily warped but worked as intended. Looks like they're even cutting price/quality on screws here.
> 
> Btw: The "new" and fixed blocks have a "II" engraved on the cooler. The first batch of blocks were a little off hence they put some 0,8mm pads and a manual sheet in the box.
> 
> I went with alphacool Apex on the Die and alphacool Rise Ultra Soft 7W/mK on the other components. Purchased some Arctic TP-3 for some future testing though.
> 
> Still waiting for my native 12VHPWR cable to arrive someday soon.
> 
> Cheers.


Those are pretty good results. That thermal paste must be working for you; it seems to take 2-3C off your core temperatures.

I used MX-6, so I'm running 2-3C hotter than you.


----------



## ianann

KingEngineRevUp said:


> Those are pretty good results. That thermal paste must be working for you. Seems to take 2-3C off of your core temperatures.
> 
> I used MX-6 so getting 2-3C hotter than you.


I’d chalk 2-3°C up to application tolerance. xD


----------



## Sheyster

J7SC said:


> In the meantime, it is up to the user to be honest and not sub runs which had noticeable artefacts - even if they're sure that it is not a level-playing field as other do sub them.


I would not count on that. I agree with everything else you say though.

At the end of the day, anything competitive (especially with money and sponsor support involved) has to be policed and regulated to a certain degree. eSports, professional sports of all kinds, really anything competitive. There is a reason EAC, BattlEye, Ricochet, Warden (and others) exist in PC gaming. It's pretty clear something is needed here, it's the wild west right now. Even then, there will still be people taking advantage given the opportunity. Remember Lance Armstrong?   Another recent example is Hans Niemann in top-tier competition chess.


----------



## PhuCCo

ianann said:


> I’d consider 2-3°C for application tolerance. xD


Memory temps seem to only report in 2C increments, and that's probably why your memory temps were reading lower than the water temp. The readings will go from 50C to 52C and skip 51C. Not insanely accurate, but nice to have.

Edit: Didn't reply to your correct post
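The stepped readout described above is enough to explain a memory sensor reading below water temperature. A minimal sketch of the idea (pure illustration; the round-down-to-even behavior is an assumption on my part, not a documented spec):

```python
# Hypothetical model of a sensor that reports only even-numbered degrees,
# rounding down. A true 31.4 C junction then reads 30 C, which can sit
# below a 30.5 C loop temperature even though the die is actually warmer.
def quantize_even(true_temp_c: float) -> int:
    """Round down to the nearest even degree (assumed sensor behavior)."""
    return int(true_temp_c // 2) * 2

print(quantize_even(31.4))  # 30
print(quantize_even(50.9))  # 50
print(quantize_even(52.0))  # 52
```

At this granularity, a reading up to nearly 2C below the true temperature is expected, so "memory cooler than water" at idle isn't alarming.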


----------



## ianann

PhuCCo said:


> Memory temps seem to only output readings in 2C increments and that's probably why your memory temps were reading lower than the water temp. Like the readings will go from 50C to 52C and skip 51C. Not insanely accurate, but nice to have.
> 
> Edit: Didn't reply to your correct post


No worries.  
Good to know regarding mem temps.
Looks like the block does a good job keeping them cool and every module has a good touch.


----------



## AdamK47

Sheyster said:


> I would not count on that. I agree with everything else you say though.
> 
> At the end of the day, anything competitive (especially with money and sponsor support involved) has to be policed and regulated to a certain degree. eSports, professional sports of all kinds, really anything competitive. There is a reason EAC, BattlEye, Ricochet, Warden (and others) exist in PC gaming. It's pretty clear something is needed here, it's the wild west right now. Even then, there will still be people taking advantage given the opportunity. Remember Lance Armstrong?   Another recent example is Hans Niemann in top-tier competition chess.


There is a lot of posturing and face-saving when it comes to overclocking. Some need to continually convey their superiority over their peers in the hobby. It's a silly existence.

I can say, at least for me, it doesn't matter. I can't get 3GHz stable in stress testing on my 4090. I'm not going to be one of those who are in a similar boat with their 4090 but project to others that the GPU is rock-solid stable at speeds over 3GHz when that isn't true, just to one-up others on the Internet and save face. I'm sure most can benchmark and post scores well above the limit of stability. I don't do that. Stability first. Benchmarking second.


----------



## Arizor

shadow85 said:


> just tried +1400 mem on my TUF OC, instantly artifacted and crashed in-game.
> I am on stock BIOS. Am I supposed to flash a different BIOS to get +1400 mem?


Flashing the BIOS won't do anything to magically allow higher mem clocks, sadly. It's all just part of the lottery.

I run undervolted on my TUF (non-OC); mem is +1750, stable as a rock. Changing the BIOS does nothing, it's just luck of the draw.


----------



## vigorito

Why doesn't AB allow clocking memory higher than +2000MHz? I'm rock stable at +2000, but when I input +2100 or +2200, AB just corrects the number back to 2000.


----------



## chispy

Guys, I'm having a weird issue. I left my video card running overnight playing some soft music on YouTube while I slept, and it didn't have a problem for the ~8-10 hours it was on. But when I woke up and stopped the YouTube playback, I got a black screen for about 10-15 seconds followed by a reboot.

After that black-screen episode it hasn't stopped :/ . It might take 1 minute, 5 minutes, 10, or 1-2 hours, but it's happening very often. Same symptoms every time: while idle, browsing the web or watching YouTube, the screen loses video signal, I see only a black screen for approximately 10-15 seconds, followed by a whole system reboot.


** Update **
I have found the culprit of the problem, and it's related to the driver power state. In the NVIDIA driver's Power Management Mode, if I select Normal it will do the weird black-screen-and-reboot thing, but if I select Maximum Performance it never does it again.

Anyone else with this problem ?

I already updated the GPU to the latest BIOS using MSI Center; it went from BIOS 220 to BIOS 230. Even after that BIOS update, the dang GPU still does it if I select Normal mode again. I guess my GPU does not like being idle and/or in a low power state?

Can anyone share an opinion or help with this issue, please? I only have 29 more days of warranty with the brick-and-mortar store to get a full refund.
Should I wait for new drivers or a new BIOS fixing this issue? What could cause it?

By the way, I played Cyberpunk 2077 for ~3 hours and Forza Horizon 5 for ~4 hours at 4K max settings without a single problem. That's 7 hours of non-stop gaming on the card.

MSI Gaming Trio 4090 (vanilla non-X model)
brand new W11 installation with latest driver.


----------



## vigorito

maybe you should update the firmware of the card; NV released it a few days ago

NVIDIA GPU UEFI Firmware Update Tool | NVIDIA
nvidia.custhelp.com


----------



## Zero989

vigorito said:


> Why doesn't AB allow clocking memory higher than +2000MHz? I'm rock stable at +2000, but when I input +2100 or +2200, AB just corrects the number back to 2000.


try precision x1


chispy said:


> Guys, I'm having a weird issue. I left my video card running overnight playing some soft music on YouTube while I slept, and it didn't have a problem for the ~8-10 hours it was on. But when I woke up and stopped the YouTube playback, I got a black screen for about 10-15 seconds followed by a reboot.
> 
> After that black-screen episode it hasn't stopped :/ . It might take 1 minute, 5 minutes, 10, or 1-2 hours, but it's happening very often. Same symptoms every time: while idle, browsing the web or watching YouTube, the screen loses video signal, I see only a black screen for approximately 10-15 seconds, followed by a whole system reboot.
> 
> ** Update **
> I have found the culprit of the problem, and it's related to the driver power state. In the NVIDIA driver's Power Management Mode, if I select Normal it will do the weird black-screen-and-reboot thing, but if I select Maximum Performance it never does it again.
> 
> Anyone else with this problem?
> 
> I already updated the GPU to the latest BIOS using MSI Center; it went from BIOS 220 to BIOS 230. Even after that BIOS update, the dang GPU still does it if I select Normal mode again. I guess my GPU does not like being idle and/or in a low power state?
> 
> Can anyone share an opinion or help with this issue, please? I only have 29 more days of warranty with the brick-and-mortar store to get a full refund.
> Should I wait for new drivers or a new BIOS fixing this issue? What could cause it?
> 
> By the way, I played Cyberpunk 2077 for ~3 hours and Forza Horizon 5 for ~4 hours at 4K max settings without a single problem. That's 7 hours of non-stop gaming on the card.
> 
> MSI Gaming Trio 4090 (vanilla non-X model)
> brand new W11 installation with latest driver.


disable hardware acceleration and see if it goes away


----------



## mirkendargen

AdamK47 said:


> I see this problem way too often here. People post about clock speeds in terms of additional boost clocks set through MSI Afterburner or like tools.
> 
> Stating you can get +200 core is meaningless if you don't mention what card and what BIOS you use since cards have a wide range in the base boost clock speed. Even then, most on here don't know right away what the base boost in MHz each card/BIOS comes with. Example, Gaming X Trio starts at 2610MHz, Asus TUF at 2595MHz, and Suprim Liquid X at 2640MHz. Those are just the OC cards. The cards at reference 2520MHz are of course going to do +200 since by default they are set so low.
> 
> The same thing with power percentages. Some BIOS start out at 450W and some 480W. Stating the additional percentage is pointless without knowing the starting point.
> 
> /rant


You're totally right that comparing offsets between cards is meaningless, but it's actually more complex than just comparing BIOSes, too. Different cards of the same model with the same BIOS still differ in their default +0 V/F curves; you can look at them in AB. The only real way to compare is to open the default curve on both cards, see what frequency each sits at for 1.05V or 1.1V (whatever you plan to run at), and then apply the offset.

Or just say what actual frequency/voltage your card is stable at.
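The point above can be made concrete with a toy comparison (the curves and clocks below are hypothetical, not taken from any real card):

```python
# Two hypothetical cards, same model and BIOS, with different default
# +0 V/F curves (frequency in MHz at a given voltage). The same AB-style
# offset lands at different effective clocks, so the offset alone says
# nothing about where a card actually runs.
default_curve = {
    "card_a": {1.05: 2730, 1.10: 2790},
    "card_b": {1.05: 2775, 1.10: 2835},
}

offset = 150  # the same "+150" applied to both cards

for card, curve in default_curve.items():
    effective = curve[1.05] + offset
    print(f"{card}: +{offset} at 1.05V -> {effective} MHz")
# card_a ends up at 2880 MHz and card_b at 2925 MHz: a 45 MHz gap from
# the same offset, which is why quoting actual freq/voltage is more useful.
```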


----------



## ttnuagmada

Man i must have gotten a dud in terms of VRAM. I artifact at anything over +1300.


----------



## Panchovix

ttnuagmada said:


> Man i must have gotten a dud in terms of VRAM. I artifact at anything over +1300.


I can't even go above 1200 without crashing entirely lol, feelsbadman.


----------



## J7SC

AdamK47 said:


> There is a lot of posturing and face-saving when it comes to overclocking. Some need to continually convey their superiority over their peers in the hobby. It's a silly existance.
> 
> I can say, at least for me, it doesn't matter. I can't get 3GHz stable in stress testing on my 4090. I'm not going to be one of those that may be in a similar boat with a 4090, but project to others that the GPU is rock solid stable at speeds over 3GHz when that isn't true. Just to simply one-up others on the Internet and save face. I'm sure most can benchmark and post scores well above the limit of stability. I don't do that. Stability first. Benchmarking second.


...well said - not only posturing but sometimes stalking, of sorts at least! Subbing results that were bugged, or claiming 'stable' super-high speeds as 24/7, is, as I alluded to above, nothing new. At the end of the day, it is a moral / religious question along the lines of 'be honest, treat others as you want to be treated yourself...'

If you get a 'dud' re. clocks, for example, it is not punishment of some sort and probably won't make any difference for regular (i.e. gaming) use; it just matters for benching and posting shiny results on social media (I don't exclude myself from that). Still, in the past, I learned a lot more about OCing with the 'duds' than with a golden sample. The _nicest overclocking experience_ I get is competing against myself - like when I figured out a problem, changed cooling and some system settings, and improved on _my own_ previous efforts. I obviously don't know how other folks' systems, even with the same basic components, are configured: what vBIOS, what CPU/RAM tweaks, what background apps are running, etc.

---
Below is a quick view of GPU-Z during a Superposition run. As mentioned, my water-cooling system is quite extensive, and while the GPU temp is great, the VRAM temp is just below where it needs to be for best clocks - this is one of those things I need to figure out to beat my own scores, as right now I am trading a bit of GPU speed for a bit of VRAM speed. BTW, the max wattage shown below with this water-cooled conversion is lower by about 30W-40W compared to earlier air-cooled results. I think that may be the 3x fan and RGB power-budget allotment. With prior gens, going to water-cooling (and dropping card-driven RGB) just put these allotments into the general PL budget, but apparently not this gen... Neptune vBIOS to the rescue?


----------



## vigorito

Panchovix said:


> I can't even go above 1200 without crashing entirely lol, feelsbadman.


vcore +199, can't go further (Strix)


----------



## chispy

vigorito said:


> maybe you should update the firmware of the card; NV released it a few days ago
> 
> NVIDIA GPU UEFI Firmware Update Tool | NVIDIA
> nvidia.custhelp.com





Zero989 said:


> try precision x1
> 
> 
> disable hardware acceleration and see if it goes away



Thank you guys for the tips and help. I already updated the GPU BIOS (both) to the latest one through the MSI Center software; it upgraded from BIOS 220 to BIOS 230, and I'm still having the black-screen problems while idle or using Normal mode in the driver. I'm forced to use Maximum Performance, otherwise it will black screen sooner or later.

I have tried with HAGS on and with HAGS off, with the same black-screen outcome. The video card runs great while gaming, benchmarking, and at desktop idle, browsing the web or YouTube, without problems on the Maximum Performance driver setting; if I change it to Normal, I'm back to black screens galore.

Any help, guidance or opinion would be greatly appreciated, since this is the only RTX 4090 I could find locally at a brick-and-mortar store for MSRP, and they don't have any more 4090s.

I also tried the new NVIDIA utility to update the firmware, and it says I already have it installed (makes sense, since I updated the BIOS).


----------



## KingEngineRevUp

PhuCCo said:


> So the proper EK block doesn't improve deltas.. you're about 16C delta at 400W which matches the rest of us that have tested the first revision. Thank you for posting results


They're at a 14C delta at 400W according to their HWiNFO64. Seems the Apex thermal paste works pretty nicely.

I'm at 16-17C-ish at 400W.
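Since deltas in this thread get quoted at different power draws, it can help to normalize the core-to-water delta to a common wattage before comparing blocks. A rough sketch (my own back-of-envelope; it assumes the delta scales roughly linearly with dissipated power, which is only approximately true, and the example readings are made up):

```python
def delta_at(gpu_c: float, water_c: float, watts: float,
             target_watts: float = 400.0) -> float:
    """Scale a measured core-to-water delta to a common power level,
    assuming the delta is roughly proportional to dissipated power."""
    return (gpu_c - water_c) / watts * target_watts

# Example with made-up readings: 44.0 C core, 30.0 C water at 400 W
print(delta_at(44.0, 30.0, 400.0))  # 14.0 C at 400 W
```

The linearity assumption breaks down somewhat at the extremes (paste pump-out, flow changes), so treat the result as a ballpark, not a block ranking.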


----------



## gamerMwM

Sheyster said:


> You don't need to go higher for gaming. Memory scaling (FPS) after +1200 is very small.
> 
> My gaming OC is 3000 core with +1000 memory, and I usually don't even bother with most games.


Thank you for saying this. I've been mulling replacing my 4090 FE, which can barely do +1100-1200 stable on memory. I have a Corsair FE 4090 block (hopefully on the way soon) that I preordered, and I didn't want to spend the time blocking a card with poor VRAM.

But I've been trying to talk myself out of playing the silicon lottery as my core is good at +225 stable gaming and my highest Port Royal score is close to 28K.

I'm not really worried about getting the highest benchmark scores, just want to be able to run a good daily Core/Memory OC. I'm still learning at all this, but if what you say is true and there isn't much actual gain (FPS) after +1200 then I guess I'm not missing much gaming performance anyway.


----------



## narrn2761

Sheyster said:


> You don't need to go higher for gaming. Memory scaling (FPS) after +1200 is very small.
> 
> My gaming OC is 3000 core with +1000 memory, and I usually don't even bother with most games. The 4090 is a beast at stock clocks!


Yea, but I keep hearing that mem OCs yield some good gains, like up to 6-8%.

So if it's a free 6-8% gain in FPS, then I'm definitely interested in grabbing it, especially given how much we have to pay for one of these cards. But only if there aren't any concerning pitfalls to these mem OCs, of course.



Arizor said:


> Flashing BIOS won't do anything to magically allow higher mem clocks, sadly. It's all just part of the lottery.
> 
> I run undervolted on my TUF (non-OC) mem is +1750 stable as a rock, changing BIOS does nothing, it's just luck of the draw.


How did you get a stable +1750 on the TUF? Is it because you undervolted, or is it just luck of the draw with the chip?

I tried +1400 and crashed instantly; it can't be due to thermals because it wasn't anywhere near hot enough yet. I thought undervolting was only to improve thermals, or does it also increase stability?


----------



## KingEngineRevUp

gamerMwM said:


> Thank you for saying this. I've been mulling replacing my 4090 FE which can barely do +1100-1200 stable on Memory. I have a Corsair FE 4090 block (hopefully on the way soon) that I preordered - and I didn't want to spend the time blocking a card with poor VRAM.
> 
> But I've been trying to talk myself out of playing the silicon lottery as my core is good at +225 stable gaming and my highest Port Royal score is close to 28K.
> 
> I'm not really worried about getting the highest benchmark scores, just want to be able to run a good daily Core/Memory OC. I'm still learning at all this, but if what you say is true and there isn't much actual gain (FPS) after +1200 then I guess I'm not missing much gaming performance anyway.


At the end of the day after I find out my maximum OC... Every card I have I end up just undervolting LOL. I live in a desert climate so summers are hot. Last thing I want is a hot ass room.

Honestly, you're not missing out on much. Block it and be happy!


----------



## dante`afk

PhuCCo said:


> So the proper EK block doesn't improve deltas.. you're about 16C delta at 400W which matches the rest of us that have tested the first revision. Thank you for posting results


all blocks are pretty much the same nowadays, whether a $100 Bykski block or a $500 Optimus/EKWB block.

they will give you the same cooling capability, within margins of 0-2C.


having said that, it also depends on your radiator setup / ambient temp. I get a delta of 10C at 400W and about 18C at 580W with the EKWB block


----------



## Sheyster

narrn2761 said:


> Yea, but I keep hearing that mem OCs yield some good gains, like upto 6-8%.


Mem OC does give gains, but it's diminishing returns after +1200 (or heck, even +1000). That last 1-2 percent after +1200 doesn't scale well in games. For benchmarks, of course, every little bit helps on the leaderboards. At the end of the day, the LN2 competitive overclockers will all be at the top anyway.
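The diminishing returns are easier to see with quick bandwidth math. A sketch (assumption on my part: Afterburner's reported base memory clock on a 4090 is 10501 MHz and the offset adds to it one-for-one; FPS then moves by only a fraction of the bandwidth gain unless a game is actually bandwidth-bound):

```python
# Back-of-envelope memory-OC math; BASE_MHZ is an assumed Afterburner
# readout, not a value measured in this thread.
BASE_MHZ = 10501

for offset in (1000, 1200, 1500):
    bw_gain_pct = offset / BASE_MHZ * 100
    print(f"+{offset}: {bw_gain_pct:.1f}% more raw bandwidth")
# Raw bandwidth rises near-linearly with the offset, but observed FPS
# gains flatten out, which is the "diminishing returns" past ~+1200.
```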


----------



## Sheyster

KingEngineRevUp said:


> At the end of the day after I find out my maximum OC... Every card I have I end up just undervolting LOL. I live in a desert climate so summers are hot. Last thing I want is a hot ass room.
> 
> Honestly, you're not missing out on much. Block it and be happy!


I'm in So-Cal, hot here as well! Undervolted SLI Titan X Maxwell cards were a glorious thing a while back!

I'm still on the fence about undervolting the GB-G-OC 4090 I have. Temps are already great at stock voltage.


----------



## Arizor

narrn2761 said:


> Yea, but I keep hearing that mem OCs yield some good gains, like upto 6-8%.
> 
> So if it is free 6-8% gains in fps, then I am definitely interested in grabbing them, especially for how much we have to pay for one of these cards. But only if there isn't any concerning pitfalls to doing these mem OCs ofcourse.
> 
> 
> 
> How did you get stable +1750 on TUF? Is it because you undervolted, or is just luck of the draw with the chip?
> 
> I tried +1400 and crashed instantly, can't be due to thermals because it wasn't even no where near hot enough yet. I thought undervolting is only to make thermals better, or does it also increase stability?


luck of the draw. Undervolt just helps with thermals and wattage.


----------



## KingEngineRevUp

Sheyster said:


> I'm in So-Cal, hot here as well! Undervolted SLi Titan X Maxwell cards were a glorious thing a while back!
> View attachment 2584109
> 
> 
> I'm still on the fence about undervolting the GB-G-OC 4090 I have. Temps are already great at stock voltage.


Also So Cal, not too far from Microcenter, haha. I'm sure we've maybe crossed paths if you were at the 30-series launches.


----------



## cheddardonkey

Can anyone confirm if the Aorus Master and the Aorus Waterforce have the same PCB?


----------



## bmagnien

thank me later: RTX4090_NEPTUNE_BIOS.rar


----------



## mirkendargen

cheddardonkey said:


> Can anyone confirm if the Aorus Master and the Aorus Waterforce have the same PCB?


From the pictures people have shown here it looks like yes in terms of power, but the Waterforce doesn't have the fan headers soldered on.


----------



## mirkendargen

bmagnien said:


> thank me later: RTX4090_NEPTUNE_BIOS.rar


Got any idea how to extract it? 7-Zip doesn't read it properly, and there's no command-line switch listed in its /? output for it.


----------



## bmagnien

mirkendargen said:


> Got any idea how to extract it? 7zip doesn't read it properly and there isn't a listed command line switch in the /? for it.


Winrar worked for me


----------



## mirkendargen

bmagnien said:


> Winrar worked for me


I mean the executables inside the RAR, it's one of those bundled flashers that won't flash if you don't already match the device ID, so you need to extract the actual BIOS somehow.


----------



## cheddardonkey

mirkendargen said:


> From the pictures people have shown here it looks like yes in terms of power, but the Waterforce doesn't have the fan headers soldered on.


Thanks, interesting. I was going to sell my Waterforce, build a custom loop, and seek out another 4090 model, but with what you've shared I may not have to. I didn't think there were any blocks that would fit my card.


----------



## bmagnien

mirkendargen said:


> I mean the executables inside the RAR, it's one of those bundled flashers that won't flash if you don't already match the device ID, so you need to extract the actual BIOS somehow.


I was afraid of that. I asked for the raw .rom file but they only had this available. Still working on a backup plan if folks can't figure out how to make this work.


----------



## bmagnien

mirkendargen said:


> I mean the executables inside the RAR, it's one of those bundled flashers that won't flash if you don't already match the device ID, so you need to extract the actual BIOS somehow.


How about Universal Extractor 2?


----------



## mirkendargen

bmagnien said:


> How about Universal Extractor 2


No good. I'm looking at the data portion as hex and found the start of the ROM; now I'm just looking for a good indicator of the end so I can extract it.
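For anyone attempting the same extraction: standard PCI expansion ROM images start with the 0x55AA signature and store their size (in 512-byte units) at offset 2, so a rough Python sketch for locating candidate images inside the flasher's data blob could look like the helper below (the function name is mine; a real extractor should also walk the PCIR data structures and honor the last-image flag, since modern VBIOSes chain legacy and UEFI images).

```python
def find_rom_images(data: bytes):
    """Scan a binary blob for PCI expansion ROM headers (0x55AA signature).

    Returns a list of (offset, length) pairs, where length is taken from the
    byte at offset+2 (size in 512-byte units). Rough sketch only: it can hit
    false positives, so sanity-check candidates against the PCIR structure.
    """
    images = []
    i = 0
    while True:
        i = data.find(b"\x55\xaa", i)
        if i == -1:
            break
        size = data[i + 2] * 512 if i + 2 < len(data) else 0
        if size:
            images.append((i, size))
        i += 2
    return images
```

Slicing `data[offset:offset + length]` for each candidate and trying it in a BIOS parser is then a reasonable way to find the real image.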


----------



## narrn2761

Sheyster said:


> Mem OC does give gains, but it's diminishing returns after +1200 (or heck even +1000). That 1-2 percent after +1200 doesn't scale well in games. For benchmarks, of course every little bit helps with those leaderboards.  At the end of the day, the LN2 competitive overclockers will all be at the top anyway.


Well, it looks like my TUF OC doesn't like to go over +1200 mem. Tried +1250 and saw heaps of artifacting in PUBG.

Funny though, because I ran +1300 mem in Dying Light 2 without issues for hours, but PUBG did not like anything over +1200 within just 20 minutes of play. +1200 itself seems to be fine.


----------



## gamerMwM

Sheyster said:


> Mem OC does give gains, but it's diminishing returns after +1200 (or heck even +1000). That 1-2 percent after +1200 doesn't scale well in games. For benchmarks, of course every little bit helps with those leaderboards.  At the end of the day, the LN2 competitive overclockers will all be at the top anyway.


I ran some tests in Port Royal on my 4090 FE and under "Detailed scores" my Graphics test FPS at +225 Core +0 Memory was 125.28. I got 126.91 FPS at +500 Memory, 128.38 FPS at +1000, and at the highest I could OC my Ram (without artifacts) at +1200 I got 128.66 FPS (under Detailed Scores).

So from 0 to +500 I gained 2 additional FPS, the next 500 (+1000) got me close to 1.5 more FPS on top of that, and then the final +200 (+1200 overall) gave me less than half an extra frame.

Total FPS gain from 0 to +1200 was about 3.5 additional FPS.

If I could clock to +1500, maybe that would net another frame, and perhaps 1.5 more going to +2000 from there. So by not having a golden chip, I'm losing out on maybe 2-3 FPS tops.

Edit: Core OC +225 nets about 4 FPS by itself. Max memory as stated above gets another 3.5, so the total OC on my card nets 7.5 FPS in Port Royal (max voltage and power limits, tested with fans at 90%) on the 4090 FE.
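Plugging the Port Royal numbers from the post into a quick Python check makes the diminishing returns explicit:

```python
# gamerMwM's Port Royal graphics-test FPS at each memory offset (from the post)
fps = {0: 125.28, 500: 126.91, 1000: 128.38, 1200: 128.66}

offsets = sorted(fps)
# FPS gained between consecutive offsets: the steps shrink as the offset grows
marginal = {hi: round(fps[hi] - fps[lo], 2)
            for lo, hi in zip(offsets, offsets[1:])}

total_pct = 100 * (fps[1200] - fps[0]) / fps[0]
print(marginal)
print(f"{total_pct:.1f}%")  # total uplift from the memory OC alone
```

The totals work out to roughly 2.7% from the full +1200, nowhere near the 6-8% figure floating around earlier in the thread.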


----------



## J7SC

cheddardonkey said:


> Can anyone confirm if the Aorus Master and the Aorus Waterforce have the same PCB?





mirkendargen said:


> From the pictures people have shown here it looks like yes in terms of power, but the Waterforce doesn't have the fan headers soldered on.


The Aorus Master seems to have an identical PCB to the Gigabyte Gaming OC (pic below from DerBauer / YT). Also, the same water-blocks are advertised to fit both the Master and the Gaming OC. The Aorus Waterforce on the other hand has a slightly different PCB.


----------



## mirkendargen

You're welcome kids, enjoy.


----------



## J7SC

mirkendargen said:


> View attachment 2584135
> View attachment 2584136
> 
> 
> You're welcome kids, enjoy.


Noice, Tx ! Once renamed .rom, will this work with the latest NVflash offered by TPU ?


----------



## mirkendargen

J7SC said:


> Noice, Tx ! Once renamed .rom, will this work with the latest NVflash offered by TPU ?


Certainly should, I flashed it with NVflash.

Now the bad news about it: it didn't change my effective clocks or power consumption noticeably, if at all, so the theory of there being some phantom power limiter is a nope, or it's something at a lower level than the BIOS.


----------



## cheddardonkey

mirkendargen said:


> View attachment 2584135
> View attachment 2584136
> 
> 
> You're welcome kids, enjoy.


THANK YOU. This is going on the only other 360 AIO soon! The Waterforce is getting one more BIOS load. Results soon.


----------



## J7SC

mirkendargen said:


> Certainly should, I flashed it with NVflash.
> 
> Now the bad news about it, it didn't change my effective clocks or power consumption noticeably if at all, so the theory of there being some phantom power limiter is a nope, or it's something at a lower level than the BIOS.


Since my Giga-G-OC dropped peak power consumption by 30 W - 40W since I water-cooled it w/o any other ill-effects, I wonder whether NVidia & Co segmented the PL budget more compared to earlier gens... as mentioned above, along the lines of a given segmented amount for fans, RGB and in the case of AIO, pumps, while no net PL gain for the core / VRAM once the signals for fans, RGB and pumps go silent. Just guessing at this stage, though. I might give the Neptune OC bios a try next weekend or so. In any case, tx for sharing it here.


----------



## mirkendargen

J7SC said:


> Since my Giga-G-OC dropped peak power consumption by 30 W - 40W since I water-cooled it w/o any other ill-effects, I wonder whether NVidia & Co segmented the PL budget more compared to earlier gens... as mentioned above, along the lines of a given segmented amount for fans, RGB and in the case of AIO, pumps, while no net PL gain for the core / VRAM once the signals for fans, RGB and pumps go silent. Just guessing at this stage, though. I might give the Neptune OC bios a try next weekend or so. In any case, tx for sharing it here.


Yeah I just spent an hour benching it, no improvement at all in clocks achievable on memory or core, no improvement on scores at the same clocks at 1.05V or 1.1V, no reason to bother installing it except that you can.

But oh well, we tried!


----------



## newls1

mirkendargen said:


> Yeah I just spent an hour benching it, no improvement at all in clocks achievable on memory or core, no improvement on scores at the same clocks at 1.05V or 1.1V, no reason to bother installing it except that you can.
> 
> But oh well, we tried!


THANK YOU FOR SAVING ME THE TIME!


----------



## J7SC

mirkendargen said:


> Yeah I just spent an hour benching it, no improvement at all in clocks achievable on memory or core, no improvement on scores at the same clocks at 1.05V or 1.1V, no reason to bother installing it except that you can.
> 
> But oh well, we tried!


...sooner or later, someone might leak a true XOC vbios, though hopefully, there won't be too much carnage when folks decide to run that on air...besides, one would likely also need voltage control tools from ElmorLabs and such to get past the 1.1V hard limit.

I am still amazed though how efficient these 4090s are...with _no extra_ PL on the slider, mine will max at 445 W in benches, but most games will only suck back around 380 W or so in my setup.


----------



## vigorito

So if we clock VRAM over +1200, scaling is bad in games, and past that number there's no impact or benefit for gaming? I can't see any FPS increase with +2000 VRAM and +199 core over stock settings.


----------



## hnizdo

SilenMar said:


> A cheap thin layer of copper won't cut it. It is just bad at cooling.
> View attachment 2583534
> 
> 
> I even replaced the thermal pads with more premium ones. Nothing has changed.
> View attachment 2583535
> 
> 
> 500W cannot be cooled with a design like this while the Strix runs like thousands of miles better.


That's bad news.

So is the MSI 4090 Liquid the better choice, if I want an AIO?


----------



## th3illusiveman

vigorito said:


> So if we clock vram over +1200 scaling is bad in games and after that number there is no impact or benefits for gaming? i cant see any fps increase with +2000 vram and +199 core over stock settings


It's probably diminishing returns at play: at some point the cores are doing all they can and more bandwidth doesn't help. Still, if your card can do it and you 100% confirm it's artifact-free, why not enjoy your extra 1 or 2 fps lol. Just gotta make sure it's stable, cause I think running an unstable memory OC for a long duration may cause permanent damage.


----------



## newls1

vigorito said:


> So if we clock vram over +1200 scaling is bad in games and after that number there is no impact or benefits for gaming? i cant see any fps increase with +2000 vram and +199 core over stock settings


At +2000 you are almost certainly kicking in ECC and taking a performance loss. Everyone's card will be different, so only you can invest the time in running a given bench and increasing mem clocks until the score no longer improves or starts falling. Easy to do, but very time consuming.

A couple weeks back when I took delivery of my 4090, I started with a blank slate: I opened Port Royal and just increased the memory slider in MSI AB +100 at a time, LEAVING CORE SPEED ALONE, and jotted down the score. I did this 13 times until I got to +1300 and stopped, because I was so tired of watching Port Royal run and wanted to give my GPU a break, but performance was still scaling as my scores were still climbing. Try this with your card and report back.
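That manual sweep boils down to a tiny helper once the scores are jotted down. This is purely my own illustration (the function name and stopping rule are not a tool from the thread), assuming you feed it the offset/score pairs you recorded by hand:

```python
def best_mem_offset(scores: dict[int, int], min_gain: int = 0) -> int:
    """Given {mem_offset: benchmark_score} pairs recorded from manual runs
    (e.g. Port Royal at +100 MHz steps, core clock untouched), return the
    offset where scores stop improving -- the point past which ECC and/or
    error re-reads likely eat the extra clock."""
    offsets = sorted(scores)
    best = offsets[0]
    for off in offsets[1:]:
        if scores[off] <= scores[best] + min_gain:
            break
        best = off
    return best
```

Setting `min_gain` above zero filters out run-to-run noise, since 3DMark scores can jitter by a few points between identical runs.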


----------



## Benni231990

HERE'S THE NEPTUNE OC 630 W BIOS!!!!!!!!!!!

iColor RTX 4090 VBIOS (24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory) - www.techpowerup.com

I uploaded it, have fun 

But the fan curve is the same **** as the 600 W MSI BIOS, only 2400 RPM at 100%!!!!!!!!!!!

So this BIOS is only good with a waterblock.


----------



## vigorito

It's artifact-free, I didn't notice anything; all goes smoothly as hell at 4K 144 Hz, all ultra, native, on a 7600X paired with a 4090 Strix. I thought I could get 6-8% in gaming but obviously I can't. I wanted to test +2100/+2200 with Precision X1 tonight but it's pointless; I'll just leave it stock. For gaming I think I'll even roll the slider back to 70-80%.


----------



## Azazil1190

Guys, is this score normal for my clocks?
Or a bit low?

I scored 28 821 in Port Royal: AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 16384 MB, 64-bit Windows 11 (www.3dmark.com)


----------



## ttnuagmada

So my preliminary testing/playing around is telling me that these things aren't nearly as power hungry as Ampere. I pushed mine to the highest stable benchmark clocks I could get with the power limit and voltage slider maxed and was only hitting the low-to-mid 500s in watts. Is this similar to what everyone else is seeing? That Samsung 8nm must have been absolute trash. My 3090 Strix was power limited no matter what BIOS I had on it, save for the 1000 W one.


----------



## Nizzen

Azazil1190 said:


> Guys, is this score normal for my clocks?
> Or a bit low?
> 
> I scored 28 821 in Port Royal: AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 16384 MB, 64-bit Windows 11 (www.3dmark.com)


Normal


----------



## bmagnien

mirkendargen said:


> View attachment 2584135
> View attachment 2584136
> 
> 
> You're welcome kids, enjoy.


Nice collab - ty! Definitely something still limiting power consumption, presumably voltage, although GPU-Z reads the perfcap reason as Pwr. It wouldn't go past 106% on this BIOS when it should go up to 114%. 582.5 W max in FurMark, up 5 watts from the Giga-OC 600 W BIOS.


----------



## Baasha

have you guys tried Miles Morales with every setting cranked up (including Ray Tracing) without DLSS and DLSS 3 (aka "frame generation")? There are scenes where it dips below 60fps with the 4090!


----------



## bmagnien

Baasha said:


> have you guys tried Miles Morales with every setting cranked up (including Ray Tracing) without DLSS and DLSS 3 (aka "frame generation")? There are scenes where it dips below 60fps with the 4090!


It's CPU bound. Turn on frame gen; it alleviates the CPU bottleneck and you'll hit 120 fps with the GPU still not breaking a sweat (under 300 W for me). For someone who thought he wouldn't be bottlenecked by a 5800X3D for at least a year or so, frame gen has been a godsend. It lets me turn off supersampling completely and oftentimes add DLAA, all with nearly double the fps.


----------



## Fire2

What vBIOS are you guys with Suprims running? Is the 600 W Liquid X worth it, or does it mess up fan speeds?
Or is Strix possible?


----------



## dr/owned

Fire2 said:


> what vbios are you guys with suprims's running? the 600w liquidX worth it or does itr mess up fan speeds?
> or strix poss?


I haven't seen a 600W LiquidX bios. The one on TPU is 530W. Supposedly the 600W only shipped on some early cards.


----------



## KedarWolf

Neptune BIOS wouldn't run Crysis 3 Remaster graphics maxed out at +188, +1553 memory. DXGI errors.

Original V1 Strix BIOS runs it at +220 +1607.


----------



## Krzych04650

mirkendargen said:


> Certainly should, I flashed it with NVflash.
> 
> Now the bad news about it, it didn't change my effective clocks or power consumption noticeably if at all, so the theory of there being some phantom power limiter is a nope, or it's something at a lower level than the BIOS.


Flashed it on my Inno3D X3 OC and it worked as expected; it allowed at least 20 W more in games compared to the 600 W Liquid X BIOS (the 1 next to mV indicates the power limit is hit).









The problem on air is that the fan RPM is capped at 2400 RPM, so the card cannot cool itself now (left picture is at 3200 RPM or thereabouts), but that will go away with a waterblock, and much lower temps will reduce power draw as well, so maybe it will stop throttling.

There are definitely some phantom limiters though, as it doesn't really want to go past that 102% PL point; it always stops at 102% in both games and Furmark, even when set to 114%. On the MSI 600W BIOS it would stop at 113% no matter what, even though the max was 125%.

It works better than the Gigabyte and MSI BIOSes on my card for one other reason though: it doesn't lose signal during boot. I have an external HDD connected, and there is a black screen with just "-" for some seconds before the boot entry screen appears; the original Inno3D and Neptune BIOSes show it properly, while Gigabyte and MSI would lose signal twice during that period and miss the screen where you can enter BIOS.


----------



## yzonker

dr/owned said:


> I haven't seen a 600W LiquidX bios. The one on TPU is 530W. Supposedly the 600W only shipped on some early cards.


VGA Bios Collection: MSI RTX 4090 24 GB | TechPowerUp


----------



## Sheyster

Fire2 said:


> what vbios are you guys with suprims's running? the 600w liquidX worth it or does itr mess up fan speeds?
> or strix poss?


Folks have reported it doesn't max out the fan speeds at 100%. Try the Strix BIOS. Note that with that BIOS you will lose one of the DP outputs.


----------



## mirkendargen

KedarWolf said:


> Neptune BIOS wouldn't run Crysis 3 Remaster graphics maxed out at +188, +1553 memory. DXGI errors.
> 
> Original V1 Strix BIOS runs it at +220 +1607.


+0 is 30 MHz higher on the Neptune BIOS, so talk in absolute clocks, not offsets. And +188 and +220 don't make sense; the clock changes in 15 MHz increments, so you're actually at +195 and +225, which is a 30 MHz difference.

I'm surprised multiple people are seeing higher power usage. Did you actually have perfcap reason = power before? No matter what I do with a 600W-630W power limit, my perfcap reason is reliability voltage and I max out ~560-570W.
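Assuming the driver rounds a requested offset to the nearest 15 MHz step (the rounding direction is my assumption; the post only states the step size), the mapping works out like this:

```python
STEP = 15  # Ada clock offsets apply in 15 MHz increments

def applied_offset(requested: int) -> int:
    """Offset the driver actually applies for a requested MSI AB / X1 value,
    assuming rounding to the nearest 15 MHz step."""
    return round(requested / STEP) * STEP
```

So a requested +188 lands on +195 and +220 lands on +225, which is why comparing raw slider values between two BIOSes with different base clocks is misleading.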


----------



## Nizzen

Got the free cable from Seasonic today 
Used this link the release day of 4090:
https://seasonic.com/cable-request/


----------



## Baasha

bmagnien said:


> It’s cpu bound. Turn on frame gen. Alleviates cpu bottleneck and you’ll hit 120fps with the gpu still not breaking a sweat (under 300w for me). For someone thinking I wouldn’t be bottlenecked by a 5800x3d for at least a year or so, frame gen has been a godsend. Allows to turn off super sampling completely and often times add dlaa- all with nearly double the fps


Yes, playing it with DLSS 3 is an amazing experience, and I hardly feel any of the added latency that people (reviewers etc.) keep talking about.

Still, with a 12900KF @ 5.20Ghz, having a game dip below 60fps with the 4090 at 4K seems insane. I don't think the 13900K would fare that much better (?).


----------



## Sheyster

Nizzen said:


> Got the free cable from Seasonic today
> Used this link the release day of 4090:
> https://seasonic.com/cable-request/


That cable looks MUCH nicer than the Corsair cable!


----------



## Sheyster

KedarWolf said:


> Neptune BIOS wouldn't run Crysis 3 Remaster graphics maxed out at +188, +1553 memory. DXGI errors.
> 
> Original V1 Strix BIOS runs it at +220 +1607.


I'm not going to try the Neptune BIOS since I'm on air (due to the low fan speed limits). I'm actually on the V2 Strix BIOS you posted. How has that one worked out for you so far?


----------



## dr/owned

yzonker said:


> VGA Bios Collection: MSI RTX 4090 24 GB | TechPowerUp


Is there a way to browse the unofficial bios uploads?


----------



## Fire2

12VHPWR Cable - Promotion ended ;(

looks very nice though


----------



## dk_mic

Sheyster said:


> Folks have reported it doesn't max out the fan speeds at 100%. Try the Strix or GB-G-OC BIOS. With Strix you will lose one of the DP outputs.


The GB Gaming OC BIOS (600W) on the MSI Gaming Trio doesn't max out fan speeds either (~2400 RPM max). Probably the same is true for the Suprim.


----------



## Sheyster

dr/owned said:


> Is there a way to browse the unofficial bios uploads?











TechPowerUp VGA BIOS Collection (www.techpowerup.com): an extensive repository of graphics card BIOS image files, with submissions categorized by GPU vendor, type, and board partner variant.

Under the GPU Brands drop-down, choose "Unverified Uploads".


----------



## Sheyster

dk_mic said:


> GB Gaming OC BIOS (600W) on MSI Gaming Trio doesnt max out out fan speeds either (~ 2400 RPM max). Probably the same is true for Suprim


I was not aware of that, thank you for pointing it out! I will edit my post.


----------



## Zero989

KedarWolf said:


> Neptune BIOS wouldn't run Crysis 3 Remaster graphics maxed out at +188, +1553 memory. DXGI errors.
> 
> Original V1 Strix BIOS runs it at +220 +1607.


For me the Asus BIOS is better for overclocking memory. No diff with core found yet; Asus is probably using looser timings. There was an app to test bandwidth, but I'd have to dig it up.


----------



## yzonker

Zero989 said:


> For me asus bios is better for overclocking memory. No diff with core found yet. Asus is prob using looser timings. There was an app to test bandwidth but I'd have to dig it up.


This shows bandwidth,

I've created an application for testing Video memory stability via Vulkan | Overclock.net


----------



## dk_mic

KedarWolf said:


> Neptune BIOS wouldn't run Crysis 3 Remaster graphics maxed out at +188, +1553 memory. DXGI errors.
> 
> Original V1 Strix BIOS runs it at +220 +1607.


Did you check memory temperature when running this? People have found that it affects mem OC.
Also, differences in max core frequency offsets could be due to different base frequencies.


----------



## cheddardonkey

..


----------



## dk_mic

yzonker said:


> This shows bandwidth,
> 
> I've created an application for testing Video memory stability via Vulkan | Overclock.net


It's fantastic for checking stability, but it doesn't max out bandwidth, so it's not a benchmark tool (yet).


----------



## J7SC

...worth reiterating that on well-cooled GPUs, there seems to be a 'minimum' VRAM temp (> 40 C) that gives best results, at least on my Giga G-OC w/stock vbios. This doesn't mean overheating the VRAM, but below 40 C the artefact-free value is ~1418-1431 MHz, while at around 45 C and above it is ~1500-1513 MHz... don't know the best / most efficient artefact-free 'max' temp though.


----------



## Zero989

yzonker said:


> This shows bandwidth,
> 
> I've created an application for testing Video memory stability via Vulkan | Overclock.net











GitHub - kruzer/poclmembench (github.com): calculates your GPU memory speed.


----------



## BigMack70

OK, I'm out of ideas and ready to start replacing parts. I keep black screening randomly. There's no rhyme or reason to it - it can happen at a game or at idle on desktop. This never happened when I was on my Z390/9900Ks setup. It started occurring after upgrading to Z790/13700k.

Because it can happen at idle, I'm inclined not to think it's the power supply.
Because it didn't happen before upgrading my platform, I'm inclined not to think it's the graphics card.
Because it happens with the CPU at stock and with memory at identical settings I ran on Z390, I'm inclined not to think it's CPU/RAM.

That leaves motherboard. How dumb is it to buy a new board - maybe a z690 that's been out in the wild longer?


----------



## mirkendargen

BigMack70 said:


> OK, I'm out of ideas and ready to start replacing parts. I keep black screening randomly. There's no rhyme or reason to it - it can happen at a game or at idle on desktop. This never happened when I was on my Z390/9900Ks setup. It started occurring after upgrading to Z790/13700k.
> 
> Because it can happen at idle, I'm inclined not to think it's the power supply.
> Because it didn't happen before upgrading my platform, I'm inclined not to think it's the graphics card.
> Because it happens with the CPU at stock and with memory at identical settings I ran on Z390, I'm inclined not to think it's CPU/RAM.
> 
> That leaves motherboard. How dumb is it to buy a new board - maybe a z690 that's been out in the wild longer?


Got a different PCIE slot you can try (even if it isn't full lanes) just to try and rule that out?

Anything at all in the event log when it happens?

Try running the RAM at stupid slow JEDEC speed?

Latest mobo BIOS installed?


----------



## GRABibus

J7SC said:


> ...well said - not only posturing but sometimes stalking, of sorts at least ! Subbing results that were bugged or claiming 'stable' super high speeds as 24/7 is, as I alluded to above, nothing new. At the end of the day, it is a moral / religious question along the lines of 'be honest, treat others as you want to be treated yourself...'
> 
> If you get a 'dud' for example re. clocks, it is not punishment of some sort and probably won't make a difference at all re. regular (ie. gaming) use, it just matters for benching and posting shiny results in social media (I don't exclude myself from that). Still, in the past, I learned a lot more with the 'duds' about oc'ing than with a golden sample. The _nicest overclocking experience_ I get is competing against myself - like when I figured out a problem, changed cooling and some system settings - and improved on_ my own _previous efforts. I obviously don't know how other folks' systems even with the same basic components are configured, what vbios, what CPU/RAM tweaks, what back-ground apps are running etc.
> 
> ---
> Below is a quick view of GPU-Z during a Superposition run. As mentioned, my water-cooling system is quite extensive and while the GPU temp is great, the VRAM temp is just below where it needs to be for best clocks - this is one of those things I need to figure out to beat my own scores as right now, I am trading a bit of GPU speed for a bit of VRAM speed. BTW, on max wattage shown below with this water-cooled conversion, it is lower by about 30W-40W or so compared to earlier air-cooled results. I think that may be the 3x fan and RGB power budget allotment. With prior gens, going to water-cooling (and also w/o card driven RGB) just put these allotments into the general PL budget, but apparently not this gen ...Neptune vBios to the rescue ?
> View attachment 2584079


Which cable do you use ?


----------



## GRABibus

Azazil1190 said:


> Guys is this score normal for my clock's?
> Or a bit low?
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 821 in Port Royal
> 
> 
> AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 16384 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


Be happy


----------



## Sheyster

BigMack70 said:


> OK, I'm out of ideas and ready to start replacing parts. I keep black screening randomly. There's no rhyme or reason to it - it can happen at a game or at idle on desktop. This never happened when I was on my Z390/9900Ks setup. It started occurring after upgrading to Z790/13700k.
> 
> Because it can happen at idle, I'm inclined not to think it's the power supply.
> Because it didn't happen before upgrading my platform, I'm inclined not to think it's the graphics card.
> Because it happens with the CPU at stock and with memory at identical settings I ran on Z390, I'm inclined not to think it's CPU/RAM.
> 
> That leaves motherboard. How dumb is it to buy a new board - maybe a z690 that's been out in the wild longer?


Does it happen when you have Prefer Maximum Performance selected in NCVP power mode?


----------



## GRABibus

Sheyster said:


> That cable looks MUCH nicer than the Corsair cable!


I have used both the CableMod one and the Corsair one.

The Corsair one gives me weird behavior where my VRAM overclock makes some games crash in the game menu, like MW2.
With the CableMod cable, I can easily add +200 MHz to my VRAM OC and I am game stable.

Probably I was on the razor's edge of stability, but I wanted to mention it here.

I currently use the CableMod cable.


----------



## Sheyster

GRABibus said:


> I have used both CableMod's one and Corsair one.
> 
> Corsair one gives me weird behavior where my VRAM overclocks makes some games crashing, in game menu, like MW2.
> With CableMod's cable, I can easily add +200MHz on my Vram OC and I am game stable.
> 
> Probably I was on the razor's edge of stability, but I wanted to mention this here.
> 
> I use the CableMod cable currently.


I'm on the CableMod cable as well (4 x 8-pin). That's an interesting observation you pointed out. It'll be interesting to see if others see the same behavior with the Corsair cable.


----------



## BigMack70

mirkendargen said:


> Got a different PCIE slot you can try (even if it isn't full lanes) just to try and rule that out?
> 
> Anything at all in the event log when it happens?
> 
> Try running the RAM at stupid slow JEDEC speed?


The only other PCI-e slot on my board (MSI Z790 Edge DDR4) is a PCI-e 4.0 x4 slot that I wouldn't use normally, though I guess I could seat the card there and just see if it crashes. But even if that works, it would still mean replacing the board.

I have not tried running the RAM at 2133 JEDEC. I could do that for a bit.

It does not show anything in the event log that has any meaning to me - there are errors related to "unable to start a DCOM server" related to Microsoft Edge, and a lot of warnings about "this application-specific permission settings do not grant Local Activation permission for the COM server application with CLSID... and APPID..." but nothing that looks relevant to a crash beyond "the previous system shutdown was unexpected".



Sheyster said:


> Does it happen when you have Prefer Maximum Performance selected in NCVP power mode?


Yes, I have tried just about everything I can think of as far as tweaking NVCP settings, windows settings, and BIOS PCI-e settings. Nothing seems to make any difference. It's infuriating - sometimes I can go many hours of gaming over a couple days without any crashes, and then sometimes it crashes every 10 minutes for no reason even from idle desktop.
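One low-effort way to dig further on the Windows side is querying the System log for Kernel-Power 41 (the generic unexpected-shutdown event) with wevtutil, which often carries a BugcheckCode hinting at hardware vs. driver causes. The sketch below just builds the command; the helper name is mine, and you'd run the result with subprocess.run() on the affected box:

```python
def kernel_power_query(max_events: int = 10) -> list[str]:
    """Build a wevtutil command that pulls the most recent unexpected-shutdown
    events (Kernel-Power, Event ID 41) from the System log, newest first,
    rendered as text."""
    xpath = ("*[System[Provider[@Name='Microsoft-Windows-Kernel-Power']"
             " and (EventID=41)]]")
    return ["wevtutil", "qe", "System", f"/q:{xpath}",
            f"/c:{max_events}", "/rd:true", "/f:text"]
```

If those events show BugcheckCode 0 every time, the machine lost power or hard-reset without a blue screen, which points more toward PSU/board/cable than a driver crash.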


----------



## Sheyster

BigMack70 said:


> The only other PCI-e slot on my board (MSI Z790 Edge DDR4) is a PCI-e 4.0 x4 slot that I wouldn't use normally, though I guess I could seat the card there and just see if it crashes. But even if that works, it would still mean replacing the board.
> 
> I have not tried running the RAM at 2133 JEDEC. I could do that for a bit.
> 
> It does not show anything in the event log that has any meaning to me - there are errors related to "unable to start a DCOM server" related to Microsoft Edge, and a lot of warnings about "this application-specific permission settings do not grant Local Activation permission for the COM server application with CLSID... and APPID..." but nothing that looks relevant to a crash beyond "the previous system shutdown was unexpected".
> 
> 
> 
> Yes, I have tried just about everything I can think of as far as tweaking NVCP settings, windows settings, and BIOS PCI-e settings. Nothing seems to make any difference. It's infuriating - sometimes I can go many hours of gaming over a couple days without any crashes, and then sometimes it crashes every 10 minutes for no reason even from idle desktop.


Windows 10 or 11? I know you're frustrated but maybe we can come up with something here! Have you checked DXDIAG logs?


----------



## mirkendargen

BigMack70 said:


> The only other PCI-e slot on my board (MSI Z790 Edge DDR4) is a PCI-e 4.0 x4 slot that I wouldn't use normally, though I guess I could seat the card there and just see if it crashes. But even if that works, it would still mean replacing the board.


Even if the end result is knowing you need to replace the board, it's nice to *know* before you actually go through with replacing it


----------



## BigMack70

Sheyster said:


> Windows 10 or 11? I know you're frustrated but maybe we can come up with something here! Have you checked DXDIAG logs?


Windows 11 21H2 build 22000.1219

What does dxdiag log? I have not checked that. I don't see anything meaningful on the normal dxdiag readout.


----------



## Sheyster

BigMack70 said:


> Windows 11 21H2 build 22000.1219
> 
> What does dxdiag log? I have not checked that. I don't see anything meaningful on the normal dxdiag readout.


It should have everything related to DirectX API errors saved off. I believe you have to click on the "save all information" to get the info. It could provide some insight.

Are you using the high performance power plan in Win11?


----------



## yzonker

BigMack70 said:


> The only other PCI-e slot on my board (MSI Z790 Edge DDR4) is a PCI-e 4.0 x4 slot that I wouldn't use normally, though I guess I could seat the card there and just see if it crashes. But even if that works, it would still mean replacing the board.
> 
> I have not tried running the RAM at 2133 JEDEC. I could do that for a bit.
> 
> It does not show anything in the event log that has any meaning to me - there are errors related to "unable to start a DCOM server" related to Microsoft Edge, and a lot of warnings about "this application-specific permission settings do not grant Local Activation permission for the COM server application with CLSID... and APPID..." but nothing that looks relevant to a crash beyond "the previous system shutdown was unexpected".
> 
> 
> 
> Yes, I have tried just about everything I can think of as far as tweaking NVCP settings, windows settings, and BIOS PCI-e settings. Nothing seems to make any difference. It's infuriating - sometimes I can go many hours of gaming over a couple days without any crashes, and then sometimes it crashes every 10 minutes for no reason even from idle desktop.


Have you tried downclocking the VRAM below default? Unstable VRAM definitely causes black screening and can happen at idle. 

Have you tried different cables? Even if it worked with your old card doesn't mean it's not the cable. 

Sorry if this has been covered; I probably haven't followed all of your posts.


----------



## dk_mic

GRABibus said:


> I have used both CableMod's one and Corsair one.
> 
> Corsair one gives me weird behavior where my VRAM overclocks makes some games crashing, in game menu, like MW2.
> With CableMod's cable, I can easily add +200MHz on my Vram OC and I am game stable.
> 
> Probably I was on the razor's edge of stability, but I wanted to mention this here.
> 
> I use the CableMod cable currently.


I can't imagine that it's really the cable making a difference in mem overclocking, but who knows. My MSI Gaming X Trio maxes out at +1000, no matter which cable (have the Corsair and original adapter) or BIOS. I am hoping a waterblock might move that limit a bit, as it gets quite warm, but I think there's just a bad chip in memory. It's no dealbreaker to run only +1000 though


----------



## BigMack70

Sheyster said:


> It should have everything related to DirectX API errors saved off. I believe you have to click on the "save all information" to get the info. It could provide some insight.
> 
> Are you using the high performance power plan in Win11?


Yes, high performance power mode is used; PCI-e link state power management set to "off".

I attached the dxdiag "save all info" readout. Nothing in it means much to me by way of diagnostic.

I am currently running my memory at JEDEC settings. Going to do that for a bit and see if I get any crashing. If I do, then I'll try down-clocking my GPU memory and see if that helps, but it would be alarming if the graphics card is degrading that quickly that I could run it for a few weeks overclocked (+1000 MHz mem) with zero problems and then it can't even run stock. There is no discernable difference in the frequency of crashes between running my GPU at stock settings and at OC settings (+180 / +1000). 



yzonker said:


> Have you tried downclocking the VRAM below default? Unstable VRAM definitely causes black screening and can happen at idle.
> 
> Have you tried different cables? Even if it worked with your old card doesn't mean it's not the cable.
> 
> Sorry if this has been covered, I haven't followed all of your posts probably.


I have not tried different cables. I'd have to purchase them, as it's using all 4 of the PCI-e cables that my Corsair AX1000i has available. I have double and triple checked the connectors on both the GPU and PSU side and everything looks fully seated without problems.


----------



## GRABibus

dk_mic said:


> I can't imagine that it's really the cable making a difference in mem overclocking, but who knows. My MSI Gaming X Trio maxes out at +1000, no matter which cable (have the Corsair and original adapter) or BIOS. I am hoping a waterblock might move that limit a bit, as it gets quite warm, but I think there's just a bad chip in memory. It's no dealbreaker to run only +1000 though


Corsair cable has 2 connectors while the CableMod's one has 4 connectors.
I think there are some differences....

At least for me, the issue is reproducible.

CableMod => +1650MHz stable in game MW2, no crash at all.
Corsair cable => +1650MHz and it is 100% crash in menu.

Probably the Corsair cable amplifies an instability....

I don't know why.


----------



## yzonker

BigMack70 said:


> Yes, high performance power mode is used; PCI-e link state power management set to "off".
> 
> I attached the dxdiag "save all info" readout. Nothing in it means much to me by way of diagnostic.
> 
> I am currently running my memory at JEDEC settings. Going to do that for a bit and see if I get any crashing. If I do, then I'll try down-clocking my GPU memory and see if that helps, but it would be alarming if the graphics card is degrading that quickly that I could run it for a few weeks overclocked (+1000 MHz mem) with zero problems and then it can't even run stock. There is no discernable difference in the frequency of crashes between running my GPU at stock settings and at OC settings (+180 / +1000).
> 
> 
> 
> I have not tried different cables. I'd have to purchase them, as it's using all 4 of the PCI-e cables that my Corsair AX1000i has available. I have double and triple checked the connectors on both the GPU and PSU side and everything looks fully seated without problems.


No, display cables. I had one that worked fine with my 3090, but would not work at all with my 4090. Same display.

Did you try the VRAM downclock to try to rule out bad VRAM?


----------



## yzonker

GRABibus said:


> Corsair cable has 2 connectors while the CableMod's one has 4 connectors.
> I think there are some differences....
> 
> At least for me, the issue is reproductable
> 
> CableMod => +1650MHz stable in game MW2, no crash at all.
> Corsair cable => +1650MHz and it is 100% crash in menu.
> 
> Probably the Corsair cable amplifies an instability....
> 
> I don't know why.


Did you note what your voltage is under load with the 2 cables? I did notice mine droops to 11.7v under full load with the Corsair cable. Interestingly, my 3090 did not do that with just 2x8pin. It was always showing right at 12v even at 600w.


----------



## yzonker

J7SC said:


> ...worth reiterating that on well-cooled GPUs, there seems to be a 'minimum' VRAM temp (> 40 C) that gives best results, at least on my Giga G-OC w/stock vbios. This doesn't mean overheating VRAM, but at below 40 C, artefact-free value is ~ 1418-1431 MHz, but at around 45 C and above, it is ~ 1500 -1513 MHz...don't know the best / most efficient and artefact-free 'max' temp though.


Yea, that sounds about right. I can at least get close to my air cooled mem OC at 40C+. It's on my list of things to test further. See now why I was contemplating using cheap pads for the mount way back when?


----------



## ttnuagmada

GRABibus said:


> Corsair cable has 2 connectors while the CableMod's one has 4 connectors.
> I think there are some differences....
> 
> At least for me, the issue is reproductable
> 
> CableMod => +1650MHz stable in game MW2, no crash at all.
> Corsair cable => +1650MHz and it is 100% crash in menu.
> 
> Probably the Corsair cable amplifies an instability....
> 
> I don't know why.


I have the 4 connector CableMod cable and my VRAM OC's like ****


----------



## ttnuagmada

Question: what's the highest power-draw anyone has seen on their card?


----------



## Sheyster

BigMack70 said:


> Yes, high performance power mode is used; PCI-e link state power management set to "off".
> 
> I attached the dxdiag "save all info" readout. Nothing in it means much to me by way of diagnostic.


I did a quick review of the log, didn't see anything obvious.

Do you have the latest Intel Management Engine installed? You might have to get it from the MSI support site directly. I believe it's version 16.1.25.2020, there might be a newer one though.


----------



## ianann

Maybe it's the PSU rather than the cables then?


----------



## GRABibus

ttnuagmada said:


> I have the 4 connector CableMod cable and my VRAM OC's like ****


Very helpful 😂


----------



## BigMack70

yzonker said:


> Did you try the VRAM downclock to try to rule out bad VRAM?


Trying a normal RAM downclock first. I've gone a couple days between crashes before so I'll probably have to use this for a while before I can tell if the crashes are still occurring. Want to try and test one thing at a time.



Sheyster said:


> I did a quick review of the log, didn't see anything obvious.
> 
> Do you have the latest Intel Management Engine installed? You might have to get it from the MSI support site directly. I believe it's version 16.1.25.2020, there might be a newer one though.


How do I check which version of the IME I have installed?


----------



## GRABibus

ttnuagmada said:


> Question: what's the highest power-draw anyone has seen on their card?


578W in Superposition 8K.
Haven't tested Time Spy Extreme yet.


----------



## GRABibus

yzonker said:


> Did you note what your voltage is under load with the 2 cables? I did notice mine droops to 11.7v under full load with the Corsair cable. Interestingly, my 3090 did not do that with just 2x8pin. It was always showing right at 12v even at 600w.


No I didn’t check but that’s interesting.


----------



## Sheyster

yzonker said:


> Did you note what your voltage is under load with the 2 cables? I did notice mine droops to 11.7v under full load with the Corsair cable. Interestingly, my 3090 did not do that with just 2x8pin. It was always showing right at 12v even at 600w.


Do you have a CableMod cable? If yes, does it droop? Does the included adapter also droop?


----------



## mirkendargen

Sheyster said:


> Do you have a CableMod cable? If yes, does it droop? Does the included adapter also droop?


My Moddiy and Corsair cables both similarly droop for the record, I never tried the Nvidia adapter. PSU is an AX1600i.


----------



## atilcan06

Which BIOS is best for the Strix then? The original one? V1? V2?


----------



## yzonker

Sheyster said:


> Do you have a CableMod cable? If yes, does it droop? Does the included adapter also droop?


All I've run so far is the Corsair.


----------



## yzonker

BigMack70 said:


> Trying a normal RAM downclock first. I've gone a couple days between crashes before so I'll probably have to use this for a while before I can tell if the crashes are still occurring. Want to try and test one thing at a time.
> 
> 
> 
> How do I check what version of IME i have installed?


Yea definitely one thing at a time. Intermittent problems are the worst.

Should be able to see the ME version in the bios. Here's an Asus example,



[Motherboard] Intel® Management Engine Firmware Update Instructions(ME) | Official Support | ASUS Global


----------



## Alelau18

Talking about vdroop, at 550W load my moddiy 12VHPWR cable droops quite a lot to 11.6Vs, NVIDIA's adapter droops but not that much (around [email protected] load) and the cheaper fasgear one that I bought on Amazon has the lowest droop of all, stays at 11.9V like a champ. Of course this is only software reporting so gotta take it with a pinch of salt but the numbers are really consistent throughout the 3 adapters.

The Fasgear adapter also gives the best feedback for being plugged in all the way: it only clicks when it's properly seated, as opposed to the NV adapter or the Moddiy cable.


----------



## alasdairvfr

gaddster said:


> I do and after changing my display to 10bpc from 12bpc in Nv Cpanel it stopped completely.


So I was at the "default" 8bpc and manually changed it to 10. I can't imagine why a higher colour depth would help, but the issue seems to be gone entirely. Either that or I'm past the affected areas of the game.


----------



## yzonker

Alelau18 said:


> Talking about vdroop, at 550W load my moddiy 12VHPWR cable droops quite a lot to 11.6Vs, NVIDIA's adapter droops but not that much (around [email protected] load) and the cheaper fasgear one that I bought on Amazon has the lowest droop of all, stays at 11.9V like a champ. Of course this is only software reporting so gotta take it with a pinch of salt but the numbers are really consistent throughout the 3 adapters.
> 
> The Fasgear adapter also has the best feedback in terms of being plugged in all the way too, it only clicks when it's properly in, opposed to the NV adapter or the Moddiy cable.


Dumb luck on my part. I just ordered a Fasgear to try, right before you posted this. lol


----------



## Mucho

Also using the Fasgear Adapter, really nice quality.


----------



## Sheyster

Did anyone receive cable combs with their CableMod cable? I did not.


----------



## dr/owned

Alelau18 said:


> Talking about vdroop, at 550W load my moddiy 12VHPWR cable droops quite a lot to 11.6Vs, NVIDIA's adapter droops but not that much (around [email protected] load) and the cheaper fasgear one that I bought on Amazon has the lowest droop of all, stays at 11.9V like a champ. Of course this is only software reporting so gotta take it with a pinch of salt but the numbers are really consistent throughout the 3 adapters.
> 
> The Fasgear adapter also has the best feedback in terms of being plugged in all the way too, it only clicks when it's properly in, opposed to the NV adapter or the Moddiy cable.


CableMod 3x here: I see no droop. 16-pin Voltage goes from 12.3V to 12.0V at 650W. ModDIY says they use 18awg, CableMod is 16.



Sheyster said:


> Did anyone receive cable combs with their CableMod cable? I did not.


Yeah mine came with like 5 combs on it.


----------



## J7SC

yzonker said:


> Yea, that sounds about right. I can at least get close to my air cooled mem OC at 40C+. It's on my list of things to test further. See now why I was contemplating using cheap pads for the mount way back when?


...I knew about those 2 GB GDDR6X chips not liking 'cold', but I didn't think that 38 C is cold...I probably did too good a job with the thermal putty. Alas, I can regulate temps a bit via ambient heat and try to hit that 3 C - 4 C window when VRAM temp is right and GPU core doesn't downclock yet...then again, I may just wait until my new build is underway, just in case I want to add cooling to the A-Die DDR5...w/thermal putty of course 🥶 

I figure KingPin and/or DerBauer are busy designing new LN2 GPU pots just covering the die and with extra space above the VRAM, possibly for a heat option...



Sheyster said:


> Did anyone receive cable combs with their CableMod cable? I did not.


...mine came with 8 combs.


----------



## Sheyster

J7SC said:


> ...mine came with 8 combs.


Dang, were they already on the cable or in the box?


----------



## J7SC

Sheyster said:


> Dang, were they already on the cable or in the box?


on the cable, all tightly bunched together.


----------



## yzonker

J7SC said:


> ...I knew about those 2 GB GDDR6X chips not liking 'cold', but I didn't think that 38 C is cold...I probably did too good a job with the thermal putty. Alas, I can regulate temps a bit via ambient heat and try to hit that 3 C - 4 C window when VRAM temp is right and GPU core doesn't downclock yet...then again, I may just wait until my new build is underway, just in case I want to add cooling to the A-Die DDR5...w/thermal putty of course 🥶
> 
> I figure KingPin and/or DerBauer are busy designing new LN2 GPU pots just covering the die and with extra space above the VRAM, possibly for a heat option...
> 
> 
> 
> ...mine came with 8 combs.


I knew it would be like that from playing with it air cooled. I could do +1850 mem with the fans at default and fan stop. Going to 100% fan reduced that to +1800, even at 22-24C ambient.

Yea they may have to step up their method of heating the mem. I know they already used a heater on the back of the PCB. GN did that in their LN2 live stream. I've actually got a small ceramic heater I use on the backplate. That got me from +1650 to +1725 with the chiller. It adds about 10C to the mem temp.


----------



## mirkendargen

dr/owned said:


> CableMod 3x here: I see no droop. 16-pin Voltage goes from 12.3V to 12.0V at 650W. ModDIY says they use 18awg, CableMod is 16.
> 
> 
> Yeah mine came with like 5 combs on it.


That's....0.3V of droop, the same as other people are saying lol. You're just starting at 12.3V and others are starting at 12V.


----------



## chispy

Guys, I think I have definitely found the culprit of the black screens while idle on my MSI Gaming Trio RTX 4090: the motherboard BIOS needs an update for compatibility with RTX 4090 GPUs.







I have thoroughly tested on my ASRock Z690 Aqua OC (2x DDR5 DIMM slots) and ASRock Z690 Steel Legend (4x DDR4 DIMM slots), and neither of those two Intel Z690 boards resolved my issues with black screens and reboots while idle, even with a clean, fully updated Windows 11 installation. Those two ASRock boards should get a compatibility BIOS update just like my AM4 ASRock B550 Steel Legend did. I updated that AM4 board to the latest BIOS, which includes a fix for compatibility with RTX 4090 GPUs.

Now I'm testing on my third motherboard, the ASRock B550 Steel Legend, with the updated BIOS containing the fix, and so far everything has been resolved on this AM4 platform.

My MSI Gaming Trio RTX 4090 now doesn't have any problems running in Normal power management mode in the drivers instead of Prefer Maximum Performance: no black screen issues, no reboots, no driver errors, nothing. It runs smooth as butter.

If you still have this black screen problem, I suggest you ask your motherboard vendor for support on a new BIOS with compatibility fixes for RTX 4090 video cards.

Here are screenshots of the card running with Normal power management mode in the driver, CPU-Z showing the new ASRock BIOS with the RTX 4090 fix, GPU-Z downclocking perfectly with no more black screens or reboots, and the new BIOS entry on the ASRock download page with the RTX 4090 compatibility fix.


----------



## chispy

It seems most ASUS Intel Z790 motherboards already have a BIOS fix on their download page.
Example: this Asus Z790 Strix - ROG STRIX Z790-F GAMING WIFI | ROG Strix | Gaming Motherboards｜ROG - Republic of Gamers｜ROG Global

It is NOT an RTX 4090 hardware problem; it IS a motherboard compatibility problem. Please ask your motherboard vendor on their support page to update their BIOS with the fix for compatibility issues with the RTX 4090.


----------



## J7SC

chispy said:


> It seems most Asus motherboards Intel Z790 already have a Bios fix on their download page:
> example: Like this Asus Z790 Strix - ROG STRIX Z790-F GAMING WIFI | ROG Strix | Gaming Motherboards｜ROG - Republic of Gamers｜ROG Global
> 
> It is NOT a rtx 4090 video card hardware problem , It IS a motherboard compatibility problem. Please guys ask your motherboard vendor on their support page to update their bios with the fix for campatibility issues with the rtx 4090.
> 
> View attachment 2584232


When I first switched from the 3090 Strix to the 4090 Giga-G-OC it booted multiple times before black-screens disappeared on my Asus CH8 Dark-Hero (bios 3801). It then on its own had disabled resizable_BAR in the mobo bios before I got a splash screen. Once everything was running fine in Win 10 Pro, I rebooted and reengaged resizable_BAR w/o any issues since.


----------



## GRABibus

Let's have a little fun instead of benches for once.

Here's some MW2 multiplayer gameplay with my Gigabyte 4090 Gaming OC on air and my 5950X (PBO/CO), at 4K, Extreme settings, DLSS Quality.






GPU overclock :
MSI AB settings 
100% voltage
133% PL
+195MHZ on Core
+1650MHz on memory.

First temperature in front of "RTX 4090 (Core)" line is GPU temp, second one is Hotspot. I like the delta, rarely beyond 10°C
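As a sanity check on power numbers like the 578 W peak mentioned earlier: the power-limit slider is a percentage of the card's default TGP, so the slider maps to an absolute cap. A back-of-envelope sketch, assuming the 450 W reference-spec default for the 4090 (individual vBIOSes may use a different base):

```python
# Rough mapping from a power-limit slider percentage to an absolute cap,
# assuming the RTX 4090 reference default TGP of 450 W (vBIOS-dependent).
DEFAULT_TGP_W = 450

def power_cap(percent: int, base_w: float = DEFAULT_TGP_W) -> float:
    """Absolute power cap in watts for a given slider percentage."""
    return base_w * percent / 100

for pct in (100, 110, 133):
    print(f"{pct}% PL -> {power_cap(pct):.1f} W")
```

On those assumptions, a 133% slider implies a ~598.5 W cap, so a 578 W peak in Superposition 8K sits comfortably under it.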


----------



## Zero989

KedarWolf said:


> Neptune BIOS wouldn't run Crysis 3 Remaster graphics maxed out at +188, +1553 memory. DXGI errors.
> 
> Original V1 Strix BIOS runs it at +220 +1607.


can confirm this BIOS sucks lmao :'(


----------



## Laithan

Outside of extreme cooling, what would you consider the entire 4090 boost range from a dud to a golden sample? Something like 2900Mhz-3200Mhz or is it wider than that? Seems like most 4090s can reach 3Ghz.

Memory seems to be much wider.. I think some are stuck at like +1400Mhz where others are over +2000Mhz.


----------



## BigMack70

yzonker said:


> Yea definitely one thing at a time. Intermittent problems are the worst.
> 
> Should be able to see the ME version in the bios. Here's an Asus example,
> 
> 
> 
> [Motherboard] Intel® Management Engine Firmware Update Instructions(ME) | Official Support | ASUS Global


Looking into this next. At some point might try the PCI-e 4.0 x4 chipset slot.

My PC just passed a 2 hour memtest on my RAM at its standard OC settings (DDR4-4000 c16). It passed a 12 hour test of that on my z390 setup and I never had any issues with it over the course of almost 3 years, so I'm expecting the memory is not the problem.

Just tested the VRAM with OCCT for an hour at stock settings; no errors or problems detected. Testing now with my standard +1000 OC.

Unless my GPU magically began failing at exactly the same time I upgraded from Z390 to Z790, this really feels like some kind of weird bug with software (e.g. drivers) or with the motherboard.

Worth noting is that I have re-seated the graphics card a couple times and adjusted how much the stand is supporting the GPU's sag. Nothing has made any difference with that.

I almost wish one of my components would just break completely so I'd know what the heck is causing these random crashes and I could fix it. In my 15 years building PCs, I've never experienced regular intermittent crashes that didn't throw any errors in the Windows logs which could be used to identify the culprit.

I already updated my board's BIOS to the newest one, which "improves graphics card compatibility" which I assume means RTX 40 series support.


--EDIT--
My board's ME version reads as 16.1.25.2020 in the BIOS.

Is it ever helpful to turn on ECC memory in NVCP?


----------



## J7SC

Laithan said:


> Outside of extreme cooling, what would you consider the entire 4090 boost range from a dud to a golden sample? Something like 2900Mhz-3200Mhz or is it wider than that? Seems like most 4090s can reach 3Ghz.
> 
> Memory seems to be much wider.. I think some are stuck at like +1400Mhz where others are over +2000Mhz.


That's a bit of a tough one as it all depends on specific apps, cooling and so forth. I would say that a 'decent' sample should hold at least 3060 MHz on the core and +1400 Mhz on the VRAM slider (see above re. negative impact of low temps on VRAM range) w/o a single issue. Anything over 3100ish on the core clock and +1470ish on the VRAM slider is getting shiny...


----------



## yzonker

J7SC said:


> That's a bit of a tough one as it all depends on specific apps, cooling and so forth. I would say that a 'decent' sample should hold at least 3060 MHz on the core and +1400 Mhz on the VRAM slider (see above re. negative impact of low temps on VRAM range) w/o a single issue. Anything over 3100ish on the core clock and +1470ish on the VRAM slider is getting shiny...


I think you can look at people's 3DMark runs to get an idea for the core. Usually I find I have to drop at least 45-60 mhz off of a max OC from benching to be 100% game stable. Probably could do the same for VRAM since ECC is off by default. Maybe drop 100 off of whatever they managed to complete the run with. 

Mem is always all over the place I think though. Anywhere from total duds that can't do +1000 (my Zotac 3090), to a small number of cards that can do +2000. Average is probably in the low to mid teens.


----------



## Panchovix

J7SC said:


> That's a bit of a tough one as it all depends on specific apps, cooling and so forth. I would say that a 'decent' sample should hold at least 3060 MHz on the core and +1400 Mhz on the VRAM slider (see above re. negative impact of low temps on VRAM range) w/o a single issue. Anything over 3100ish on the core clock and +1470ish on the VRAM slider is getting shiny...


3060Mhz with 1.1V or 1.05V? Because probably 3Ghz range with 1.1V is "common", but at 1.05V I would say damn, that's a really good chip.

I would say that +1500Mhz is prob the average on VRAM, +1300 or less is on the "not" good memory samples and +1600 or more is pretty good. (I'm crying while I write this with my +1100Mhz on VRAM)


----------



## ttnuagmada

Laithan said:


> Outside of extreme cooling, what would you consider the entire 4090 boost range from a dud to a golden sample? Something like 2900Mhz-3200Mhz or is it wider than that? Seems like most 4090s can reach 3Ghz.
> 
> Memory seems to be much wider.. I think some are stuck at like +1400Mhz where others are over +2000Mhz.


My VRAM can't even do +1300. This is watercooled with sub 50C memory temps.


----------



## motivman

Hey guys, I need help making a decision. I have a 4090 Gaming OC and a 4090 Founders Edition, and both cards seem to perform about the same; which should I keep? The Gaming OC uses about 40W more power than the Founders Edition, and memory clocks a little higher (+1600 vs +1800). Below are the highest Port Royal and Speed Way scores I can achieve with both cards. It seems like the Founders Edition is the more efficient and cooler card, which is very surprising. It's very possible my Gaming OC has a bad paste job from the factory, but I'm hesitant to fix it since I'm not convinced I want to keep it yet. I mean, the hotspot hits 91C benching Port Royal! One card has to go, but I'm not sure which one...

*Port Royal*

Gaming OC: I scored 28 260 in Port Royal

Founders: I scored 28 207 in Port Royal

*Speedway*

Gaming OC: I scored 11 115 in Speed Way

Founders: I scored 11 085 in Speed Way


----------



## th3illusiveman

Keep FE, likely slightly higher resale value down the line.


----------



## KedarWolf

th3illusiveman said:


> Keep FE, likely slightly higher resale value down the line.


On the FE you cannot install other BIOSes, only a newer FE BIOS.


----------



## keikei

Keep the one with the longer warranty.


----------



## Arizor

Keep FE for resale value as @th3illusiveman says, plus any warranty issues you deal direct with NVIDIA rather than Gigabyte (can be awful to deal with).


----------



## BigMack70

So it looks like for some of my crashes, if I don't immediately reset the PC but let it sit at a black screen for a little bit, Windows will attempt to record something in the error logs. The system is throwing a 0x00000133 bugcheck (DPC_WATCHDOG_VIOLATION), which according to Google is a fairly generic "a driver stalled too long" error.

I ran one of the saved dump files through WinDbg; analysis attached here. It's definitely GPU-related, but I'm not knowledgeable enough about this kind of thing to know more than that (is my GPU dying? Is it a driver issue? etc.).

It looks like I can pretty reliably cause the issue by looping Port Royal in 3DMark... haven't passed 10 minutes yet any time I have tried that without a black screen.

Anyone have any advice? Or is it time to flip a coin and replace either the board or see if Gigabyte/Newegg will RMA my graphics card? 

--EDIT--
I see that Gigabyte's support website has an updated vBIOS for the card that is newer than what my card has, which "increases compatibility"... is it worth flashing a new BIOS to the card or does that risk voiding my warranty?
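For anyone decoding stop codes like the one above: a handful of bugcheck codes come up repeatedly in GPU troubleshooting. A quick lookup sketch, with names per Microsoft's public bugcheck code reference (not exhaustive):

```python
# A few bugcheck codes that commonly show up in GPU-related crash dumps.
# Names per Microsoft's bugcheck code reference; not an exhaustive list.
GPU_RELATED_BUGCHECKS = {
    0x116: "VIDEO_TDR_FAILURE",             # display driver failed to recover
    0x117: "VIDEO_TDR_TIMEOUT_DETECTED",    # display driver hung but recovered
    0x119: "VIDEO_SCHEDULER_INTERNAL_ERROR",
    0x124: "WHEA_UNCORRECTABLE_ERROR",      # hardware machine-check (CPU/PCIe)
    0x133: "DPC_WATCHDOG_VIOLATION",        # a driver's DPC/ISR ran too long
    0x141: "VIDEO_ENGINE_TIMEOUT_DETECTED",
}

def describe(code: int) -> str:
    """Return the symbolic name for a known bugcheck code."""
    return GPU_RELATED_BUGCHECKS.get(code, f"unknown/other (0x{code:X})")

print(describe(0x133))  # the stop code from the crash above
```

In WinDbg, `!analyze -v` prints the same stop code along with the module it blames, which for driver-side GPU hangs is usually nvlddmkm.sys.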


----------



## KingEngineRevUp

motivman said:


> Hey Guys, I need help making a decision. So I have 4090 Gaming OC and 4090 Founders edition. Both cards seems to perform about the same, which should I keep? Gaming OC uses about 40W more power than Founders edition, and memory clocks a little higher +1600 vs +1800. Below are highest speedway and timespy scores I can achieve with both cards. Seems like the founders eiditon is the more efficient and cool card, which is very suprising. Its very possible my gaming OC has a bad paste job from the factory, but I am hesitant to try to fix it, since I am not convinced I want to keep it yet. I mean, the hotspot hits 91C benching port royal! ONE card has to go, but not sure which one....
> 
> *Port Royal*
> 
> Gaming OC: I scored 28 260 in Port Royal
> 
> Founders: I scored 28 207 in Port Royal
> 
> *Speedway*
> 
> Gaming OC: I scored 11 115 in Speed Way
> 
> Founders: I scored 11 085 in Speed Way


FE. Subjectively the better looking card and for that reason FE had a strong resale value judging from the 30 series.


----------



## ianann

I didn't even consider any other card than the FE, as it already has plenty of "oomph" for day-to-day use aside from the last few MHz of OC. Okay, I admit: plus it was the only card with water blocks available.


----------



## dante`afk

Help lol

Installed apex z790, can see bios. However pcie config shows no detection.

shows windows boot logo, but when it goes into windows it’s not showing anything. I can hear windows sounds.

bios 0031 is installed


----------



## J7SC

ianann said:


> I didn't even consider any other card than the FE as this already seems plenty enough "oomph" for day-to-day-use aside from the last Mhz OC. Okay, I admit: plus it was the only card with available water blocks.


I've never had an Nvidia FE card, though you're right that the 4090s have awesome performance even in stock trim. Then again, the Giga G-OC was only $30 more here and (at the time) available in store. Its OC performance is a bonus when I feel like tinkering...



dante`afk said:


> Help lol
> 
> Installed apex z790, can see bios. However pcie config shows no detection.
> 
> shows windows boot logo, but when it goes into windows it’s not showing anything. I can hear windows sounds.
> 
> bios 0031 is installed


...can you get into Windows via safemode and do a DDU ? Also, I would un-check resizable_BAR in the mobo bios for the initial setup.


----------



## dante`afk

Got it thanks
Riser cable had to be forced to gen3 to work


----------



## EarlZ

I wanted to flash the stock bios back on my Aorus Master 4090. Can anyone tell me the steps on how I can do this. I did upload a copy of the stock bios in this thread since someone asked for it.


----------



## DokoBG

Feel bad for you guys. I'm scared to say my card is "fixed" after I updated the vBIOS. It's working perfectly fine in the old bucket in my signature... BUT soon I will put it in a brand-new Z790 system, and I have no clue if it will work, which makes me very anxious about it.


----------



## Sheyster

BigMack70 said:


> I see that Gigabyte's support website has an updated vBIOS for the card that is newer than what my card has, which "increases compatibility"... is it worth flashing a new BIOS to the card or does that risk voiding my warranty?


I updated to the newer F2 BIOS a while back. If it's an official GB BIOS, there should be no warranty issues to consider. I think it's worth a shot in this case.


----------



## J7SC

Sheyster said:


> I updated to the newer F2 BIOS a while back. If it's an official GB BIOS, there should be no warranty issues to consider. I think it's worth a shot in this case.


Did you notice any difference (ie. max peak power) with the new GB F2 ? With bios updates (mobo and GPU), I'm usually a complete luddite🥴


----------



## Sheyster

J7SC said:


> Did you notice any difference (ie. max peak power) with the new GB F2 ? With bios updates (mobo and GPU), I'm usually a complete luddite🥴


It was identical pretty much, ran it for a week or so. Now I'm actually running the ASUS Strix V2 BIOS that Kedarwolf posted. I have yet to test it in benchmarks though.


----------



## jootn2kx

For me, the ASUS Strix V2 BIOS that KedarWolf posted gave the best results in both games and benchmarks compared to the Gigabyte Gaming OC BIOS on my Gainward non-GS model.
Seems like I'm gonna stick with that one for a while.


----------



## kx11




----------



## yzonker

BigMack70 said:


> So it looks like for some of my crashes, if I don't immediately reset the PC but let it sit at a black screen for a little bit, Windows will attempt to record something in the error logs. The system is throwing a 0x00000133 bugcheck, which looks according to google like a generic "something broke" error.
> 
> I ran one of the saved dump files through windbg; analysis attached here. It's definitely GPU related but I am not knowledgeable enough about this kind of thing to know more than that (is my GPU dying? is it a driver issue? etc).
> 
> It looks like I can pretty reliably cause the issue by looping Port Royal in 3DMark... haven't passed 10 minutes yet any time I have tried that without a black screen.
> 
> Anyone have any advice? Or is it time to flip a coin and replace either the board or see if Gigabyte/Newegg will RMA my graphics card?
> 
> --EDIT--
> I see that Gigabyte's support website has an updated vBIOS for the card that is newer than what my card has, which "increases compatibility"... is it worth flashing a new BIOS to the card or does that risk voiding my warranty?


You won't void your warranty by flashing a bios that is for the card and of course you can always flash it back to the original if you save a backup.
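For reference, this is a sketch of the usual NVFlash back-up-then-flash sequence (flag names can vary between NVFlash releases, and "stock_f2.rom" / "backup.rom" are placeholder filenames, not the actual files). It only prints the commands so you can eyeball the plan before running anything yourself:

```python
# Hypothetical NVFlash plan builder -- it only PRINTS the commands.
# Double-check the flags against your NVFlash version before running them.

def nvflash_plan(target_rom="stock_f2.rom", backup_rom="backup.rom"):
    """Return the usual backup -> unlock -> flash sequence as command lists."""
    return [
        ["nvflash64", "--save", backup_rom],  # 1. back up the BIOS currently on the card
        ["nvflash64", "--protectoff"],        # 2. lift the EEPROM write protect
        ["nvflash64", "-6", target_rom],      # 3. flash; -6 overrides the ID-mismatch prompt
    ]

for step in nvflash_plan():
    print(" ".join(step))
```

Run the real commands from an elevated prompt, and keep the backup somewhere safe so you can always go back.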


----------



## vigorito

dr/owned said:


> CableMod 3x here: I see no droop. 16-pin Voltage goes from 12.3V to 12.0V at 650W. ModDIY says they use 18awg, CableMod is 16.
> 
> 
> Yeah mine came with like 5 combs on it.


I asked ModDIY; they say 16 AWG.


----------



## ianann

Isn't the whole design meant to work flawlessly within a certain voltage range?
Does wire gauge really make that much of a difference here? (Serious question)
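For anyone curious, the wire's share of the droop is easy to estimate from published copper resistance per AWG. A back-of-envelope sketch (the 0.6 m length, 600 W load, and six 12V conductors plus six returns are assumptions, not measurements of any particular cable; connector contact resistance comes on top of this):

```python
# Back-of-envelope 12VHPWR cable droop from published copper resistance.
# All load/length figures below are assumptions for illustration.
AWG_OHMS_PER_M = {18: 0.02095, 16: 0.01317}  # ohms per metre of copper wire

def cable_droop(watts=600.0, volts=12.0, pairs=6, length_m=0.6, awg=18):
    """Volts lost in the wire alone (12V run plus ground return)."""
    amps_per_wire = watts / volts / pairs           # ~8.3 A per conductor at 600 W
    round_trip_ohms = 2 * AWG_OHMS_PER_M[awg] * length_m
    return amps_per_wire * round_trip_ohms

print(f"18 AWG: {cable_droop(awg=18):.2f} V, 16 AWG: {cable_droop(awg=16):.2f} V")
```

Under these assumptions that works out to roughly 0.21 V for 18 AWG vs 0.13 V for 16 AWG, so yes, it's a visible-on-a-sensor difference, though either is fine electrically; the connector interfaces usually contribute more than the wire itself.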


----------



## RaMsiTo

I think this score is valid; I don't see any artifacts.










I scored 29 270 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## Zero989

dante`afk said:


> Help lol
> 
> Installed apex z790, can see bios. However pcie config shows no detection.
> 
> shows windows boot logo, but when it goes into windows it’s not showing anything. I can hear windows sounds.
> 
> bios 0031 is installed


Power to the card issue?

Never mind, I see you fixed it, and it was indeed an issue with getting power to the card.


----------



## pat182

Seasonic's free-upgrade 2x8-pin cable: very well made, rarely drops to 11.9V under load.


----------



## BigMack70

I think I might have finally identified my problem. PC passed two hours of Port Royal on loop this morning with the entire PC at stock, including super slow default JEDEC 2133 memory settings.

Went back to my DDR4-4000 settings and Port Royal black screen crashes within 5 minutes.

So - it's something with my memory settings and/or the board. Weird that it can pass a couple hours of memtest but not 5 minutes of Port Royal.

At least I think the GPU is fine - that's the main thing I didn't want to need to RMA/replace.


----------



## yzonker

BigMack70 said:


> I think I might have finally identified my problem. PC passed two hours of Port Royal on loop this morning with the entire PC at stock, including super slow default JEDEC 2133 memory settings.
> 
> Went back to my DDR4-4000 settings and Port Royal black screen crashes within 5 minutes.
> 
> So - it's something with my memory settings and/or the board. Weird that it can pass a couple hours of memtest but not 5 minutes of Port Royal.
> 
> At least I think the GPU is fine - that's the main thing I didn't want to need to RMA/replace.


I might have a guess. I ran into something like this while mem tuning my system (13900K / DDR4-4200). I could run 5000% in Karhu, but y-cruncher would fail almost immediately in one or more of the stress tests (might have been FFT). You can have instability with the IMC that is not revealed by running strictly a mem test (my theory anyway). 

Try running the y-cruncher stress tests at 4000 and see if it passes (menu selection 1-7-0, all tests). It takes 20min to do one pass, that should be enough. y-cruncher really hammers the CPU so don't run it for several passes if you like your CPU. In my case, I had to raise VDDQ TX to get it stable.


----------



## superino091

Guys, I installed the new 4090 FE and I have a problem with my Sound Blaster AE-7 sound card: the microphone picks up interference, and the noise continues even if I mute the microphone... with the 2080 Ti it worked perfectly.
I've seen that other 4090 FE owners have the same problem.


----------



## Nizzen

RaMsiTo said:


> I think this score is valid, I do not appreciate artifacts.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 270 in Port Royal
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


No one will ever know  Efficiency per MHz is very good! 
Maybe too good compared to others. I dunno


----------



## yzonker

Nizzen said:


> Noone will ever know  Effency per mhz is very good!
> Maybe too good compare to others. I dunno


Looks in line with the rest of our results. Maybe slightly better efficiency.


----------



## BigMack70

yzonker said:


> I might have a guess. I ran in to something like this while mem tuning my system (13900k/DDR4 4200). I could run 5000% in Karhu, but y-cruncher would fail almost immediately in one or more of the stress tests (might have been FFT). You can have instability with the IMC that is not revealed by running strictly a mem test (my theory anyway).
> 
> Try running the y-cruncher stress tests at 4000 and see if it passes (menu selection 1-7-0, all tests). It takes 20min to do one pass, that should be enough. y-cruncher really hammers the CPU so don't run it for several passes if you like your CPU. In my case, I had to raise VDDQ TX to get it stable.


That could be. Right now I'm in troubleshooting hell, as attempting to lower memory frequency and boot (DDR4-3800 and DDR4-3600) failed so spectacularly that Windows can neither boot nor repair itself on my PC. Which is utterly bizarre that keeping stable boot settings for DDR4-4000 and only lowering frequency could break things.

I will be replacing this board and this memory. Should have just bit the bullet and upgraded to DDR5 in the first place. Just glad the GPU seems OK.


----------



## yzonker

BigMack70 said:


> That could be. Right now I'm in troubleshooting hell, as attempting to lower memory frequency and boot (DDR4-3800 and DDR4-3600) failed so spectacularly that Windows can neither boot nor repair itself on my PC. Which is utterly bizarre that keeping stable boot settings for DDR4-4000 and only lowering frequency could break things.
> 
> I will be replacing this board and this memory. Should have just bit the bullet and upgraded to DDR5 in the first place. Just glad the GPU seems OK.


Well once you get Windows repaired, try bumping VDDQ TX to 1.5v. There's a bug in the MSI bios that seems to require high CPU VDDQ, at least on Z690. My understanding is that it's relatively safe (conversation someone had with Shamino). Until I went to 1.5v+, I couldn't even post above 3600. I have a MSI Z690 Edge. 

I wouldn't necessarily give up on it. DDR4 above 4000 with good timings is still fairly competitive with DDR5.


----------



## Sheyster

BigMack70 said:


> I think I might have finally identified my problem. PC passed two hours of Port Royal on loop this morning with the entire PC at stock, including super slow default JEDEC 2133 memory settings.
> 
> Went back to my DDR4-4000 settings and Port Royal black screen crashes within 5 minutes.
> 
> So - it's something with my memory settings and/or the board. Weird that it can pass a couple hours of memtest but not 5 minutes of Port Royal.
> 
> At least I think the GPU is fine - that's the main thing I didn't want to need to RMA/replace.


Glad you're getting somewhere with it! Perhaps try for 3866, same voltage and timings as 4000? FWIW, I had a very good G.skill B-die DDR4 3200 kit in my old 9900K system. Everyone was running this kit at 4000+. For whatever reason I could not get over 3866 with loose timings. I finally settled on tight timings at 3600.


----------



## Sheyster

BigMack70 said:


> I will be replacing this board and this memory. Should have just bit the bullet and upgraded to DDR5 in the first place. Just glad the GPU seems OK.


I highly recommend the G.Skill 6400 C32 kit, I believe it's $219 right now. I was able to run it with tight timings at 6800 on an ASUS Strix Z790-F (also recommend this mobo highly):


----------



## BigMack70

Sheyster said:


> I highly recommend the G.Skill 6400 C32 kit, I believe it's $219 right now. I was able to run it with tight timings at 6800 on an ASUS Strix Z790-F (also recommend this mobo highly):
> View attachment 2584376


Yeah I think something is FUBAR on my current setup. Re-installed windows from USB after it bricked my install, and now it can't do anything without crashing even at completely stock JEDEC settings. It's currently failing to apply Windows updates without black screen crashing. I'm going to order a new board, new memory, and pray my CPU and especially my GPU are OK.


----------



## Zero989

Sheyster said:


> I highly recommend the G.Skill 6400 C32 kit, I believe it's $219 right now. I was able to run it with tight timings at 6800 on an ASUS Strix Z790-F (also recommend this mobo highly):
> View attachment 2584376


Bump bclk, prob tops around 6933 at those timings


----------



## Sheyster

Zero989 said:


> Bump bclk, prob tops around 6933 at those timings


Oh I'm not done, have not touched the tertiary timings yet. That will help a lot with write and copy numbers.


----------



## MikeS3000

I just secured a PNY XLR8 RTX 4090 that will ship to Best Buy on 12/2 for pickup. Am I going to be disappointed by the 450w max on this card or should I try to get a card that has the 600w bioses?


----------



## BigMack70

Sheyster said:


> I highly recommend the G.Skill 6400 C32 kit, I believe it's $219 right now. I was able to run it with tight timings at 6800 on an ASUS Strix Z790-F (also recommend this mobo highly):
> View attachment 2584376


How's the Strix Z690-E? I can get that with the G.Skill kit tomorrow... the z790-F backordered a few days.


----------



## Sheyster

BigMack70 said:


> How's the Strix Z690-E? I can get that with the G.Skill kit tomorrow... the z790-F backordered a few days.


It's a good board, but you'll need to update the BIOS prior to installing the 13700K, for Raptor CPU support. I've heard that can be challenging and you might need an Alder Lake CPU to update the BIOS. Someone please correct me if I'm wrong on this.


----------



## Panchovix

MikeS3000 said:


> I just secured a PNY XLR8 RTX 4090 that will ship to Best Buy on 12/2 for pickup. Am I going to be disappointed by the 450w max on this card or should I try to get a card that has the 600w bioses?


You can flash a VBIOS and you will be fine. Also, binning is random: I got a TUF (600W), which doesn't help me at all because my chip/VRAM is mediocre, while some Gaming Trio (non-X) users, an MSRP card, can do 3.1GHz on the core and +2000MHz on VRAM. So it's pure luck.


----------



## BigMack70

Sheyster said:


> It's a good board, but you'll need to update the BIOS prior to installing the 13700K, for Raptor CPU support. I've heard that can be challenging and you might need an Alder Lake CPU to update the BIOS. Someone please correct me if I'm wrong on this.


OK. I've got the G.Skill 2x16GB DDR5-6400 CL32 kit and a Strix Z790-E coming Friday. Couldn't find the Z790-F in stock anywhere.

Hopefully my CPU and GPU are OK. I'm not booting my PC again until I've got the new board and memory installed - it can't even go 1-2 minutes at idle without a black screen crash right now.

And of course all this happens two days after the return period for the MSI Edge. RIP my wallet.


----------



## atilcan06

Just secured a Strix OC; it came with BIOS 95.02.18.80.89. Do you guys recommend updating to another one? I'll just benchmark for a couple of weeks  then play games extensively.


----------



## MikeS3000

Panchovix said:


> You can flash a VBIOS and you will be fine: also, binning is random, I got a TUF (600W) which doesn't help me at all because my chip/vram is mediocre: and some Trio Gaming users (non-X) which is MSRP, can do 3.1Ghz on core and +2000Mhz on VRAM, so it's pure luck.


Hmm, well I also just scored a Gigabyte Gaming OC model for $100 more than the PNY. My plan was just to cancel one of the orders. I don't want to be a jerk and open them both and then return one so they have to sell it open box. Would you recommend keeping the cheaper PNY one or is it worth the upcharge for the higher end models?


----------



## Panchovix

MikeS3000 said:


> Hmm, well I also just scored a Gigabyte Gaming OC model for $100 more than the PNY. My plan was just to cancel one of the orders. I don't want to be a jerk and open them both and then return one so they have to sell it open box. Would you recommend keeping the cheaper PNY one or is it worth the upcharge for the higher end models?


The Gaming OC will almost surely overclock well, at least; the PNY is random. The Gaming OC also has a better cooler and temps, so I would keep it.


----------



## Sheyster

MikeS3000 said:


> Hmm, well I also just scored a Gigabyte Gaming OC model for $100 more than the PNY. My plan was just to cancel one of the orders. I don't want to be a jerk and open them both and then return one so they have to sell it open box. Would you recommend keeping the cheaper PNY one or is it worth the upcharge for the higher end models?


Keep the GB Gaming OC. It's worth the extra $100 over the PNY, plus you'll get a 600w max PL BIOS.


----------



## MikeS3000

Sheyster said:


> Keep the GB Gaming OC. It's worth the extra $100 over the PNY, plus you'll get a 600w max PL BIOS.


I think I am going to wait until the Gigabyte actually ships. I'm a bit leery as I got worked over by my local Microcenter when I placed an online order for the single 4090 in stock and they cancelled it on me hours later stating the product was not in stock. The build quality looks to be a lot better on the Gigabyte as well. I have had no problems for 2 years with my Gigabyte RTX 3080 10gb so I guess there is that.


----------



## Sheyster

MikeS3000 said:


> I think I am going to wait until the Gigabyte actually ships. I'm a bit leery as I got worked over by my local Microcenter when I placed an online order for the single 4090 in stock and they cancelled it on me hours later stating the product was not in stock. The build quality looks to be a lot better on the Gigabyte as well. I have had no problems for 2 years with my Gigabyte RTX 3080 10gb so I guess there is that.


I'm surprised your Microcenter allows online orders for anything new (4090, 4080, 13900K, etc.)

The So-Cal Microcenter is in-store only for all the hot new stuff. No online orders or reservations.


----------



## ttnuagmada

MikeS3000 said:


> I just secured a PNY XLR8 RTX 4090 that will ship to Best Buy on 12/2 for pickup. Am I going to be disappointed by the 450w max on this card or should I try to get a card that has the 600w bioses?


Are you sure it's actually capped at 450w? Pretty sure the FE is advertised as 450w too even though it can do 600.

450w will limit overclocking, but it should be fine for stock speeds. Ada isn't as power hungry as Ampere was in terms of pushing an OC. With max clocks/voltage im not power limited at all and still have some headroom.


----------



## MikeS3000

ttnuagmada said:


> Are you sure it's actually capped at 450w? Pretty sure the FE is advertised as 450w too even though it can do 600.
> 
> 450w will limit overclocking, but it should be fine for stock speeds.


I have no idea. I'm just going by the first page of this thread that lists the power limits.


----------



## MikeS3000

Sheyster said:


> I'm surprised your Microcenter allows online orders for anything new (4090, 4080, 13900K, etc.)
> 
> The So-Cal Microcenter is in-store only for all the hot new stuff. No online orders or reservations.


My store in Columbus, OH is allowing online orders to hold for in-store pickup only on GPUs. They do ship some items.


----------



## ianann

DPD today delivered the original Corsair Type-4 12VHPWR cable. Good build quality; the plugs connect easily with a nice, full "click". Ran a half-hour Port Royal stress test at a mild OC with the plug staying at around 30°C. Works for me, and I finally got rid of that bulky NVIDIA adapter. 

cheers.


----------



## dk_mic

ianann said:


> DPD today delivered the original corsair Type-4 12VHPWR Cable. Good build quality, plugs easily connected with a nice and full "click". Ran half an hour Port Royale stresstest at a mild OC with the Plug staying at around 30°C. Works for me and at last got rid of that bulky NVIDIA lash.
> 
> cheers.


Off topic, but you might want to look into RAM overclocking: you're leaving quite a bit of performance on the table running the fabric at 1500 MHz and only 3000 CL15 with 525 tRFC.








A guide to ram overclocking on Zen 3


BEGINNER: First Steps: 1.) Download Thaiphoon Burner. Read the SPD to find out details about which ram die you have. If you buy 3600+ XMP sticks(with 1.35v profiles), they have dies that can do 3800+. If you buy 3200 XMP or lower, you may have dies that do 3333 max. Note: The manufacturers...




www.overclock.net


----------



## ianann

dk_mic said:


> off topic: but you might want to look into ram overclocking. you leave quite a bit of performance on the table running the fabric at 1500 MHz and only 3000 CL15 with 525 TRFC.
> 
> 
> 
> 
> 
> 
> 
> 
> A guide to ram overclocking on Zen 3
> 
> 
> BEGINNER: First Steps: 1.) Download Thaiphoon Burner. Read the SPD to find out details about which ram die you have. If you buy 3600+ XMP sticks(with 1.35v profiles), they have dies that can do 3800+. If you buy 3200 XMP or lower, you may have dies that do 3333 max. Note: The manufacturers...
> 
> 
> 
> 
> www.overclock.net


Thanks, gonna give it a read. Performance-wise I always thought the difference between the 3600 sweet spot and my native XMP 3000 wasn't too much of a giant gap, tbh.


----------



## motivman

4090 gaming oc owners, how are your hotspot temps? mine is hitting 98C in timespy extreme!!!


----------



## MrTOOSHORT

That's bad. Looks like you need a repaste.


----------



## biigshow666

My free Seasonic 12VHPWR cable was delivered today. Seems to be well built. I ran a few benchmarks but I'm not seeing anything over 530 watts of power draw, same as before with the old cables plugged in. Is this the max for my PSU?


----------



## Sheyster

motivman said:


> 4090 gaming oc owners, how are your hotspot temps? mine is hitting 98C in timespy extreme!!!


I haven't run any benchmarks lately, but after a long MW2/WZ2 gaming session it's 75-76C with a 500W power limit. I'm currently using the ASUS Strix V2 performance BIOS.


----------



## mirkendargen

motivman said:


> 4090 gaming oc owners, how are your hotspot temps? mine is hitting 98C in timespy extreme!!!
> 
> View attachment 2584437


Mine was like 83C with the fans maxed when I was still on air, definitely a paste/mounting issue.


----------



## ttnuagmada

biigshow666 said:


> my free seasonic 12VHPWR Cable was delivered today. seems to be well built. ran a few benchmarks but im not seeing anything over 530 watts for power... same as before when i had the old cables plugged in. is this the max for my psu?


What perfcap are you seeing? My guess is that your GPU just doesn't need more than that. Mine tops out in the mid 500's too but that's with a max stable OC on GPU/VRAM with a vrel cap.


----------



## biigshow666

motivman said:


> 4090 gaming oc owners, how are your hotspot temps? mine is hitting 98C in timespy extreme!!!
> 
> View attachment 2584437


----------



## jediblr

motivman said:


> 4090 gaming oc owners, how are your hotspot temps? mine is hitting 98C in timespy extreme!!!


It's not only the 98C hotspot that's too high, but the 76C GPU temp too; at 520W PL, +200 core, +1400 mem, my GPU was 67C.


----------



## sblantipodi

Hi guys,
has anyone done this firmware update from NVIDIA?
The update is meant for Founders and AIB cards.






NVIDIA GPU UEFI Firmware Update Tool | NVIDIA







nvidia.custhelp.com





If yes, can you tell me whether the BIOS version shown in GPU-Z changed after the update?










thanks.


----------



## vigorito

1900


----------



## GRABibus

motivman said:


> 4090 gaming oc owners, how are your hotspot temps? mine is hitting 98C in timespy extreme!!!
> 
> View attachment 2584437


My hotspot is between 7°C and 10°C above GPU temp (whatever the benchmark or game), Gigabyte Gaming OC on the stock air cooler:









So there is an issue on your card


----------



## GRABibus

delete


----------



## GRABibus

motivman said:


> 4090 gaming oc owners, how are your hotspot temps? mine is hitting 98C in timespy extreme!!!
> 
> View attachment 2584437


Which BIOS do you use?

595W max is very good. Mine tops out at 580W.


----------



## motivman

GRABibus said:


> Which Bios do you use ?
> 
> 595W max is very good. Mine top at 576W


Asus strix bios


----------



## Tibby67

motivman said:


> 4090 gaming oc owners, how are your hotspot temps? mine is hitting 98C in timespy extreme!!!
> 
> View attachment 2584437






That's bad.. my Strix 4090 hasn't seen anything over 60C.


----------



## GRABibus

motivman said:


> Asus strix bios


This Strix BIOS gives me 150 points less in PR than the Gigabyte stock BIOS.
Maybe in games it could be interesting.


----------



## yzonker

GRABibus said:


> This strix Bios gives me 150points less in PR than Gigabyte stock Bios.
> Maybe in game it can be interesting


Any idea why? Same clocks? I saw the same thing with the MSI bios. Slightly better Speedway score, but definitely lower PR score. Same clocks.


----------



## GRABibus

yzonker said:


> Any idea why? Same clocks? I saw the same thing with the MSI bios. Slightly better Speedway score, but definitely lower PR score. Same clocks.


No idea... We should check the complete V/F curve. My clocks are a bit lower (peak and average in PR), even though the boost clock from applying the offset with MSI is higher in GPU-Z...??
The MSI BIOS is also bad for me.

Gigabyte BIOS 95.02.18.00.C1 is the best I've tested on my Gigabyte Gaming OC: stability-wise, boost and average clocks, and benchmark scores.

I game at 3075-3090MHz at 4K with 56°C on the core and 65-67°C on the hotspot (+210MHz on core), 20°C ambient, fans @ 100%, voltage slider 100%, PL 133%.


----------



## motivman

I suspect the cooler on my Gigabyte Gaming OC has bad contact on one side of the core; it isn't putting enough pressure on that side. Can anyone with a waterblock on their Gigabyte 4090 Gaming OC measure the thickness of the thermal pads on the core side of the PCB? When I removed the cooler, most of the thermal pads disintegrated, so I need to replace everything on the core side. Any help will be appreciated. I think they are all 1mm pads, but I'm not entirely sure.


----------



## mirkendargen

motivman said:


> I am suspecting the cooler on my gigabyte gaming OC has bad contact on one side of the core. the thermal spread is not putting enough pressure on one side of the core. Can anyone with a waterblock on their gigabyte 4090 gaming OC measure the thermal pad thickness for the pads on the core side of the PCB? when I removed the cooler, most of the thermal pads disintegrated. I need to replace everything on the core side of the PCB. Any help will be appreciated. I think they are all 1mm thermal pads, but not entirely sure.


I didn't measure them with calipers, but they look like 1mm all around, and quite hard (Gelid Extreme/Thermalright Odyssey style)


----------



## motivman

mirkendargen said:


> I didn't measure them with calipers, but they look like 1mm all around, and quite hard (Gelid Extreme/Thermalright Odyssey style)


If you have calipers, any chance you can measure?


----------



## J7SC

I finally started to wrap my head around the Gigabyte Gaming OC / water block setup. I found a (somewhat counterintuitive) way to raise VRAM temps relative to GPU core temps  as VRAM temps are otherwise too cold (err, > not hot enough) in my well-cooled setup. I broke Port Royal 29k+ several times and am now working on +1513 for VRAM (almost had it...).

...performance collage below, also with some earlier posted numbers. Sorry if it is hard to read - I prepped it on a 48-inch OLED where everything looks pretty big ...fyi, the 5950X can still go higher (bottom right) if need be, but for now I want to get down to some DLSS3 Frame Generation fun in MS Flight Simulator 2020. More benching and maybe some subbing in a few days.


----------



## statman28

Anyone with a Gigabyte Aorus Master here that can give me some feedback on the card? Ive been trying to snag a Strix for weeks now but no joy. Aorus seems to be in stock and i'm close to pulling the trigger especially that I have sold my 3080 today. Cheers


----------



## J7SC

statman28 said:


> Anyone with a Gigabyte Aorus Master here that can give me some feedback on the card? Ive been trying to snag a Strix for weeks now but no joy. Aorus seems to be in stock and i'm close to pulling the trigger especially that I have sold my 3080 today. Cheers


I don't have the Aorus Master, but the Gigabyte Gaming OC, which seems to share the same PCB (my warranty card actually said 'Aorus' only), and the aftermarket water-blocks are the same for both. I am very happy with the Gigabyte OC, and it is one of four Gigabyte and Aorus models I currently run for work + play. There is of course always the 'silicon lottery' and all that...

...the only issue I had with the Gigabyte G-OC was a badly placed thermal strip and some TIM issues. Not a biggie as I was going to water-cool anyways... still, no matter which 4090 you end up getting, it pays to run a few hard benches with monitoring software such as HWiNFO open to check max GPU core, hotspot and VRAM temps when you first install the card. Finally, like most 4090s, these things are huge in stock air-cooled format (re. fitting in your case), and they do need to breathe re. airflow.


----------



## Dragonsyph

Wooot, finally got a 4090 TUF OC. Picking it up this weekend at BB.

I have a GF3 1650 ATX 3.0 PSU. It has native 16-pin x2. This will work fine, right? I don't need to buy a CableMod cable?


----------



## J7SC

Dragonsyph said:


> Wooot finally got a 4090 Tuf OC. Picking it up with weekend at BB.
> 
> I have GF3 1650 atx 3.0 psu. It has native 16pin x2. This will work fine right? I don’t need to buy a cable mod?


A native ATX 3.0 PSU w/16-pin (aka 12VHPWR + 4 sense) should work great!


----------



## statman28

J7SC said:


> I don't have the Aorus Master but the Gigabyte Gaming OC which seem to share the same PCBs (my warranty card actually said 'Aorus' only), and aftermarket water-blocks are the same for both. I am very happy with the Gigabyte OC, and it is one of four Gigabyte and Aorus models I currently run for work + play. There is of course always the 'silicon lottery' and all that...
> 
> ...the only issue I had with the Gigabyte-G-OC was a badly placed thermal strip and some TIM issues. Not a biggie as I was going to water-cool anyways...still, no matter which 4090 you end up getting, it pays to run a few hard benches with monitoring software such as HWInfo open to check max GPU core, hotspot and VRAM temps when you installed the card. Finally, like most 4090s, these things are huge in stock air-cooled format (re. fitting in your case), and they do need to breathe re. airflow.


Thanks for the feedback. Appreciate it


----------



## J7SC

statman28 said:


> Thanks for the feedback. Appreciate it


----------



## hmatt1981

This is going to be a stupid question no doubt.

But am I able to install the Zotac Trinity OC BIOS on a normal Trinity card, as they both look the same with one just having a higher boost clock?


----------



## Arizor

motivman said:


> 4090 gaming oc owners, how are your hotspot temps? mine is hitting 98C in timespy extreme!!!
> 
> View attachment 2584437


Sweeeeet Jesus, and was this just one run, not a stress test?! 

Just as a reference point, I contacted ASUS a while back and they said they work with a max allowed Hot Spot of 97C for 4090s, which I think is a very high tolerance, something definitely not right.

I repasted my TUF, and doing a Time Spy Extreme Stress Test (20 loops) with everything maxxed (PL 133% / 1.1 V / 3045MHz core / +1800 VRAM) went from an original 92C hot spot max / 76C GPU core to 82C hot spot / 70C core; well worth repasting if you can.

Also undervolting. If I undervolt 0.975V, 2800-2830 core, +1750VRAM, the same TS:E Stress Test I get 59 core / 69 Hot Spot. Hardly any difference in game performance.
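If anyone wants to log peaks during these stress runs without staring at the overlay, here's a minimal sketch built on nvidia-smi (assumes it's on your PATH; note nvidia-smi only reports the core temperature — hotspot and memory-junction readings still need HWiNFO or GPU-Z):

```python
# Minimal GPU temp/power poller built on nvidia-smi. Core temp only --
# the driver does not expose the hotspot sensor through nvidia-smi.
import subprocess

QUERY = [
    "nvidia-smi",
    "--query-gpu=temperature.gpu,power.draw",
    "--format=csv,noheader,nounits",
]

def parse_sample(line: str) -> tuple[float, float]:
    """Turn one CSV line like '67, 532.18' into (temp_C, power_W)."""
    temp, power = (field.strip() for field in line.split(","))
    return float(temp), float(power)

def poll_once() -> tuple[float, float]:
    """Run nvidia-smi once and parse the first GPU's reading."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True)
    return parse_sample(out.stdout.splitlines()[0])

# usage while the benchmark loops in another window:
#   t, w = poll_once()
```

Wrap poll_once() in a loop with a one-second sleep and keep running maxes if you want session peaks.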


----------



## GRABibus

J7SC said:


> I finally started to wrap my head around the Gigabyte Gaming OC / water block setup. I found a (somewhat counterintuitive) way to raise VRAM temps relative to GPU core temps  as VRAM temps are otherwise too cold (err, > not hot enough) in my well-cooled setup. I broke Port Royal 29k+ several times and am now working on +1513 for VRAM (almost had it...).
> 
> ...performance collage below, also with some earlier posted numbers. Sorry if it is hard to read - I prepped it on a 48 Inch OLED where everything looks pretty big ...fyi, 5950X can still go higher (bottom right) if need be but for now, I want to get down to some DLSS3 Frame Insertion fun on MS FlightSimulatror 2020. More benching and may be some subbing in a few days.
> View attachment 2584552


I have the same scores on the air cooler.
Maybe due to the VRAM OC and the nice boost I get out of the box….
Your max power draw is 532W; mine is 576W.

Do you know why yours is lower?


----------



## EarlZ

Are these figures normal? I even have the card at 70%PL


----------



## Arizor

EarlZ said:


> Are these figures normal? I even have the card at 70%PL
> View attachment 2584601


Yes if you look at the tool tip you'll see the 'min/max' have very different definitions, e.g. the max is all aggregated values. Don't worry about it.


----------



## EarlZ

Arizor said:


> Yes if you look at the tool tip you'll see the 'min/max' have very different definitions, e.g. the max is all aggregated values. Don't worry about it.


Thanks, I never realized that. With the 3090 it was just the actual max of a reading, so I've never seen it show something like 480 watts or higher.


----------



## bmagnien

J7SC said:


> I found a (somewhat counterintuitive) way to raise VRAM temps relative to GPU core temps  as VRAM temps are otherwise too cold (err, > not hot enough) in my well-cooled setup.


Care to share more details/photos on your methodology here?


----------



## alasdairvfr

J7SC said:


> I finally started to wrap my head around the Gigabyte Gaming OC / water block setup. I found a (somewhat counterintuitive) way to raise VRAM temps relative to GPU core temps  as VRAM temps are otherwise too cold (err, > not hot enough) in my well-cooled setup. I broke Port Royal 29k+ several times and am now working on +1513 for VRAM (almost had it...).
> 
> ...performance collage below, also with some earlier posted numbers. Sorry if it is hard to read - I prepped it on a 48 Inch OLED where everything looks pretty big ...fyi, 5950X can still go higher (bottom right) if need be but for now, I want to get down to some DLSS3 Frame Insertion fun on MS FlightSimulatror 2020. More benching and may be some subbing in a few days.
> View attachment 2584552


How are you set up on CPU? My 5950x for sure is holding back my scores on most benches.


----------



## Panchovix

Arizor said:


> Sweeeeet Jesus, and was this just one run, not a stress test?!
> 
> Just as a reference point, I contacted ASUS a while back and they said they work with a max allowed Hot Spot of 97C for 4090s, which I think is a very high tolerance, something definitely not right.
> 
> I repasted my TUF, and doing a Time Spy Extreme Stress Test (20 loops) with everything maxxed (PL 133% / 1.1 V / 3045ghz core / +1800 VRAM) went from an original 92C hot spot max / 76 GPU core to 82C hot spot / 70 core; well worth repasting if you can.
> 
> Also undervolting. If I undervolt 0.975V, 2800-2830 core, +1750VRAM, the same TS:E Stress Test I get 59 core / 69 Hot Spot. Hardly any difference in game performance.


You're tempting me soooo much to open my TUF and repaste, but damn, warranty here in Chile  
I've had the TUF for a month now and all is fine so far (well, except that it sucks at OC and has coil whine, but those are not grounds for warranty here in Chile), so I may do it soon.


----------



## motivman

Arizor said:


> Sweeeeet Jesus, and was this just one run, not a stress test?!
> 
> Just as a reference point, I contacted ASUS a while back and they said they work with a max allowed Hot Spot of 97C for 4090s, which I think is a very high tolerance, something definitely not right.
> 
> I repasted my TUF, and doing a Time Spy Extreme Stress Test (20 loops) with everything maxxed (PL 133% / 1.1 V / 3045ghz core / +1800 VRAM) went from an original 92C hot spot max / 76 GPU core to 82C hot spot / 70 core; well worth repasting if you can.
> 
> Also undervolting. If I undervolt 0.975V, 2800-2830 core, +1750VRAM, the same TS:E Stress Test I get 59 core / 69 Hot Spot. Hardly any difference in game performance.


yeah something is wrong with my 4090. It uses A LOT more power compared to my Founders Edition, and for some reason the hotspot is very high on this card. My first Gigabyte Gaming OC 4090 did not have this issue. I ordered KPX, 1.5mm and 1mm thermal pads, and calipers to repaste the entire core side of the PCB. I wonder if the hotspot is strictly the GPU die or somewhere else on the card, like the VRMs. My GPU core stays very cool, but the hotspot is always 20-25C higher. Right now, I am running the card at stock settings to keep things cool. Also bummed that the Phanteks GPU blocks are sold out at this time, SMH.


----------



## motivman

J7SC said:


> I don't have the Aorus Master but the Gigabyte Gaming OC which seem to share the same PCBs (my warranty card actually said 'Aorus' only), and aftermarket water-blocks are the same for both. I am very happy with the Gigabyte OC, and it is one of four Gigabyte and Aorus models I currently run for work + play. There is of course always the 'silicon lottery' and all that...
> 
> ...the only issue I had with the Gigabyte-G-OC was a badly placed thermal strip and some TIM issues. Not a biggie as I was going to water-cool anyways...still, no matter which 4090 you end up getting, it pays to run a few hard benches with monitoring software such as HWInfo open to check max GPU core, hotspot and VRAM temps when you installed the card. Finally, like most 4090s, these things are huge in stock air-cooled format (re. fitting in your case), and they do need to breathe re. airflow.


What TIM and thermal pad issue did you have? I am currently having a hotspot issue with my card which I cannot seem to resolve. I have repasted multiple times, but my hotspot almost touches 100C whenever I run Time Spy Extreme or Control at 4K maxed out (no DLSS)... Do you happen to know what size thermal pads were used on the core side of the PCB? I want to repaste the entire thing with Gelid Extreme and TFX while waiting for the Phanteks block to come back in stock.


----------



## yzonker

motivman said:


> yeah something is wrong with my 4090. It uses A LOT more power compared to my founders edition, and for some reason the hotspot is very high on this card. My first 4090 gigabyte OC did not have this issue. I ordered KPX, 1.5mm and 1mm thermal pads and calipers to repaste the entire core side of the PCB. I wonder if hotspot is strictly the GPU die or somewhere else on the card, like the VRM's. My GPU core stays very cool, but hotspot is always 20-25C higher. right now, I am running the card on stock settings to keep things cool. Also bummed that the phanteks gpu blocks are sold out at this time, SMH.


All I know is the VRM temp was included in the hotspot for the 30 series. It likely still is, but that's not confirmed AFAIK.


----------



## yzonker

Interesting note in the 3DMark update?

"Added NVAPI settings query for ECC video memory on supported NVIDIA hardware "


----------



## dr/owned

Random question for the WC nerds: does anyone make a 3-way ball valve (1 input, 2 outputs, with only 1 open at a time)? I want to be able to bypass my server for maintenance without having to undo a bunch of quick disconnects.


----------



## motivman

What is interesting: in this video der8auer measures the thermal pad thickness for the memory modules at 2mm, but I just triple-measured the ones that came off my card with a caliper and it's 1mm???? Could this be part of the reason my card has ****ty hotspot temps? Can anyone with a Gigabyte 4090 Gaming OC measure the original thermal pad thickness for your memory modules please?


----------



## dr/owned

motivman said:


> What is interesting, in this video der8auer measure thermal pad thickness for the memory modules at 2mm, but I just triple measured the ones that came off my card with a caliper and its 1mm???? could this be part of the reason my card has ****ty hotspot temps? can anyone with a gigabyte 4090 gaming OC measure the original thermal pad thickness for your memory modules please?


The Master and OC don't use the same heatsink:

OC looks like 1mm thermal pads and much lower standoffs:










Master has much taller standoffs:


----------



## mirkendargen

dr/owned said:


> Random question for the WC nerds: anyone make a 3 way ball valve (1 input, 2 outputs with only 1 running at a time)? I want to be able to bypass my server for maintenance but not have to undo a bunch of quick disconnects.


I'm sure you can find generic brass plumbing ones with barbs the size of your tubing if you're using soft tubing. You could also use an unvalved Y fitting with a QD on each of the two arms; not sure if 2 QDs count as a bunch or not.


----------



## GRABibus

alasdairvfr said:


> My 5950x for sure is holding back my scores on most benches.


In TS, for sure, as there is a CPU score.
For PR, maybe, but I'm not sure…

I can do 29050-29100 on my Gigabyte gaming OC on air with my 5950X.
Next week, my AM5 build with 7950X will be ready and I will post if I see different scores in PR with this CPU.


----------



## J7SC

motivman said:


> what TIM and thermal pad issue did you have? I am currently having a hotspot issue with my card, which I cannot seem to resolve. I have repasted multiple times, but my hotspot almost touches 100C whenever running timespy extreme or control with 4k maxed out (no DLSS)... do you happen to know what size thermal pads were used on the core side of the PCB? I want to repaste the entire thing with gelid extreme and tfx, while waiting for the phanteks block to come back in stock.


...my hotspot started to hit about 92 C (20 C above general GPU temp) at full blast while air-cooled at ~ max peak 570+ W and 25 C ambient. When I opened the card up, the TIM looked like it had some blank spots and two areas where it had clumped on the die, perhaps due to a misaligned thermal strip on one side of the VRM caps. From what I recall, factory pads were different sizes depending on the exact location, but I measured a couple at 1 mm with a caliper. FYI, while the Giga-G-OC and Master have mostly the same PCB and interchangeable water-blocks, the air cooler (and thus potentially pads) are a bit different on the Master, also because it has that LCD screen and related room and prep.

...after water-cooling with the Bykski block, and using Gelid OC for the die, thermal putty (TG 10) for the VRAM and the Bykski-included thermal pads for the VRM, plus an extra heatsink on the backplate, my temps dropped dramatically - almost too low for the VRAM (below 40 C, see pic above). Also, as already mentioned a couple of times before by others and myself, max peak wattage dropped by about 30W - 40W after water-cooling...I think that is the 3x GPU fans and GPU RGB now not connected. That also contributes further to lower temps.

When air-cooled, the card hit 3210 MHz core on light 3D loads as posted before, but the hotspot issue prevented anything more than 3120 MHz under medium to heavy loads. When air-cooled, VRAM would settle around +1513 to +1526...with water-cooling, I can hit 3165+ MHz (depending on bench), but VRAM maxed out in the mid +1400s...though now I am back at +1500 or so (hoping for a bit more when pursuing a new settings approach I'll share later; need to confirm it works beyond just PR). There is a really narrow range where the VRAM gets beyond 40 C but the GPU stays below 43 C...some of it I can regulate w/ambient heat. With the 4090, I had to rethink several things when stretching the card's legs compared to the 3090 Strix (same mobo, RAM, CPU settings), and at the end of the day, I like to find the absolute max for my specific card and specific setup, compared against my own prior results. When all is said and done (other than perhaps new drivers etc.), I can concentrate on the fun bits - DLSS3 / Frame Insertion...works superbly well for FS2020, but I guess we're still waiting for Cyberpunk 2077's update on that?


----------



## SDEagle

jediblr said:


> This is for MSI GeForce RTX 4090 GAMING TRIO owners. I flashed Suprim X silent bios on my silent bios , so i can swap.
> For stability test i run 20 loop Port Royal + UE 5.1 Matrix demo all max in DLDSR 4k on my 2 k monitor.
> On 520PL Suprim X bios i can run +200 core +1400 mem , on 480PL default game bios +250 core +1400 mem, after day of tests for 24/7 i make 2 profile :
> 1. default bios +250 core +1300 mem 1.05 v 480PL // 2. UV+OC default bios +250 core +1300 mem 0.975 v 480PL
> 520PL Suprim X bios only for benches, we dont need it for daily gaming
> P.S. new 526.86 driver is good(only DWM-HAGS bug sux) , and for G-sync and fps limit questions look at this



@jediblr, could you please share your stock MSI 4090 Suprim X VBIOS that you flashed on Gaming Trio? Thank you.

I have a Gaming Trio 4090 and would like to flash the same VBIOS. Also, your scores on the Gaming Trio (with the Suprim X bios) look really good. Great job!


----------



## WayWayUp

My founders finally came in 🙌🏻

I was worried about the bin because my 3090 was an ultra golden bin and I didn’t want to be disappointed with my 4090 sample

But so far so good; it's a nice sample. I've only done 30 min of playing around, but I was able to run Time Spy and FSU at +1700 memory.
I also saw the core jump over 3100, but nothing crazy so far. It needs more testing later, but for now I'm pretty happy.


----------



## PhuCCo

I measured my Gaming OC pads and this is what I came up with. 
I checked over and over to make sure I wasn't measuring wrong, but don't take these as absolute. It would be nice if someone else could do the same and compare results. 
I am definitely getting a consistent difference between the 1mm and 1.2mm pads, so I do think they are slightly different and not just measurement tolerance. I measured the pads lengthwise so that I also measure the areas that aren't compressed.


----------



## motivman

PhuCCo said:


> I measured my Gaming OC pads and this is what I came up with.
> I checked over and over to make sure I wasn't measuring wrong, but don't take these as absolute. It would be nice if someone else could do the same and compare results.
> I am definitely getting a consistent difference between the 1mm and 1.2mm pads, so I do think they are slightly different and not just measurement tolerance. I measured the pads lengthwise that way I also measure the areas that aren't compressed.


Thanks a lot!!! YOU ARE A LIFE SAVER. I have repasted my card 3 times now, but the hotspot still hits 97-98C in Time Spy Extreme, so I'm not sure what the issue is. My first Gaming OC certainly did not have this problem; on that one, the hotspot was about 10C higher than core temps. Your results are also consistent with my own measurements. I am not sure where to get 1.2mm pads from, though. Would it be safe to use 1.5mm pads in place of 1.2mm? I bought Gelid Extreme thermal pads, but I'm not sure if those compress well. Any recommendations for good thermal pads that compress well?


----------



## yzonker

Looks like Hwbot, at least, may be headed toward ECC requirements - that update was for detecting ECC. But it's a 4-5% performance penalty.






Giant 3D elephant in the room no one wants to talk about...(why is it always me bringing this stuff to light?)


3D benching as we know it is broken. Plain and simple. What are we going to do about it? RTX-4090's are great cards but they also bug very easily. We need to be better users. If you see your score jump 1k point after raising your mem clock 15mhz that is not normal and not a "lucky" run. When your...



community.hwbot.org


----------



## Panchovix

By the way guys, has anyone noticed the difference between 2600MHz, 2800MHz, 3000MHz and maybe 3100MHz on the core? I know VRAM alone can give you like 8-10% if you get lucky, but the core itself? Maybe another 10% from 2600 to 3100MHz?


----------



## Roacoe717

Will this kill the pump? I tested both up and down and I still got the same temps and benchmarks


----------



## PhuCCo

motivman said:


> thanks a lot!!! YOU ARE A LIFE SAVER. I have repasted my card 3 times now, but hotspot still hits 97-98C in timespy extreme, so not sure what the issue is. my first gaming OC certainly did not have this problem. On that one, hotspot was about 10C higher than core temps. Your results are also consistent with my own measurements . I am not sure where to get 1.2mm pads from though. would it be safe to use 1.5mm pads in place of 1.2mm? I bought Gelid extreme thermal pads, but not sure if those compress well. Any recommendations for good thermal pads that compress well?


Can you post a picture of your card when you take it apart? Seeing the compression marks on the pads and the thermal paste spread on the cooler would maybe help show the issue.

The oddball sized pads are always a headache. You either try to add a buffer to thinner pads to try and get the extra thickness, or you try to compress thicker pads in hopes that they compress enough. You could use 1mm pads and add some thermal paste on top of the pads to make the material a bit thicker. Or just use some thermal putty in place of the 1.2mm pads. EVGA has thermal putty on their website that you can buy in a syringe and apparently it's pretty good stuff. I haven't personally tested it, but I am going to grab some and try it. You just extrude it out and it fills in all of the gaps.


----------



## PhuCCo

Roacoe717 said:


> Will this kill the pump? I tested both up and down and I still got the same temps and benchmarks
> View attachment 2584685


Probably not ideal unless the pump is located in the middle of the radiator like some of the non-Asetek AIOs. Pretty sure that card has the pump in the actual block. If you don't hear gurgling, I personally wouldn't worry about it.


----------



## SDEagle

Can someone please share the latest Suprim X (non-liquid) bios/rom that has the fix for 93% PL please. Thanks, and really appreciate it.


----------



## KingEngineRevUp

Panchovix said:


> By the way guys, have someone noticed the difference between 2600Mhz, 2800Mhz, 3000Mhz and maybe 3100Mhz on the core? I know VRAM only can give you like 8-10% if you get lucky, but the core itself? Maybe another 10% from 2600 to 3100Mhz?


It's roughly: 1% extra on the core is a 0.5% performance gain.


----------



## KingEngineRevUp

Roacoe717 said:


> Will this kill the pump? I tested both up and down and I still got the same temps and benchmarks
> View attachment 2584685


That's a bad position to have your radiator and pump.

I have a guide on mounting it on the side 



http://imgur.com/a/vopXdWs

 try it and take pictures


----------



## WayWayUp

KingEngineRevUp said:


> It's almost 1% extra on the core is 0.5% performance gain.


Wut?


----------



## mirkendargen

motivman said:


> thanks a lot!!! YOU ARE A LIFE SAVER. I have repasted my card 3 times now, but hotspot still hits 97-98C in timespy extreme, so not sure what the issue is. my first gaming OC certainly did not have this problem. On that one, hotspot was about 10C higher than core temps. Your results are also consistent with my own measurements . I am not sure where to get 1.2mm pads from though. would it be safe to use 1.5mm pads in place of 1.2mm? I bought Gelid extreme thermal pads, but not sure if those compress well. Any recommendations for good thermal pads that compress well?


Get some of the Arctic TP-3 pads in 1.5mm. They're SUPER soft and will compress easily to both 1.2mm and 1.0mm (and less if necessary) without compromising core contact.


----------



## KingEngineRevUp

WayWayUp said:


> Wut?







For every 1% you increase your core, it translates to about a 0.5% performance gain...


----------



## Arizor

KingEngineRevUp said:


> For every 1% you increase your core it translates to about 0.5% performance gain...


Wait so, let's say baseline is 2600mhz. Every 26mhz increase adds 0.5%? Surely not. That would mean going from 2600mhz to 3100mhz would increase performance by 20%. Can't say I've seen that in my games or benches.


----------



## yzonker

Arizor said:


> Wait so, let's say baseline is 2600mhz. Every 26mhz increase adds 0.5%? Surely not. That would mean going from 2600mhz to 3100mhz would increase performance by 20%. Can't say I've seen that in my games or benches.


Isn't that 10%?

(500/26)*.5 = 9.6%

But I was actually testing that earlier today and found that PR, for example, only gains about 1/3 of the clock-speed increase. So 1/2 might be optimistic, but I'm sure it varies from game to game.
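A quick sketch of that math for anyone who wants to plug in their own clocks - the scaling factor is the assumption here (~1/2 as the optimistic rule of thumb, ~1/3 as measured in PR):

```python
# Estimated performance gain from a core overclock, assuming performance
# scales at some fixed fraction of the relative clock increase.
def oc_gain_pct(base_mhz: float, oc_mhz: float, scaling: float = 0.5) -> float:
    clock_gain_pct = (oc_mhz - base_mhz) / base_mhz * 100
    return clock_gain_pct * scaling

# 2600 -> 3100 MHz with the optimistic 1/2 rule of thumb:
print(round(oc_gain_pct(2600, 3100, 0.5), 1))    # ~9.6
# Same jump with the ~1/3 scaling seen in Port Royal:
print(round(oc_gain_pct(2600, 3100, 1 / 3), 1))  # ~6.4
```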


----------



## Arizor

yzonker said:


> Isn't that 10%?
> 
> (500/26)*.5 = 9.6%
> 
> But I was actually testing that earlier today and found for example PR only gains about 1/3 of the clock speed increase. So 1/2 might be optimistic but I'm sure it varies from game to game.


Oh yeah, my bad. Still, 10% seems optimistic; I'd say closer to 5% for the jump from 2600 to 3GHz. VRAM can give a 7-8% bump from stock to +1700ish in my testing. So basically, with a good OC you can get a 10-12% jump.


----------



## dr/owned

mirkendargen said:


> Get some of the Arctic TP-3 pads in 1.5mm. They're SUPER soft and will compress easily to both 1.2mm and 1.0mm (and less if necessary) without compromising core contact.


I can confirm this. They're more like putty than pad.

Also, I've given this advice before: warm your card and pads up in an oven set very low, like 150-170F, before clamping everything together. You'll get waaaay better compression on any pad. Plastics don't care until 185F minimum (acrylic softens at ~210F).


----------



## KingEngineRevUp

Arizor said:


> Wait so, let's say baseline is 2600mhz


I see yzonker helped you with your math. Besides that, what card runs at 2600 MHz? The cards are going to boost into the 2700-2800 range naturally.









NVIDIA GeForce RTX 4090 Founders Edition Review - Impressive Performance


The NVIDIA GeForce RTX 4090 Founders Edition offers huge gains over its predecessors. It's the first graphics card to get you 4K 60 FPS with ray tracing enabled, and upscaling disabled. Do you prefer 120 FPS instead of 60? Just turn on DLSS 3.




www.techpowerup.com


















Unless you run them in a hot box or something and they don't boost at all.


----------



## Roacoe717

KingEngineRevUp said:


> That's a bad position to have your radiator and pump.
> 
> I have a guide on mounting it on the side
> 
> 
> 
> http://imgur.com/a/vopXdWs
> 
> try it and take pictures


Fixed


----------



## KingEngineRevUp

Roacoe717 said:


> Fixed
> View attachment 2584695


Much better!


----------



## J7SC

Arizor said:


> Oh yeah my bad. Yeah 10% seems optimistic still, I’d say closer to 5% in the jump from 2600 to 3ghz. VRAM can give a 7-8% bump from stock to +1700ish in my testing. So basically with a good OC you can get 10-12% jump.


Basically, the 4090's design is so powerful that it seems to be a bit memory-constrained to begin with; who knows what it could do with GDDR7/X or HBM2/3.

In any event, the VRAM has to be at least ~43 C under load before it really gets going on my setup.


----------



## motivman

J7SC said:


> Basically, the 4090s design is so powerful that it seems to be a bit memory-constrained to begin with, who knows what it could do with GDDR7/X or HBM2/3.
> 
> In any event, the VRAM has to be at least 43 C +- under load before it really gets going on my setup.


Well, I would say VRAM is very temp-sensitive on the 4090. For example, I can bench at +1800 all day if the VRAM is hot, i.e. above 45C, but try to start a bench when the computer has been sitting and the VRAM is around 35C, and my screen goes black right away. I do not have this issue at all at +1600; it runs at any temperature.


----------



## J7SC

motivman said:


> well I would say vram is very temp sensitive with the 4090. Say for example, I can bench at +1800 all day if the vram is hot ie above 45C, but try to start a bench when the computer has been sitting, and vram is around 35C, and my screen goes black right away. I do not have this issue at all ath +1600, it runs at any temperature.


...I guess what I am saying is that when a 5% boost in core speed gets you 3 extra peanuts and a 5% boost in VRAM gets you an extra pound of peanut butter, the chip is VRAM constrained (I better ask @Arizor to check my math on that, though )


----------



## Arizor

KingEngineRevUp said:


> I see yzonker helped you with your math besides that, what card runs at 2600 MHz? The cards are going to boost into the 2700-2800 range naturally.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA GeForce RTX 4090 Founders Edition Review - Impressive Performance
> 
> 
> The NVIDIA GeForce RTX 4090 Founders Edition offers huge gains over its predecessors. It's the first graphics card to get you 4K 60 FPS with ray tracing enabled, and upscaling disabled. Do you prefer 120 FPS instead of 60? Just turn on DLSS 3.
> 
> 
> 
> 
> www.techpowerup.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unless if you run them in a hot box or something and they dont boost at all.


Fair enough, but above 2800MHz I see hardly any gains in games, whereas I see much more from a VRAM OC. I suppose my proper question should have been about scaling beyond a certain point: I see diminishing returns after 2800MHz, whilst VRAM seems to scale better.



J7SC said:


> ...I guess what I am saying is that when a 5% boost in core speed gets you 3 extra peanuts and a 5% boost in VRAM gets you an extra pound of peanut butter, the chip is VRAM constrained (I better ask @Arizor to check my math on that, though )


Aha trick question! A pound of peanut butter is the same as 3 extra large peanuts! Give me my certificate!


----------



## J7SC

Arizor said:


> Fair enough, but anything above 2800mhz I see hardly any gains in games, I do see much more in VRAM OC. I suppose my proper question should have been about scaling beyond a certain amount. I see diminishing returns after 2800mhz, whilst VRAM seems to scale better.
> 
> 
> 
> Aha trick question! A pound of peanut butter is the same as 3 extra large peanuts! Give me my certificate!


You won a crown of peanuts (medium-sized)


----------



## Arizor

J7SC said:


> You won a crown of peanuts (medium-sized)


As long as the crown fits Prince Charles' head, that enormous noggin would keep me in peanuts for a decade.


----------



## Arizor

To add some actual worthwhile content, does anyone have a solution for the GPU running at a locked low Mhz after waking from Sleep in Windows 11?

I put my computer to Sleep quite often, and rarely (but often enough to irritate), I'll wake it, boot up a game, and the clock refuses to go above 780MHz until I reboot. Some weird power management setting?


----------



## motivman

Hmmm, I just discovered another issue with my 4090... I am on driver 526.98... anyone having trouble turning on ECC? I could turn it on with my FE, but with my Gigabyte, after a reboot I check ECC and it's still turned off??


----------



## KingEngineRevUp

motivman said:


> well I would say vram is very temp sensitive with the 4090. Say for example, I can bench at +1800 all day if the vram is hot ie above 45C, but try to start a bench when the computer has been sitting, and vram is around 35C, and my screen goes black right away. I do not have this issue at all ath +1600, it runs at any temperature.


Yeah, who would have thought water-cooling would actually hurt performance? I was able to bench at +1700 when my card on air was a little warmer, but now on water I instantly crash if I go even 50 above +1500.

Makes me wonder what my memory could do if it were warmer.


----------



## Arizor

Yep, gone are the old tricks of benching on cold days / blasting fans directly at the GPU. I tried it one cold morning in the first week and got instant crashes until my card had hit something like 45C.


----------



## motivman

motivman said:


> hmmm, I just discovered another issue with my 4090... I am on driver 526.98... anyone having trouble turning on ECC? I could on my FE, but with my gigabyte, after reboot, I check ECC, and its still turned off??


Turns out it was a BIOS issue. I cannot enable ECC on the latest Gigabyte gaming BIOS (F2); switching to the old silent BIOS, ECC enabled with no issues... SMH


----------



## Arizor

motivman said:


> turns out it was a bios issue. cannot enable ECC on the latest gaming bios for gigabyte (F2), switch to silent old bios, and ECC was enabled with no issues.. SMH


Gigabyte are such a weird company. They can make really good hardware (the Gaming OC seems to be a clear winner amongst 4090 AIBs, and the Aorus looks good) and yet they still screw it up with weird hot spots and QC.

They also seem to universally put out absolutely atrocious software; I've had a monitor of theirs and a graphics card, and for both the software was almost unusable. Not in the least surprised their BIOSes are dodgy.


----------



## motivman

Arizor said:


> Gigabyte are such a weird company. They can make really good hardware (Gaming OC seems to be a clear winner amongst 4090 AIBs, Aorus looks good) and yet screw it up still with weird hot spots and QC.
> 
> They seem to universally put out absolutely atrocious software; I've had a monitor of theirs and a graphics card and for both the software was almost unusable. Not in the least surprised their BIOSs are dodgy.


Yeah, I am at my wits' end with this 4090. Too bad my FE sold within 10 minutes of putting it up for sale; otherwise I would have kept the FE and sent this card back...


----------



## KingEngineRevUp

Arizor said:


> Yep gone are the old tricks of benching on cold days / blasting fans directly at the GPU. I tried it one cold morning in the first week and got instant crashes until my card had hit something like 45C.


Imagine people using thermal pads that are purposely not that great so the memory heats up.

@yzonker you're going to have to get a water heater now instead of your chiller lol


----------



## J7SC

Arizor said:


> Gigabyte are such a weird company. They can make really good hardware (Gaming OC seems to be a clear winner amongst 4090 AIBs, Aorus looks good) and yet screw it up still with weird hot spots and QC.
> 
> They seem to universally put out absolutely atrocious software; I've had a monitor of theirs and a graphics card and for both the software was almost unusable. Not in the least surprised their BIOSs are dodgy.


I really haven't had too much trouble with Gigabyte and I like their engineering (not their 'software', though). I do wonder whether the combo of Covid / post-Covid and the mining collapse introduced huge stress on assembly lines (re. quality of assembly) across the industry. Even Asus had issues (e.g. early Z690 Hero boards). For hardware such as mobos + GPUs, I typically rely on Asus, Gigabyte or MSI. For what it is worth, I really do like the Gigabyte 4090 G-OC, but then again, I probably would have liked the Asus or MSI as well had I gotten one of those 4090s. Now that I have figured out the temps, I can turn up the heat for winter and regulate the OC just with central heat .

As to your earlier comment on worthwhile content, we're watching '2001' (directed by Stanley Kubrick) on the 2nd monitor / tv...brilliant, especially the first part before the computer (freaky HAL 9000) kills all/most of the humans.


----------



## Arizor

KingEngineRevUp said:


> Imagine people using Thermal pads that are purposely not that great so the memory heats up.
> 
> @yzonker your going to have to get a water heater now instead of your chiller lol


Who's going to be the first to hook up the MO-RA3 to get fed from the jacuzzi?



J7SC said:


> I really haven't had too much trouble with Gigabyte and like their engineering (not liking their 'software', though). I wonder though whether the combo of Covid / post-Covid and mining collapse did not introduce huge stress on the assembly line (re. quality of assembly) across the industry. Even Asus had issues (ie. early Z690 Hero boards). For hardware such as mobos + GPUs, I typically rely either on Asus, Gigabyte or MSI. For what it is worth, I really do like the 4090 Gigabyte G-OC, but then again, I probably would like the Asus or MSI as well if I would have gotten one of those 4090s. Now that I figured out the temps, I can turn up the heat for winter and regulate the OC just with central heat .
> 
> As to your earlier comment on worthwhile content, we're watching '2001' (directed by Stanley Kubrick) on the 2nd monitor / tv...brilliant, especially the first part before the computer (freaky HAL 9000) kills all the humans.


Yeah it's always a gamble, I stick to ASUS and MSI after my bad experiences with GB, but honestly it's the software that kills me.

Absolute classic film, especially enjoyed on an OLED, just nabbed a 65" LG for the living room now Black Friday sales have hit Australia as of this morning, looking forward to firing up a good movie on it.


----------



## J7SC

Arizor said:


> Who's going to be the first to hook up the MO-RA3 to get fed from the jacuzzi?
> 
> 
> 
> Yeah it's always a gamble, I stick to ASUS and MSI after my bad experiences with GB, but honestly it's the software that kills me.
> 
> Absolute classic film, especially enjoyed on an OLED, just nabbed a 65" LG for the living room now Black Friday sales have hit Australia as of this morning, looking forward to firing up a good movie on it.


...yup, now watching the last part of '2001' on a 48 inch OLED...it's great to have a second work monitor that is also a tv (most of the time). When the movie is over, it's time for some FS2020 4K120 DLSS3/Fr.In. on the LG C1 .


----------



## motivman

J7SC said:


> I really haven't had too much trouble with Gigabyte and like their engineering (not liking their 'software', though). I wonder though whether the combo of Covid / post-Covid and mining collapse did not introduce huge stress on the assembly line (re. quality of assembly) across the industry. Even Asus had issues (ie. early Z690 Hero boards). For hardware such as mobos + GPUs, I typically rely either on Asus, Gigabyte or MSI. For what it is worth, I really do like the 4090 Gigabyte G-OC, but then again, I probably would like the Asus or MSI as well if I would have gotten one of those 4090s. Now that I figured out the temps, I can turn up the heat for winter and regulate the OC just with central heat .
> 
> As to your earlier comment on worthwhile content, we're watching '2001' (directed by Stanley Kubrick) on the 2nd monitor / tv...brilliant, especially the first part before the computer (freaky HAL 9000) kills all/most of the humans.


Man, all you have to do is use those blue thermal pads that come with EKWB blocks on your VRAM. While troubleshooting the hotspot issue on my 4090, I used those stock EKWB pads on my VRAM at first, and my VRAM temps went up by 8-10C compared to the Gelid Ultimate pads I am using now. So when I get my waterblock, I will just use those pads on my VRAM. I turned on ECC on my card and tested my VRAM: I can do +1800 with no issues, +1850 is an instant crash. Average VRAM temp was around 66C.


----------



## Azazil1190

motivman said:


> Man, all you have to do is use those blue thermal pads that comes with EKWB blocks on your vram. While troubleshooting the hotspot issue on my 4090, I used those stock ekwb pads on my vram at first, and my vram temps went up by 8-10C compared to the gelid ultimate pads i am using now. So when I get my waterblock, I will just use those pads on my VRAM. I turned on ECC on my card, and tested my vram. I can do +1800 with no issues. +1850 is instant crash. average vram temp was around 66C.


Does it matter for enabling ECC if you have 16GB of system RAM?
I'm asking because I can't enable it on my TUF.
My second system is a 10900K with 16GB RAM.


----------



## motivman

Azazil1190 said:


> Does it matter if you have 16gb ram to enable the ecc?
> Im asking because i cant enable it at my tuf.
> My second system 10900k and 16gb ram


I had to switch to the original Silent BIOS on my card to enable ECC. It does not work on the latest Gigabyte Gaming OC performance BIOS.


----------



## Hanks552

What do you guys think about the Zotac AMP Extreme? I'm thinking about getting one to replace my MSI Trio; I'm tired of waiting for Bykski to release a waterblock for MSI, and they already have one for Zotac. The Extreme also has more GPU and VRAM power stages.


----------



## KingEngineRevUp

motivman said:


> Man, all you have to do is use those blue thermal pads that comes with EKWB blocks on your vram. While troubleshooting the hotspot issue on my 4090, I used those stock ekwb pads on my vram at first, and my vram temps went up by 8-10C compared to the gelid ultimate pads i am using now. So when I get my waterblock, I will just use those pads on my VRAM. I turned on ECC on my card, and tested my vram. I can do +1800 with no issues. +1850 is instant crash. average vram temp was around 66C.


Really? My stock pads have my memory maxing out at 50C, unless I'm not reading that right for the 40 series.


----------



## motivman

KingEngineRevUp said:


> Really? My stock pads have my memory max out at 50C, unless if I'm not reading that right for the 40 series.


Is this on water or air? My temps are based on the air cooler. I do not have a waterblock yet.


----------



## mirkendargen

motivman said:


> Man, all you have to do is use those blue thermal pads that comes with EKWB blocks on your vram. While troubleshooting the hotspot issue on my 4090, I used those stock ekwb pads on my vram at first, and my vram temps went up by 8-10C compared to the gelid ultimate pads i am using now. So when I get my waterblock, I will just use those pads on my VRAM. I turned on ECC on my card, and tested my vram. I can do +1800 with no issues. +1850 is instant crash. average vram temp was around 66C.


I don't think cranking the memory speed is gonna do what you think it's doing with ECC on. Yeah it's stable at those speeds....because it's correcting all the errors that the speed is causing and actually performing worse.
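This is the practical upshot: with ECC on, "didn't crash" stops being a useful signal, so the memory offset is better picked by benchmark score than by stability. A tiny sketch of that idea with made-up numbers (the scores below are illustrative only, not measurements from any card):

```python
# With ECC enabled, the card silently corrects/retries memory errors, so a
# too-high offset shows up as a LOWER benchmark score rather than a crash.
# Pick the offset by peak score, not by "it didn't crash".
# All scores here are invented for illustration.

scores = {
    +1500: 25100,
    +1600: 25240,
    +1700: 25370,
    +1800: 25410,  # still error-free: score keeps rising
    +1850: 24900,  # ECC retries kick in: "stable" but slower
}

best_offset = max(scores, key=scores.get)
print(best_offset)  # 1800: the highest-scoring offset, not merely the highest stable one
```

Running the real loop would mean re-running Port Royal (or similar) at each offset and recording the score, but the selection logic is just this argmax.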


----------



## KingEngineRevUp

motivman said:


> is this on water or air? my temps are based on the air cooler. I do not have a waterblock yet


Oh it's on water. I was confused since you were talking about using EKWB stock pads.


----------



## WayWayUp

So I tried out frame generation in plague tale

Honestly I couldn't tell a difference in IQ switching back and forth between them.

I thought I wouldn't like it, but it turned out really good.

Now I need either a 4K 240Hz monitor or a new 8K ultrawide 🤣 otherwise DLSS 3.0 will go to waste; it more than maxes out my 4K 144Hz display.


----------



## motivman

mirkendargen said:


> I don't think cranking the memory speed is gonna do what you think it's doing with ECC on. Yeah it's stable at those speeds....because it's correcting all the errors that the speed is causing and actually performing worse.


Well, with ECC off I can run Port Royal up to +1900 and get artifacts; with ECC on, anything above +1850 crashes instantly.


----------



## J7SC

In my humble opinion, the complainers and the resulting ECC farce have only served to make HWBot and 3DMark management look really bad...

On VRAM temps while water-cooling: I have an old LN2 pot with a separate, super-thin heating element powered by a Molex connector (meant for the back of old HEDT Intel CPU mobos). I am reasonably certain its shape, a hollow rectangle/square, would fit on the VRAM without impacting the die. The VRAM is already encased in thermal putty for the water-block, and I'm sure I could make it work, after carefully checking adjusted temps. But at the end of the day, I'm happy with my scores and the incredible gaming performance of pretty much any 4090 powering a 4K120 OLED; that's the main use for the 4090. Per spoiler, I just ran a quick Port Royal and Speed Way at 26C ambient, with neither GPU nor VRAM maxed but at game settings (for those games I even bother to OC), and I love those straight monitoring lines on the graphs at such ambient... maybe I'll keep that old heating element for the 4090 Ti instead 


Spoiler


----------



## jediblr

SDEagle said:


> @jediblr, could you please share your stock MSI 4090 Suprim X VBIOS that you flashed on Gaming Trio? Thank you


This is my Suprim X Gaming (not Silent) BIOS, 480/520 PL (with latest update): MSI.4090.SUPX.GAMER.rom
P.S. And this is the default BIOS, 450/480 PL (also updated): MSI.RTX4090.TRIO.GAMING.rom


----------



## ianann

mirkendargen said:


> Get some of the Arctic TP-3 pads in 1.5mm. They're SUPER soft and will compress easily to both 1.2mm and 1.0mm (and less if necessary) without compromising core contact.


Good to know, Arctic's sheets say their thermal conductivity is even superior to many other well-known pads, while being way softer. I already have all three thicknesses incoming for testing. Are you happy with them?



dr/owned said:


> I can confirm this. They're more like putty than pad.
> 
> Also I've given this advice before: warm your card and pads up in an oven set very low...like 150-170F before clamping everything together. You'll get waaaay better compression on any pad. Plastics don't care until 185F minimum (acrylic softens at 210).


Indeed, I can confirm that although I use a hair dryer for carefully heating up the card and block. Works like a charm.


----------



## Panchovix

Arizor said:


> Fair enough, but anything above 2800mhz I see hardly any gains in games, I do see much more in VRAM OC. I suppose my proper question should have been about scaling beyond a certain amount. I see diminishing returns after 2800mhz, whilst VRAM seems to scale better.


This is probably one of the most important things about core OC: if it doesn't scale that well, some people may get a net benefit by overclocking to the ~2800 MHz range (assuming your card doesn't hit that at stock; mine doesn't, lol, 2705 MHz stock), then overclocking the VRAM as far as it will go, and then undervolting or power limiting. I'd guess ~2800 MHz at anything under 1.1 V would use a lot less power.

For max overclocking potential, though, 3-3.1 GHz on the core is the objective.


----------



## lawson67

Hanks552 said:


> What do you guys think about the zotac AMP extreme? Thinking about getting one to replace my msi trio, tired of waiting for bykski to replace a waterblock for MSI, they already have one for zotac. And the extreme have more gpu and vram stages


I have a Zotac AMP Extreme which I am very happy with. My VRAM scales up to 2000 MHz and my core clock pushes high enough for me. I have not bothered flashing a higher-watt BIOS, as adding extra power does not seem to achieve much better results than 450W even when I have pushed 500W using the power slider. I've also found that in games like Shadow of the Tomb Raider a slight bump on VRAM adds more FPS than pushing the core clock 300 MHz higher. Below are a couple of benchmarks from my Zotac on the stock BIOS, and they're good enough for me. I was thinking of putting the card on a waterblock before I got it, but now I am not even going to bother.


----------



## coelacanth

Will a Seasonic Prime 850W Titanium (SSR-850TR) be enough for Zotac Trinity OC? CPU is a stock 5950X. This is for gaming, not going to OC the GPU. I have five SSDs, two mechanical HDs, and 7 fans as well.

Seems it should be OK but have read about transient spikes causing crashes with smaller PSUs.


----------



## lawson67

coelacanth said:


> Will a Seasonic Prime 850W Titanium (SSR-850TR) be enough for Zotac Trinity OC? CPU is a stock 5950X. This is for gaming, not going to OC the GPU. I have five SSDs, two mechanical HDs, and 7 fans as well.
> 
> Seems it should be OK but have read about transient spikes causing crashes with smaller PSUs.


I would say it's the bare minimum. Personally I would want at least 1000W, and of course you should take transient spikes into consideration, which could force you onto a 1000W PSU anyhow.


----------



## coelacanth

lawson67 said:


> I would say its the bare minimum however personally i would want at least 1000w and of course you should take in the consideration of transient spikes which could force you to get a 1000w psu anyhow


Thanks. By the way I've always enjoyed your posts going back to the Qnix days.


----------



## Sheyster

coelacanth said:


> Will a Seasonic Prime 850W Titanium (SSR-850TR) be enough for Zotac Trinity OC? CPU is a stock 5950X. This is for gaming, not going to OC the GPU. I have five SSDs, two mechanical HDs, and 7 fans as well.
> 
> Seems it should be OK but have read about transient spikes causing crashes with smaller PSUs.


If everything is stock, 850W is probably okay. Having said this, I would not go less than 1000W myself. PSUs have a "sweet spot" for efficiency, generally somewhere between 50-60 percent of full load. You'll have more room for transients with 1000W as well. A good 1000W Titanium or Platinum is the way to go.

This PSU is a steal right now, Black Friday price:









Super Flower Leadex Platinum SE 1200W 80+ Platinum, SF-1200F14MP V2 - www.newegg.com
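The 50-60 percent sweet-spot rule turns into quick arithmetic. A rough sketch with assumed component wattages (all figures here are illustrative assumptions, not measured values, and no substitute for proper PSU sizing):

```python
# Rough PSU sizing sketch using the "50-60% of rated capacity" efficiency
# sweet-spot rule of thumb. Component wattages below are assumptions.

def sweet_spot_rating(sustained_load_w: float, target_utilization: float = 0.55) -> float:
    """Rated wattage that puts the sustained load at the efficiency sweet spot."""
    return sustained_load_w / target_utilization

# Assumed system: ~450 W GPU + ~150 W CPU + ~100 W rest of system
sustained = 450 + 150 + 100  # 700 W
print(round(sweet_spot_rating(sustained)))  # 1273 -> points at a 1200-1300 W unit

# Transient check: 4090 spikes have been reported around 2x TGP for milliseconds
transient_peak = 450 * 2 + 150 + 100  # 1150 W momentary
print(transient_peak <= 1200)  # True: still inside a 1200 W unit's rating
```

Same math explains why an 850W unit feels marginal here: 700W sustained is already ~82 percent of its rating, with no headroom for spikes.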


----------



## arvinz

Anyone interested in a TUF OC brand new/sealed? Got it from a Best Buy drop end of October and haven't opened it. Managed to grab an FE and will likely keep that instead.

Selling for what I paid for. I'm in Canada so price came out to $2802.39 CAD. You cover shipping cost + Paypal G&S 4% fees.

I'm within the return window but figured I ask here in case anyone's looking for one, PM me.


----------



## Sheyster

arvinz said:


> Anyone interested in a TUF OC brand new/sealed? Got it from a Best Buy drop end of October and haven't opened it. Managed to grab an FE and will likely keep that instead.
> 
> Selling for what I paid for. I'm in Canada so price came out to $2802.39 CAD. You cover shipping cost + Paypal G&S 4% fees.
> 
> I'm within the return window but figured I ask here in case anyone's looking for one, PM me.


GLWS. I didn't realize you folks in Canada paid so much for these cards. I did the conversion to USD and was surprised! I'm sure someone in Canada will appreciate your offer. These cards are still hard to get!


----------



## coelacanth

Sheyster said:


> If everything is stock 850w is probably okay. Having said this, I would not go less than 1000w myself. PSU's have a "sweet spot" for efficiency and it's generally somewhere between 50-60 percent of full load. You'll have more room for transients with 1000w as well. A good 1000w Titanium or Platinum is the way to go.
> 
> This PSU is a steal right now, Black Friday price:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Super Flower Leadex Platinum SE 1200W 80+ Platinum, 10 Years Warranty, ECO Fanless & Silent Mode, Full Flat Ribbon Modular Power Supply, Dual Ball Bearing Fan, SF-1200F14MP V2 - Newegg.com
> 
> 
> Buy Super Flower Leadex Platinum SE 1200W 80+ Platinum, 10 Years Warranty, ECO Fanless & Silent Mode, Full Flat Ribbon Modular Power Supply, Dual Ball Bearing Fan, SF-1200F14MP V2 with fast shipping and top-rated customer service. Once you know, you Newegg!
> 
> 
> 
> 
> www.newegg.com


Are there Cablemod or Fasgear adapters that fit the Super Flower?


----------



## Sheyster

coelacanth said:


> Are there Cablemod or Fasgear adapters that fit the SuperFlower?


Yes, the E-series (EVGA) CableMod cables all work. Super Flower is the OEM for all the good EVGA PSU series (T2, P2, etc.)


----------



## J7SC

Sheyster said:


> If everything is stock 850w is probably okay. Having said this, I would not go less than 1000w myself. PSU's have a "sweet spot" for efficiency and it's generally somewhere between 50-60 percent of full load. You'll have more room for transients with 1000w as well. A good 1000w Titanium or Platinum is the way to go.
> 
> This PSU is a steal right now, Black Friday price:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Super Flower Leadex Platinum SE 1200W 80+ Platinum, 10 Years Warranty, ECO Fanless & Silent Mode, Full Flat Ribbon Modular Power Supply, Dual Ball Bearing Fan, SF-1200F14MP V2 - Newegg.com
> 
> 
> Buy Super Flower Leadex Platinum SE 1200W 80+ Platinum, 10 Years Warranty, ECO Fanless & Silent Mode, Full Flat Ribbon Modular Power Supply, Dual Ball Bearing Fan, SF-1200F14MP V2 with fast shipping and top-rated customer service. Once you know, you Newegg!
> 
> 
> 
> 
> www.newegg.com


Nice find...btw, the 1200W Leadex Platinum is also available at a great price ... unfortunately the 1600W is still 'up there' in terms of dollars, never mind the 2000W model


----------



## Sheyster

J7SC said:


> Nice find...btw, the 1200W Leadex Platinum is also > available at a great price ... unfortunately the 1600W is still 'up there' in terms of dollars, never mind the 2000W model


The 1600w Titanium is also at an all-time low price today!

Link:









Super Flower Leadex Titanium 1600W 80+ Titanium, SF-1600F14HT - www.newegg.com





I can personally attest to the quality of this one. Best PSU I've ever owned hands down. The Platinum 1600w is not on sale for some reason.


----------



## coelacanth

Sheyster said:


> The 1600w Titanium is also at an all-time low price today!
> 
> Link:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Super Flower Leadex Titanium 1600W 80+ Titanium, 10 Years Warranty, ECO Fanless & Silent Mode, Full Modular Power Supply, Dual Ball Bearing Fan, SF-1600F14HT - Newegg.com
> 
> 
> Buy Super Flower Leadex Titanium 1600W 80+ Titanium, 10 Years Warranty, ECO Fanless & Silent Mode, Full Modular Power Supply, Dual Ball Bearing Fan, SF-1600F14HT with fast shipping and top-rated customer service. Once you know, you Newegg!
> 
> 
> 
> 
> www.newegg.com
> 
> 
> 
> 
> 
> I can personally attest to the quality of this one. Best PSU I've ever owned hands down. The Platinum 1600w is not on sale for some reason.


Wow Super Flower is blowing these PSUs out. May have to pick one up. I have a CoolerMaster V1000 I could pull from another system but it looks like no Cablemod adapters for that old model.


----------



## J7SC

Sheyster said:


> The 1600w Titanium is also at an all-time low price today!
> 
> Link:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Super Flower Leadex Titanium 1600W 80+ Titanium, 10 Years Warranty, ECO Fanless & Silent Mode, Full Modular Power Supply, Dual Ball Bearing Fan, SF-1600F14HT - Newegg.com
> 
> 
> Buy Super Flower Leadex Titanium 1600W 80+ Titanium, 10 Years Warranty, ECO Fanless & Silent Mode, Full Modular Power Supply, Dual Ball Bearing Fan, SF-1600F14HT with fast shipping and top-rated customer service. Once you know, you Newegg!
> 
> 
> 
> 
> www.newegg.com
> 
> 
> 
> 
> 
> I can personally attest to the quality of this one. Best PSU I've ever owned hands down. The Platinum 1600w is not on sale for some reason.


Yeah, in a 110V environment, the Superflower Leadex 1600W Titanium would be my choice as well...for others in 220V regions, there's that 2000W Titanium model (which rarely goes on sale).



Sheyster said:


> GLWS. I didn't realize you folks in Canada paid so much for these cards. I did the conversion to USD and was surprised! I'm sure someone in Canada will appreciate your offer. These cards are still hard to get!


...depends where / from what chain you buy in Canada; my Giga G-OC was the equivalent of US$ 1,619 and the TUF OC is listed within $100 (neither one available now, though)


----------



## Sheyster

J7SC said:


> Yeah, in a 110V environment, the Superflower Leadex 1600W Titanium would be my choice as well...for others in 220V regions, there's that 2000W Titanium model (which rarely goes on sale).


I actually have two 220V outlets in my house. One is intended for an EV in the garage, the other for an electric clothes dryer.  Neither one is in use today. Hmmm...


----------



## arvinz

Sheyster said:


> GLWS. I didn't realize you folks in Canada paid so much for these cards. I did the conversion to USD and was surprised! I'm sure someone in Canada will appreciate your offer. These cards are still hard to get!


Ya they're definitely pricey over here! I may end up returning it if nobody wants it. I did actually order the Optimus waterblock for it so I'll have to cancel or return that as well and wait for their FE block.


----------



## J7SC

Sheyster said:


> I actually have two 220V outlets in my house. One is intended for an EV in the garage, the other for an electric clothes dryer.  Neither one is in use today. Hmmm...


Yup, the 220V (240V) Dryer outlet in our place has my attention for all the wrong reasons.

@arvinz ...see my earlier post - I find that Canada Computers and Memory Express in Canada are very competitive (if they have stock) compared to the big US retailers


----------



## coelacanth

J7SC said:


> Yeah, in a 110V environment, the Superflower Leadex 1600W Titanium would be my choice as well...for others in 220V regions, there's that 2000W Titanium model (which rarely goes on sale).
> 
> 
> 
> ...depends where / from what chain you buy in Canada; my Giga G-OC was the equivalent of US$ 1,619 and the TUF OC is listed within $100 (neither one available now, though)





Sheyster said:


> The 1600w Titanium is also at an all-time low price today!
> 
> Link:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Super Flower Leadex Titanium 1600W 80+ Titanium, 10 Years Warranty, ECO Fanless & Silent Mode, Full Modular Power Supply, Dual Ball Bearing Fan, SF-1600F14HT - Newegg.com
> 
> 
> Buy Super Flower Leadex Titanium 1600W 80+ Titanium, 10 Years Warranty, ECO Fanless & Silent Mode, Full Modular Power Supply, Dual Ball Bearing Fan, SF-1600F14HT with fast shipping and top-rated customer service. Once you know, you Newegg!
> 
> 
> 
> 
> www.newegg.com
> 
> 
> 
> 
> 
> I can personally attest to the quality of this one. Best PSU I've ever owned hands down. The Platinum 1600w is not on sale for some reason.







This is exactly what I wanted to avoid, buying all sorts of new stuff. Hanging out in this thread is dangerous. Thanks all for the input.


----------



## Azazil1190

Does anyone have the stock TUF OC performance BIOS? Not the v1 or v2?
I want to try something before RMA.


----------



## cheddardonkey

I'm super thankful today, on T-day: OUT with the (month-old) Waterforce 4090 and IN with the new 4090 FE. Lottery round 2..

The FE build quality feels a lot more premium than the Aorus. No comparison benches yet. I wanted a quiet experience, which is why I opted for the Waterforce in the first place, but its cooling isn't that much better than air.
I sold the Waterforce for $2k (retail) and got the FE for retail. With the $300 I saved, I plan to put that towards building my first custom loop. Who can recommend a good block for the FE?


----------



## Nero Augustus

Is there any Windows/NVIDIA setting to help improve scores when benchmarking Port Royal and the like? Not talking about bugs like the artifacts thing, but more like optimizations you can do on the Windows/NVIDIA side of things (or anywhere else) to improve your score. I'm having fun OCing this card, but I'm hitting a wall playing around in Afterburner, so I'm wondering if there's anything else.


----------



## Nd4spdvn

Sheyster said:


> I can personally attest to the quality of this one. Best PSU I've ever owned hands down.


Same here. Best PSU I ever owned. I've had one for about 3-4 years and it runs flawlessly and strong. Not a single worry with the 4090 Suprim X and its transient spikes, which can reach up to 900-1000W (PMD-verified).


----------



## ianann

cheddardonkey said:


> Who can recommend a good block for the FE?


There aren’t too many released to date and they all have similar performance so just go whichever tickles your fancy. Bykski, EKWB, …


----------



## J7SC

@MrTOOSHORT and others....

My (free) Seasonic 12VHPWR cable arrived today for my Seasonic Prime Platinum PX1300. It has 2x PCIe 8 pin connectors at the PSU end, unlike the 4x PCIe8 pin connectors of the PSU-specific Cablemod 12VHPWR cable I am currently using...both are rated for 600W and both seem to be very good quality re. wire gauge and sleeving - which one would you prefer ?

Also, looking at the pic of the back of the PX1300, does it make any difference which of the available connectors on the PSU side I use for 2x PCIe (in the case of the new Seasonic cable) or 4x PCIe (the currently-mounted Cablemod Seasonic-specific set) ?


----------



## ianann

J7SC said:


> @MrTOOSHORT and others....
> 
> My (free) Seasonic 12VHPWR cable arrived today for my Seasonic Prime Platinum PX1300. It has 2x PCIe 8 pin connectors at the PSU end, unlike the 4x PCIe8 pin connectors of the PSU-specific Cablemod 12VHPWR cable I am currently using...both are rated for 600W and both seem to be very good quality re. wire gauge and sleeving - which one would you prefer ?
> 
> Also, looking at the pic of the back of the PX1300, does it make any difference which of the available connectors on the PSU side I use for 2x PCIe (in the case of the new Seasonic cable) or 4x PCIe (the currently-mounted Cablemod Seasonic-specific set) ?
> View attachment 2584842


Won’t matter too much on a high quality PSU I’d say. Go with whichever feels higher quality for you and fits your colour scheme.


----------



## J7SC

ianann said:


> Won’t matter too much on a high quality PSU I’d say. Go with whichever feels higher quality for you and fits your colour scheme.


...Thanks ! Colour scheme is black-and-white (+blue-only RGB), so coin-flip


----------



## mirkendargen

J7SC said:


> @MrTOOSHORT and others....
> 
> My (free) Seasonic 12VHPWR cable arrived today for my Seasonic Prime Platinum PX1300. It has 2x PCIe 8 pin connectors at the PSU end, unlike the 4x PCIe8 pin connectors of the PSU-specific Cablemod 12VHPWR cable I am currently using...both are rated for 600W and both seem to be very good quality re. wire gauge and sleeving - which one would you prefer ?
> 
> Also, looking at the pic of the back of the PX1300, does it make any difference which of the available connectors on the PSU side I use for 2x PCIe (in the case of the new Seasonic cable) or 4x PCIe (the currently-mounted Cablemod Seasonic-specific set) ?
> View attachment 2584842


I like the color of the Cablemod better, but I like the 1:1 wires/pins design of the Seasonic one over the unnecessary splicing in the Cablemod one.


----------



## dr/owned

Sheyster said:


> I actually have two 220V outlets in my house. One is intended for an EV in the garage, the other for an electric clothes dryer.  Neither one is in use today. Hmmm...


I have my dryer outlet split for my desktop. My desktop lives in the laundry room / server room. Sound becomes irrelevant and the 220V outlet is a bonus. (Edit: third bonus is I only need 1 watercooling loop for all my computers)

The EV outlet in the garage doubles up for my welder.


----------



## coelacanth

J7SC said:


> @MrTOOSHORT and others....
> 
> My (free) Seasonic 12VHPWR cable arrived today for my Seasonic Prime Platinum PX1300. It has 2x PCIe 8 pin connectors at the PSU end, unlike the 4x PCIe8 pin connectors of the PSU-specific Cablemod 12VHPWR cable I am currently using...both are rated for 600W and both seem to be very good quality re. wire gauge and sleeving - which one would you prefer ?
> 
> Also, looking at the pic of the back of the PX1300, does it make any difference which of the available connectors on the PSU side I use for 2x PCIe (in the case of the new Seasonic cable) or 4x PCIe (the currently-mounted Cablemod Seasonic-specific set) ?
> View attachment 2584842


I was wondering about this too. There are 12vhpwr adapters with 2, 3, and 4 plugs for the PSU but they all say 600W.


----------



## ianann

coelacanth said:


> I was wondering about this too. There are 12vhpwr adapters with 2, 3, and 4 plugs for the PSU but they all say 600W.


Any modern higher-tier PSU with at least 1000W should be fine with two connectors. The 4-plug cables only split the current over more connectors, potentially giving more stability on lower-end and lower-power PSUs. IMHO (not an electrical engineer) this most likely won't affect any of you high-enders here.

Corsair goes with 2x 8-pin, be quiet! with 2x 12-pin plugs, so ... if anyone knows better, please correct me.


----------



## Roacoe717

MSI Liquid X, 600W BIOS (thanks!)


motivman said:


> rename to .rom


Thanks for this. I ran a couple of benchmarks and got great results; once I have more time I will do more tweaking.


----------



## Petrarca

Hey guys, is there any list of good/bad AIB cards? So far I know Zotac has bad/noisy fans and Palit PCBs are trash.


----------



## lawson67

Petrarca said:


> Hey guys. There is any list bad/good AIB cards? For now I know Zotac has bad/noisy fans, Palit pcbs are trash.


I can only speak for myself, but the fans on my Zotac AMP Extreme are not bad/noisy, and if they get that way I have a 5-year warranty.


----------



## Nico67

ianann said:


> Any modern higher tier PSU with at least 1000W should be fine with two connectors. The 4-Plug cables only split the current over more connectors, potentially leading to more stability on lower end and - power PSUs. IMHO (No electrical engineer) this most likely won’t affect any of you high-enders here.
> 
> Corsair goes with 2*8 Pin, BeQuiet with 2*12 Pin Plug, so … If anyone knows better, please correct me.


2, 3, or 4 x 8-pin cables can only really pull 300W (ATX 2.3 spec) if you go by the connector spec times how many wires are being used. The difference here is the wire gauge: most are using 16 AWG, which can handle higher current.
A lot of 3080/3090 users were well into the 500-600W range on 2x 8-pin cards and never had a problem using aftermarket 16 AWG cables. I had two Moddiy cables myself and they never even got warm.
The best use case for 3 or 4 x 8-pin cables is multi-rail PSUs, so you don't end up overloading one rail and blowing up the PSU. Otherwise, 2 or 3 x 8-pin cables are just cleaner: 2x 8-pin uses 6+6 wires and 3x 8-pin uses 4+4+4 wires to get the 12 active wires on the 16-pin; just make sure they are 16 AWG.
I did see one vendor saying the pins are only designed for 17 AWG max, so not too sure what to make of that.
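The 6+6 / 4+4+4 wire counts above can be sanity-checked with quick arithmetic. The per-pin ampacity below is an assumed rule-of-thumb figure for 16 AWG connector pins, not a quote from any spec sheet:

```python
# Sketch of the 12VHPWR wire math: the 16-pin connector has 12
# current-carrying wires (6x 12V + 6x ground), fed by 2x or 3x 8-pin
# tails at the PSU end. The per-pin current figure is an assumption.

VOLTS = 12.0
AMPS_PER_PIN = 9.5  # assumed rule-of-thumb rating for a 16 AWG connector pin

def cable_capacity_w(twelve_volt_wires: int) -> float:
    """Deliverable power given the number of 12 V conductors."""
    return twelve_volt_wires * AMPS_PER_PIN * VOLTS

# Whether the split is 6+6 (2x 8-pin) or 4+4+4 (3x 8-pin), the connector
# still ends up with 6 live 12 V wires, so capacity is the same:
print(cable_capacity_w(6))  # 684.0 W of headroom above a 600 W card
```

Which is why, as the post says, the wire gauge matters more than how many 8-pin tails feed the PSU side.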


----------



## J7SC

Nico67 said:


> 2,3 or 4 x 8pin cables can only really pull 300w (ATX2.3spec), if you go buy the connector spec x how many wires are being used. The difference here is the wire gauge as they are mostly using 16awg which can handle higher current.
> A lot of 3080/ 3090 users were well into the 500/ 600w range on 2 x 8pin cards and never had a problem using aftermarket 16awg cables. I had two Moddiy cables myself and they never even got warm.
> Best usage case for 3 or 4 x 8pin cables is multi rail PSU's, so you don't end up overloading one rail and blowing up the PSU. Other wise 2 or 3 x pin cables are just cleaner. 2 x 8pin uses 6+6 wires and 3 x 8pin uses 4+4+4 wires to get the 12 active wires on the 16pin, just make sure they are 16awg.
> I did see one vendor saying the pins are only designed for 17awg max, so not too sure what to make of that


...re. single rail, makes perfect sense. I'm just wondering about PSU-internal heat spacing / distribution because those 16awg cables do get warmish (not hot though) when they flow max power.


----------



## AvengedRobix

Petrarca said:


> Hey guys. There is any list bad/good AIB cards? For now I know Zotac has bad/noisy fans, Palit pcbs are trash.


My amp extreme is so quiet


----------



## bmagnien

If only thermals mattered more for the 4090, this would be a super tempting plug and play solution: This massive external cooler can chill up to four GPUs at once - VideoCardz.com









Bykski External Water-cooled Unit 1080 - www.coolinglab.co.jp


----------



## bmagnien

bmagnien said:


> If only thermals mattered more for the 4090, this would be a super tempting plug and play solution: This massive external cooler can chill up to four GPUs at once - VideoCardz.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bykski External Water-cooled Unit 1080
> 
> 
> Bykski External Water-cooled Unit 1080
> 
> 
> 
> 
> www.coolinglab.co.jp


Lol this is so tempting - someone walk me off the ledge:


----------



## Larkonian

coelacanth said:


> Will a Seasonic Prime 850W Titanium (SSR-850TR) be enough for Zotac Trinity OC? CPU is a stock 5950X. This is for gaming, not going to OC the GPU. I have five SSDs, two mechanical HDs, and 7 fans as well.
> 
> Seems it should be OK but have read about transient spikes causing crashes with smaller PSUs.


I am running a SSR-850TR myself with my 4090 TUF at 450w. No issues so far (a month). I recall the PSU being able to deliver more than 1000w in a review I saw. Personally I will wait until a selection of ATX 3.0 PSUs is available before I upgrade.

System specs:
Overclocked 12700KF with DDR4
3 NVME, 1 SATA
1 D5 pump and 12 fans


----------



## KingEngineRevUp

cheddardonkey said:


> I'm super Thankful today, on T day, OUT with the (month) old Waterforce 4090 and IN with the new 4090 FE. Lottery round 2..
> 
> FE build quality feels a lot more premium than the Aorus. No comparison benches yet. I wanted a quiet experience, which is why I opted for the Waterforce in the first place, but the cooling isn't that much better than air.
> I sold the waterforce for $2k (retail) and got the FE for retail. With the $300 I saved, I plan to put that towards building my first custom loop. Who can recommend a good block for the FE?


Alphacool is probably going to be the best bang for the buck and it comes with a 1 slot replacement bracket.

Bykski is the cheapest but you have to reuse your 3 slot bracket which might not look aesthetically pleasing depending how you mount it, vertically or horizontally.

EKWB is subjectively the best looking but is still expensive for its thermal performance.

Optimus is super expensive, I personally wouldn't go for this block because this generation extreme cooling isn't doing anything in terms of performance and OC where previously you would get 1-2 extra boost bins with an Optimus.

So in terms of performance, they're all the same, as in you don't get any real noticeable performance benefit. Honestly, this generation water-cooling is hurting performance lol. It turns out it chills the memory so much that it drops below its optimal temperature range, and many of us can't OC the memory as high as we could on air.


----------



## J7SC

bmagnien said:


> If only thermals mattered more for the 4090, this would be a super tempting plug and play solution: This massive external cooler can chill up to four GPUs at once - VideoCardz.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bykski External Water-cooled Unit 1080
> www.coolinglab.co.jp


Basically, it looks like Watercool / Heatkiller's external enclosure setup for their MoRa that's been around for many years...see vid below...depending on the rest of your system, RTX 4090's VRAM would probably go into deep sleep ?


----------



## Laithan

J7SC said:


> ...re. single rail, makes perfect sense. I'm just wondering about PSU-internal heat spacing / distribution because those 16awg cables do get warmish (not hot though) when they flow max power.


Agree and also wonder about power distribution. All is fine for regular people who don't push limits but we are not them. 600W may be the official spec, but then again 150W was the official spec of a single 8-pin and we could run 300W+ over it in previous GPU generations. We cannot expect to double it again with 12VHPWR knowing it is really just using the same 8-pins lol. We see that some of the quality power supply companies are like yeah we can handle the *stock 600W* with only 2 8-pins....perhaps with a silent promise to never *exceed* that (and we know some of us will). How much headroom remains?

We are likely to see a 4090Ti.. and even now we have seen a 1000W BIOS screenshot for the 4090 Strix. I question if the same 2 8-pin cables would do well with that 1000W BIOS or a shunt mod. I know I wouldn't feel very comfortable exceeding 600W with only 2 8-pins..


----------



## KingEngineRevUp

Honestly not sure if we're going to see a 4090 Ti only because

The 3090 Ti exists because NVIDIA made way too many 30-series chips due to mining (one of the reasons EVGA probably left), and from what we're seeing, the 3090 Ti is taking away 4080 sales. Essentially, Nvidia cannibalized their own product.

Not sure if they want to do this to themselves again. But we'll see.


----------



## bmagnien

J7SC said:


> Basically, it looks like Watercool / Heatkiller's external enclosure setup for their MoRa that's been around for many years...see vid below...depending on the rest of your system, RTX 4090's VRAM would probably go into deep sleep ?


Yeah def just a mora but the mora doesn’t come with a res, pump, 9 fans, and full enclosure all included for less than $500 shipped with taxes


----------



## changboy

I got my CableMod cable and ran the OC test again; I score higher now. My GPU core is at 3075MHz and VRAM at +1900MHz. My CPU is OCed at 5.1GHz (10980XE) and RAM at around 3400MHz.








I scored 28 098 in Port Royal
Intel Core i9-10980XE Extreme Edition Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com

I scored 10 677 in Speed Way
Intel Core i9-10980XE Extreme Edition Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com

I scored 32 574 in Time Spy
Intel Core i9-10980XE Extreme Edition Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com





I think with a new CPU like a 13900K and DDR5 at 7200MHz I could score 29k in Port Royal. PC parts change way too fast.


----------



## keikei

KingEngineRevUp said:


> Honestly not sure if we're going to see a 4090 Ti only because
> 
> The 3090 Ti exist because NVIDIA made way too many 30 series chips due to mining (one of the reasons EVGA probably left) and from what we're seeing, 3090 Ti is taking away 4080 sales. Essentially, Nvidia cannibalize their own product.
> 
> Not sure if they want to do this to themselves again. But we'll see.


I can see a 4090 Ti existing only to further the performance distance from AMD (beating a dead horse) or as a late 4090 replacement. In either scenario, the odds are very low imo. In terms of performance, Green is sitting pretty.


----------



## KingEngineRevUp

changboy said:


> think with a new cpu like 13900k and ddr5 at 7200mhz i can score 29k in port royal


You'll need to raise your memory OC higher than that. PR loves memory OC.


----------



## bigfootnz

At the moment I'm waiting for a water block for my PNY XLR8 4090, so I'm just playing with the card on my test bench. The card is not much of an overclocker, probably due to the 450W BIOS limit. But I'm having a problem with the Port Royal stress test: whether OCed or stock, it always fails after some period of time. On the other hand, the Time Spy stress test passes just fine with an OC of +200/+1200.

Is anyone else experiencing this problem with the Port Royal stress test? Or maybe my card is defective? Thanks

I've removed all CPU and RAM OC; it's still crashing in the Port Royal stress test.


----------



## Nico67

J7SC said:


> ...re. single rail, makes perfect sense. I'm just wondering about PSU-internal heat spacing / distribution because those 16awg cables do get warmish (not hot though) when they flow max power.


With single rail I don't think it would matter, as all the 12v connectors join internally onto one bus.



Laithan said:


> Agree and also wonder about power distribution. All is fine for regular people who don't push limits but we are not them . 600W may be official spec but then again 150W was the official spec of a single 8-pin but we could run 300W+ over it in previous GPU generations. We cannot expect to double it again with 12VHPWR knowing it is really just using the same 8-pins lol. We see that some of the quality power supply companies are like yeah we can handle the *stock 600W* with only 2 8-pins....perhaps with a silent promise to never *exceed* that (and we know some of us will). How much headroom remains?
> 
> We are likely to see a 4090Ti.. and even now we have seen a 1000W BIOS screenshot for the 4090 Strix. I question if the same 2 8-pin cables would do well with that 1000W BIOS or a shunt mod. I know I wouldn't feel very comfortable exceeding 600W with only 2 8-pins..


I personally think that 600w is pretty close to the max; not sure I would like to get near 1000w without the side effect of LN2 cooling the connector.
I also don't think 3x 8-pins would do any better, as they aren't actually 8 pins; there are really only 4 pins in an 8-pin connector. There's a reason the HOF has two 16-pin connectors

When you really think about it: 6-pin was 75w, 8-pin was really a 6-pin with two sense pins at 150w, and 16-pin is really two 6-pins with four sense pins that can do 600w. Can't remember what 12-pin was, maybe 300w on a 3090 FE? None of it really makes sense, and it largely came down to the wire gauge used in combination with the connector.
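To make the "none of it really makes sense" point concrete, here's a quick sketch of watts per current-carrying 12 V pin for each connector; the pin counts are commonly cited figures I'm assuming, not quotes from any spec:

```python
# Watts and amps per current-carrying 12 V pin for each PCIe power connector.
# Pin counts are commonly cited figures (an assumption, not quoted from a spec).
connectors = {
    "6-pin (75 W)":    (75, 3),   # usually 3x 12 V pins (sometimes only 2 populated)
    "8-pin (150 W)":   (150, 3),  # same 3x 12 V pins, extra grounds plus sense
    "12VHPWR (600 W)": (600, 6),  # 6x 12 V pins plus 4 sideband pins
}

watts_per_pin = {name: w / pins for name, (w, pins) in connectors.items()}

for name, wpp in watts_per_pin.items():
    print(f"{name}: {wpp:.1f} W per 12 V pin ({wpp / 12:.2f} A)")
```

By these numbers an 8-pin pin is rated for twice the load of a 6-pin pin, and a 12VHPWR pin for twice that again, so the real headroom comes down to wire and terminal quality rather than connector arithmetic.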


----------



## Nero Augustus

changboy said:


> I think with a new cpu like 13900k and ddr5 at 7200mhz i can score 29k in port royal, pc part change way too fast .


CPU doesn't matter in Port Royal. You might get a very, very small bump (maybe 50 points at best?) but you clearly won't gain 1k. Memory might make a difference, but once again far from 1k.


----------



## changboy

KingEngineRevUp said:


> You'll need to raise your memory OC higher than that. PR loves memory OC.


My memory is XMP 3200MHz and I OC it at 3468MHz; I think I can't go higher than this.



Nero Augustus said:


> CPU doesn't matter in port royal. You might get a very, very small bump (maybe 50 points at best?) but you clearly won't gain 1k. memory might make a diff but once again far from 1k


The Hall of Fame top 100 are all new CPUs; there is no 10980XE there! I OCed mine from 5.0GHz to 5.1GHz, tweaked the memory a little, and gained 400 points lol. All those new CPUs have nearly double the single-core performance of mine, so it helps a lot even in Port Royal.
I saw new DDR4 4000MHz at around $450 CL15; I don't know if it's worth buying that, or waiting until maybe next year to change my mobo and CPU along with DDR5. Buying new memory for my system feels like a bit of a waste of money at this point.


----------



## alitayyab

Petrarca said:


> Hey guys. There is any list bad/good AIB cards? For now I know Zotac has bad/noisy fans, Palit pcbs are trash.


While the default bios is relatively noisier as compared to the rest of the flock, the quiet bios is really quiet. Also you can set your own fan curves.


----------



## KingEngineRevUp

changboy said:


> My memory is XPM 3200MHZ and i oc it at 3468MHZ, i think i cant go higher then this.


Your vram memory


----------



## Azazil1190

I think GRAbibus mentioned that the Corsair 16-pin cable reduces the VRAM OC.
Agreed; I tested yesterday on my Giga OC and I'm about 50MHz lower with the Corsair cable.
Core OC is the same as with the adapter.


----------



## KingEngineRevUp

Azazil1190 said:


> I think GRAbibus mention that the Corsair 16pin cable reduce the vram oc.
> Agree was tested yesterday on my giga oc .im about 50mhz lower with the Corsair cable.
> About core oc is the same like the adaptor.


Are you sure you're not just running your memory at a lower temperature? Like the day might have been colder when you tested?


----------



## J7SC

bmagnien said:


> Yeah def just a mora but the mora doesn’t come with a res, pump, 9 fans, and full enclosure all included for less than $500 shipped with taxes


...Watercool's MoRA comes with all kinds of configs, including dual D5 housing module on the rad, fans, grills etc...but not for $500 shipped  ...one day, I want to build a dual MoRA 420 setup, just because...











...nothing wrong with Bykski, though, if I do say so myself. First (so-so quality) pics after cleaning the dual mobo work-play build up a bit tonight. Both GPUs have Bykski blocks


----------



## changboy

KingEngineRevUp said:


> Your vram memory


My motherboard memory (ram)


----------



## Petrarca

lawson67 said:


> I can only talk for myself but the fans on my Zotac are not bad/noisy, Amp Extreme, and if they get that way I have a 5-year warranty





AvengedRobix said:


> My amp extreme is so quiet



I'm glad your cards are OK. But there are many reddit posts and youtube videos about Zotac fans being noisy (motor, bearing), especially at low rpm and during start/stop.

Also, do 24 power stages matter? Is it better to buy the basic Trinity, or the AMP for 100 euro more?


----------



## Azazil1190

KingEngineRevUp said:


> Are you sure you're not just running you're memory at a lower temperature? Like the day might be colder during the time you tested?


Ahmm maybe you are right.
Im 2-3c lower on memory but is this enough to make my memory unstable?


----------



## jootn2kx

Anyone has the Alphacool Eisblock Aurora Acryl waterblock? how are the temps with this one?


----------



## aijay

coelacanth said:


> Will a Seasonic Prime 850W Titanium (SSR-850TR) be enough for Zotac Trinity OC? CPU is a stock 5950X. This is for gaming, not going to OC the GPU. I have five SSDs, two mechanical HDs, and 7 fans as well.
> 
> Seems it should be OK but have read about transient spikes causing crashes with smaller PSUs.


I'm doing just fine running my water cooled Gigabyte 4090 Gaming OC on a Corsair HX850i.

1.1v 600w PL. All good so far. I'm on a 5800x3d though which doesn't use a lot of power.


----------



## pshiepler

DennyA said:


> I mean, if y'all are looking for _excuses_ to upgrade your CPU, just load Microsoft Flight Sim with the settings on Ultra and fly over NYC. Frame rate on my 10900K without DLSS/frame doubling turned on is pretty much identical with my 3080 10GB -- about 42 fps at 5,120x1440. (Of course, turning on DLAA/frame generation gives me 75-ish, so still happy with the card!) Totally CPU-bound, so it'll be interesting to see what going to a 13900K is going to do for MSFS.


Hi, I have just upgraded to a 4090 to improve my frames at 5120x1440p. My CPU is an AMD 5900X, so pretty similar to your 10900K. I can get more than 60fps wherever I fly. You wrote you get around 70 fps. Do you have an idea why my performance is lower?


----------



## Nizzen

pshiepler said:


> Hi, I have just upgraded to a 4090 to improve my frames at 5120x1440p. My cpu is an amd 5900x so pretty similar to your 10900k. I can get more than 60fps wherever I fly. You wrote you get around 70 fps. Do you have an idea why my performance is lower?


Intel cpu is faster in that game. Lower latency. Memory performance helps too in that game.

There are very few games where the 5900X is as fast as a 10900K. At least with tuned memory like 4700c17, sub-35ns 😁


----------



## ShadowYuna

Finally received Cablemod 12VHPWR Cable for my 4090.










With original cable

With Cablemod Cable









It looks way better and also makes cable management easier. Took me 1 month to complete the changeover to the 4090.


----------



## KingEngineRevUp

Azazil1190 said:


> Ahmm maybe you are right.
> Im 2-3c lower on memory but is this enough to make my memory unstable?


Well the air coolers this generation are very good. They seem to float around the range where the memory is too chilled, so 2-3C can make a difference.

You can always try lowering your fan power to warm the card up, heating your memory, and see if you can bench at those memory clocks at higher temperatures again.


----------



## KingEngineRevUp

changboy said:


> My motherboard memory (ram)


How are you not getting it? You need to OC your GPU memory more to break 29-30K in PR.


----------



## Nizzen

KingEngineRevUp said:


> How are you not getting it? You need to OC your GPU memory more to break 29-30K in PR.


PR loves unstable vram oc 🤠😆


----------



## simonabamber

Just got an EK block on my FE, GPU only loop with a 360gts rad.

2800MHZ core/+1650 on the mem
Temps are:
50 degree GPU
60 degree Hotspot
55 memory
with water temp at 32 degrees.

Does this sound right? I'm a first time builder.


----------



## Jay-G30

simonabamber said:


> Just got an EK block on my FE, GPU only loop with a 360gts rad.
> 
> 2800MHZ core/+1650 on the mem
> Temps are:
> 50 degree GPU
> 60 degree Hotspot
> 55 memory
> with water temp at 32 degrees.
> 
> Does this sound right? I'm a first time builder.


Looks about right depending on what wattage that was at ? 

What do you get if running furmark with the card stock at 4k ?

for reference i am using a EK ABP for the 4090 FE and in Furmark with the card stock i get :

GPU : 42 deg c
Hotspot : 49 deg c 
Memory : 28 deg c ( bad for memory oc! ) 
Water temp : 21 deg c
Load : 450w

In games at 350w delta drops to 13 deg over water temp .


----------



## ianann

Jay-G30 said:


> Looks about right depending on what wattage that was at ?
> 
> What do you get if running furmark with the card stock at 4k ?
> 
> for reference i am using a EK ABP for the 4090 FE and in Furmark with the card stock i get :
> 
> GPU : 42 deg c
> Hotspot : 49 deg c
> Memory : 28 deg c ( bad for memory oc! )
> Water temp : 21 deg c
> Load : 450w
> 
> In games at 350w delta drops to 13 deg over water temp .


That's some nice delta on the core/ hotspot there mate!


----------



## Jay-G30

ianann said:


> That's some nice delta on the core/ hotspot there mate!


I am using liquid metal on the die, which I think is helping out a lot. Rather than fill this thread up with pics, I have posted my temp data on the Overclockers UK forum if you want to see what I did






EK Active Backplate Waterblock install for 4090 FE


I posted a few pics of this in the main 4000 series thread but decided to do a separate thread in this section showing the process i took of installing the EK Active back plate on the 4090 FE , will include pics of the temps i achieved after doing do :) Setup to accommodate air cooled 4090...




forums.overclockers.co.uk


----------



## ianann

Jay-G30 said:


> I am using Liquid metal on the die which i think is helping out a lot , rather then me fill this thread up with pics i have posted my data on temps on Overclockers uk forum if you wanted to see what i did
> 
> 
> 
> 
> 
> 
> EK Active Backplate Waterblock install for 4090 FE
> 
> 
> I posted a few pics of this in the main 4000 series thread but decided to do a separate thread in this section showing the process i took of installing the EK Active back plate on the 4090 FE , will include pics of the temps i achieved after doing do :) Setup to accommodate air cooled 4090...
> 
> 
> 
> 
> forums.overclockers.co.uk


I‘ll do a repad on my FE today with the new Arctic TP-3, and this time I plan to leave the inductor coils unpadded. Hoping for even better contact on the die. Wish me luck. 😬


----------



## changboy

KingEngineRevUp said:


> How are you not getting it? You need to OC your GPU memory more to break 29-30K in PR.


My GPU memory is OCed at +1900; I'm talking about my motherboard memory, dude.
Also my PCIe is just 3.0. If I had at least 4.0, from that alone I could gain 500 points in Port Royal. It's around 2%.


----------



## Jay-G30

changboy said:


> My gpu memory is oc at +1900, i talk about my motherboard memory dude.
> Also my pcie is just 3.0, if i have at least 4.0 , just that i can gain 500 point in port royal. It's around 2%.


Push the memory higher until it artifacts , will gain you an extra 1000 points easy


----------



## changboy

Jay-G30 said:


> Push the memory higher until it artifacts , will gain you an extra 1000 points easy


What's your GPU memory OCed at? Mine is at +1900, can you beat that?


----------



## Jay-G30

changboy said:


> What your gpu memory oc at ? Mine is oc at +1900, can you beat that ?


Mine is currently +1450MHz now that it's watercooled (too cold); it was +1500 previously on air. But regardless of what mine is doing, I am saying that if you really do want an extra 1000 points, what you do is overclock your memory as high as it will go until it artifacts. That makes the fps go through the roof: the image on screen will be a mess, but the run will still pass and be classed as valid by PR. At the end you end up with 29k-30k without LN2 and with low core clocks  It is pretty easy to spot a genuine run from a non-genuine one; your 28000 looks correct and I would class that as a genuine run


----------



## KingEngineRevUp

simonabamber said:


> Just got an EK block on my FE, GPU only loop with a 360gts rad.
> 
> 2800MHZ core/+1650 on the mem
> Temps are:
> 50 degree GPU
> 60 degree Hotspot
> 55 memory
> with water temp at 32 degrees.
> 
> Does this sound right? I'm a first time builder.


Looks about right. Throw up a HWINFO64 screenshot.


----------



## KingEngineRevUp

changboy said:


> My gpu memory is oc at +1900, i talk about my motherboard memory dude.
> Also my pcie is just 3.0, if i have at least 4.0 , just that i can gain 500 point in port royal. It's around 2%.


That's odd then, because I'm only doing +1500 and my memory reports 1500, where yours reports 1433.









Result
www.3dmark.com


----------



## changboy

KingEngineRevUp said:


> That's odd then, because I'm only doing +1500 and my memory reports 1500, where yours reports 1433.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Result
> www.3dmark.com


Ya I see that, it's strange... maybe because I use the Aorus OC program, I don't know.

Why did this guy score 29k with his core clock lower than mine?








I scored 29 011 in Port Royal
Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## KingEngineRevUp

changboy said:


> Ya i see that, its strange....maybe coz i use Aorus oc program, i dont know.
> 
> Why this guy score 29k and is core lower then me ?
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 011 in Port Royal
> 
> 
> Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


Can you try MSI or EVGA? It doesn't seem like it's adding anything to your memory. According to my napkin math, you're only doing +964 on your memory.

[(1433*16) - 21000] /2

Something is off with these tools. I know most of us are using MSI Afterburner.
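For anyone wanting to check this on their own card, the napkin math above wraps up as a tiny script; the 21000 stock effective speed and the ×16 / ÷2 factors follow the formula in this post and assume a stock 21 Gbps 4090:

```python
STOCK_EFFECTIVE = 21000  # MT/s, assumed stock memory speed of a 4090

def offset_from_reported(reported_mhz):
    # (reported clock * 16 - stock effective) / 2, per the napkin math above
    return (reported_mhz * 16 - STOCK_EFFECTIVE) / 2

print(offset_from_reported(1433))  # the Aorus-OC'd card: +964
print(offset_from_reported(1500))  # an Afterburner +1500 reading back as +1500
```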


----------



## changboy

KingEngineRevUp said:


> Can you try MSI or EVGA? It doesn't seem like it's adding anything to your memory. According to my napkin math, you're only doing +964 on your memory.
> 
> [(1433*16) - 21000] /2


I will try later, maybe next week hahaha.


----------



## Jay-G30

changboy said:


> Ya i see that, its strange....maybe coz i use Aorus oc program, i dont know.
> 
> Why this guy score 29k and is core lower then me ?
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 011 in Port Royal
> 
> 
> Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


As I said to you above: artifacting


----------



## changboy

I never saw artifact on my 4090 hehehe.


----------



## Jay-G30

changboy said:


> I never saw artifact on my 4090 hehehe.


no but the runs at 29k at low core clocks very likely are 😂


----------



## Nizzen

changboy said:


> Ya i see that, its strange....maybe coz i use Aorus oc program, i dont know.
> 
> Why this guy score 29k and is core lower then me ?
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 011 in Port Royal
> 
> 
> Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


Don't care about PR score. Many bugged scores around 29k+


----------



## Nizzen

changboy said:


> I never saw artifact on my 4090 hehehe.


I'm turning off monitor while running PR. Never saw artifacts


----------



## KingEngineRevUp

Nizzen said:


> Don't care about PR score. Many bugged scores around 29k+


I have only ever used it to see if what I'm doing is actually improving my score in a worthwhile fashion. Then I try my best to match that while undervolted.


----------



## Roacoe717

Almost complete, need another 16 pin cable


----------



## Jay-G30

changboy said:


> Ya i see that, its strange....maybe coz i use Aorus oc program, i dont know.
> 
> Why this guy score 29k and is core lower then me ?
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 011 in Port Royal
> 
> 
> Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


A run I just did with a similar core clock to yours, also watercooled.

28299, no artifacting at all! As you can see, no way in hell a small memory bump will give you 1000+ points and still be genuine









I scored 28 299 in Port Royal
Intel Core i9-10900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
www.3dmark.com


----------



## KingEngineRevUp

https://www.tomshardware.com/news/fake-msi-afterburner-infects-targets-with-coin-miner-password-stealer


----------



## KingEngineRevUp

Jay-G30 said:


> I run i just did with similar core clock to yourself and also watercooled .
> 
> 28299 no Artifacting at all ! As you can see no way in hell a small memory bump will give you 1000 points + and claim to be genuine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 299 in Port Royal
> 
> 
> Intel Core i9-10900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


Some math to share. You have +1400ish on the memory, they have +950. The difference between your scores is about 200 points.

So indeed, no +1000.

Edit: So... If some people are reporting +2000, what if they are using another OC software and actually doing +1000?

3Dmark should report around 1562.5 MHz for +2000
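Running the same napkin math in reverse gives that 1562.5 figure; a sketch under the same assumptions (21000 MT/s stock effective, ×16 reported-clock multiplier):

```python
def reported_from_offset(offset_mhz, stock_effective=21000):
    # invert (reported * 16 - stock) / 2 = offset
    return (stock_effective + 2 * offset_mhz) / 16

print(reported_from_offset(2000))  # → 1562.5
```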


----------



## changboy

Ya i agree heheh.

Did you guys buy any games for Black Friday? I bought The Witcher 3; I played a copy a little some time ago, but it's getting a nice update with DLSS and other things, so it will be nice to play.


----------



## keikei

changboy said:


> Ya i agree heheh.
> 
> You guys did you bought some game for black friday ? I bought The Witcher 3, i played a copy a lil some times ago but it will have a nice update to this title with dlss and other things, it will be nice to play.


I bought mostly jrpgs. I already have TW3. Spiderman is on sale though; the PC version is outstanding. The latter two will test the 4090.


----------



## akgis

Whats the best BIOS for a suprim liquid X, to try to push PL past 520w?
thanks.


----------



## changboy

keikei said:


> I bought mostly jrpgs. I already have TW3. Spiderman is on sale though. The pc version is outstanding. The later 2 will test the 4090.


Here's a trailer of The Witcher with the new update:




I have the new Spiderman and DLSS 3.0 is amazing; my 3090 can't do half of what my 4090 does.


----------



## ianann

Sooooo, after the repad with Arctic TP-3 it seems like I got slightly better contact on the core, not too much but still... I ran the Port Royal stress test with 20 loops.

Left is new with Arctic TP-3 and without padding the inductor coils; right is old with Alphacool Rise Ultra Soft.

The 2mm pad which, per the manual, had to be placed on the back of the GPU didn't even have contact with the former mounting (no imprints at all). Same for the back of the MOSFETs. I went with 3mm TP-3 on the GPU back and 2mm on the MOSFETs' back now. The backplate gets noticeably warmer; maybe that even helped with the slight temp drop.

Guess I'll leave it at that.

Cheers!


(edit) Just saw I didn't run the pumps at the same flow rate. Stupid me.
Guess no major drops then. Going to retake the test right away.


----------



## mirkendargen

bmagnien said:


> Lol this is so tempting - someone walk me off the ledge:
> 
> 
> 
> View attachment 2584881


Once shipping is included it's not really any cheaper than a MORA, and a little smaller, less modular, etc. I actually just ordered a MORA 420 (bigger) with a Noctua AF20 adapter, wiring, dual D5 pump housing, and stupid overpriced fan grill and feet for 430eur shipped. Yeah once I get fans and especially pumps the total would be more....but I already have D5 pumps that are better than whatever single pump Bykski is putting in that. I wish they sold it without the pump/res/fans/etc as just a rad in a box for like $100, I'd jump all over that if shipping was reasonable.


----------



## Roacoe717

akgis said:


> Whats the best BIOS for a suprim liquid X, to try to push PL past 520w?
> thanks.


Middle of the page, I just flashed it myself and it works great.
link


----------



## KingEngineRevUp

The Sycom 4090. Looks like it'll cool the VRAM to the point you lose performance lol.

1. The tubes are too close to the IO bracket, which limits you to top mounting in many cases

2. Why cover up 1/3 of one of your fans with that shroud?

Looks like an interesting card that is built to handle 600W but will have a 450W BIOS slapped on it.


----------



## Hanks552

bykski just released the 4090 MSI waterblock


https://www.aliexpress.us/item/3256804808525250.html?spm=a2g0o.order_list.0.0.6d601802htCeLr&gatewayAdapt=glo2usa&_randl_shipto=US


----------



## changboy

Maybe when using Aorus Engine to OC the 4090, applying +1000MHz on memory means in reality just +250MHz.
I can't go higher than +1900MHz, so maybe that means +475MHz.

Past this I get a green screen at +2000MHz. Maybe my memory doesn't OC so well lol.
Not sure how it works for the memory; with my EVGA 3090 FTW3 Ultra I use Precision X. The memory OCs better on my EVGA, at a real +1600MHz.


----------



## Jay-G30

changboy said:


> Maybe if using Aorus engine to oc the 4090 if you apply + 1000mhz on memory thats mean in real just +250mhz.
> I cant go higher then +1900mhz, so maybe its mean +475mhz.
> 
> Pass this i have green screen at + 2000mhz. Maybe my memory dosent oc so well lol.
> Not sure how its working for the memory, with my evga 3090 fftw3 ultra i using precision x. The memory oc better on my evga at a real + 1600mhz.


Have you tried MSI Afterburner? I have never used Aorus Engine to OC, but I have used both MSI AB and EVGA Precision X1, and both have different methods (numbers) to OC the memory. Personally I prefer MSI. Give that a try and let us know what you can achieve


----------



## J7SC

Nizzen said:


> I'm turning off monitor while running PR. Never saw artifacts


...That reminds me of some older R290X subs I saw where the GPU would black-screen but continue to run and finish the bench ! Anyhow, this farce with 3DM and HWBot is going to get interesting when other cards show up that don't have ECC and/or a toggle for it...there will be accusations of corporate bias. Maybe it's time to write to some corporate sponsors and make clear how this reflects badly on them...

I mentioned before that what they ought to do is embed a little artefact checker (like the old ATITool) into SystemInfo - SystemInfo already takes forever anyway while snooping around my computers for non-critical information. Forcing ECC, on the other hand, automatically introduces extra work cycles, no matter what.

Below are two readouts of 29k+ and 30k+ PR runs... (top left, no issues), and the one over _1,200 score pts higher_ (bottom right, issues)...haven't subbed either; wouldn't sub the latter, debating whether to sub anything at all now at 3DM


----------



## WayWayUp

waterblock recommendations for the FE

alphacool vs EK

also does the ek with active backplate show any difference in thermals?


----------



## yzonker

Got the Fasgear cable. Seems like good quality as others have reported. Clicks in nice and tight on the card. No change otherwise except 0.1v less droop. Same core/mem OC as best I can tell (not surprised by that, really no reason for it to change).









Amazon.com: Fasgear PCIe 5.0 GPU Power Cable Sleeved 70cm | 16pin(12+4) 12VHPWR Connector for RTX 3090 Ti 4080 4090 | 3x8pin (4+4) CPU Male Plugs Only for Corsair/Great Wall/Thermaltake Modular Power Supply : Electronics
www.amazon.com


----------



## Mucho

WayWayUp said:


> waterblock recommendations for the FE
> 
> alphacool vs EK
> 
> also does the ek with active backplate show any difference in thermals?


Granzon Full Armor GPU Block For Nvidia RTX 4090 Founders Edition, Full Coverage Full Wrap Cooling Armor, Bykski Premium Sub-Brand High Quality Series GPU Water Cooling Cooler, GBN-RTX4090FE at formulamod sale


----------



## neteng101

Interesting day - upgraded BIOS on my MSI Z690-A Pro DDR4 board (v192 -> v190 release bios), Cablemod 12VHPWR cable came in (ordered 4 8-pin to 12VHPWR version), ran some quick tests and scored the highest runs I've done with the 4090 on 3DMark?! Was previously using the 3-way adapter that came with my Trio card.

This is just with my daily OC, on the SuprimX air 520w bios, +120 core, +1500 memory, no voltage increase. Wondering if the extra 4th power input somehow made a difference in scores, but a bunch of variables have changed since I last did runs, Nvidia driver version, BIOS on motherboard, etc.









I scored 18 104 in Time Spy Extreme


Intel Core i7-13700K Processor, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 11




www.3dmark.com













I scored 27 632 in Port Royal


Intel Core i7-13700K Processor, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 11




www.3dmark.com













I scored 10 777 in Speed Way


Intel Core i7-13700K Processor, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 11




www.3dmark.com





Cablemod installed, my old EVGA G3 soldiers on, haven't tried a max CPU+GPU OCCT load test recently, did hold up fine with the 12700k+3080Ti running the Galax bios though.


----------



## changboy

In MSI Afterburner I can't set the voltage, it's greyed out.


----------



## KingEngineRevUp

WayWayUp said:


> waterblock recommendations for the FE
> 
> alphacool vs EK
> 
> also does the ek with active backplate show any difference in thermals?


This question has been asked a few times, even asked 12 hours ago. Try to search.


----------



## KingEngineRevUp

changboy said:


> In MSI afterburner i cant set the voltage its grey out


Download the latest beta version 

Edit: Also, be careful. There's a fake one going around that installs malware and crypto mining software.


----------



## changboy

KingEngineRevUp said:


> Download the latest beta version
> 
> Edit: Also, be careful. There's a fake one going around that installs malware and crypto mining software.


lol i think i downloaded that one grrrrr.


----------



## bsch3r

KingEngineRevUp said:


> How are you not getting it? You need to OC your GPU memory more to break 29-30K in PR.


In 3DMark PR HOF the top ranks using watercooling or air run the video memory around 1500 MHz. I see no scores with mem speed of 1700+. So why should GPU memory speed be so important?


----------



## KingEngineRevUp

changboy said:


> lol i think i downloaded that one grrrrr.


Always grab the latest beta and official releases from the Guru3D website.


----------



## th3illusiveman

Still can't get over how much VRAM this card has lol. Almost as much as my RAM (16GB) + 3080 (10GB) VRAM combined, and all on one card!

I doubt any games will make use of it before the card's high-performance life cycle is through, though. The 5090 will probably actually benefit from it, with actual 8K gaming targets.


----------



## neteng101

changboy said:


> lol i think i downloaded that one grrrrr.


I did this with the latest beta version of MSI Afterburner, voltage slider unlocked...






GUIDE to enable "voltage control" in MSI Afterburner


INTRODUCTION If you've installed MSI Afterburner before (usually on a notebook) you'd remember it didn't have GPU voltage control. There are unsupported desktop GPU models that require certain workarounds to have voltage control as well working in MSI Afterburner. So this isn't just limited...



rog.asus.com


----------



## KingEngineRevUp

bsch3r said:


> In 3DMark PR HOF the top ranks using watercooling or air run the video memory around 1500 MHz. I see no scores with mem speed of 1700+. So why should GPU memory speed be so important?


On average, it yields more performance than OCing the core, though that doesn't hold for every card. There is some core offset that beats even a +2000 memory OC; what that number is isn't known, but it could be found by searching Time Spy and Port Royal results.

I'm on a cruise ship, so I can't easily look it up right now.


----------



## bsch3r

Wonder if one should leave the protective film on the memory pads to reduce heat transfer when mounting a waterblock. What an absurd situation with this gen memory.


----------



## BigMack70

Well, I think my GPU is not the issue. So at least my $$$$$$$ 4090 is OK. 

That said, I'm in the worst troubleshooting hell I've been in for 20 years. Got my Z790 Strix / DDR5 and it won't boot. Went and bought a second 13700k and it still won't boot. Same problems on iGPU as with the 4090.

If it weren't so dang expensive it would be funny. I guess I need to try another motherboard or RAM.

Sometimes I think I should just sell everything and buy an Xbox and delete my OCN account


----------



## yzonker

BigMack70 said:


> Well, I think my GPU is not the issue. So at least my $$$$$$$ 4090 is OK.
> 
> That said, I'm in the worst troubleshooting hell I've been in for 20 years. Got my Z790 Strix / DDR5 and it won't boot. Went and bought a second 13700k and it still won't boot. Same problems on iGPU as with the 4090.
> 
> If it weren't so dang expensive it would be funny. I guess I need to try another motherboard or RAM.
> 
> Sometimes I think I should just sell everything and buy an Xbox and delete my OCN account


Be a little more specific. What codes are you getting on the display? Memory training? CPU related?


----------



## BigMack70

yzonker said:


> Be a little more specific. What codes are you getting on the display? Memory training? CPU related?


My Asus z790 strix gives me a Q-code readout of 16 which is "pre-system agent initialization is started" and won't boot.

It does that on two different 13700k CPUs. With or without a discrete 4090 installed. On the newest BIOS. With or without any SSDs installed. With default BIOS settings or with increased VSSA/memory controller voltages. It does it with both one or two DDR5 modules installed and regardless of their slot positions.


----------



## motivman

bsch3r said:


> Wonder if one should leave the protective film on the memory pads to reduce heat transfer when mounting a waterblock. What an absurd situation with this gen memory.


No, just use a bad thermal pad, like the stock EKWB pads: 10C higher temps compared to using Gelid Extreme pads.


----------



## yzonker

BigMack70 said:


> My Asus z790 strix gives me a Q-code readout of 16 which is "pre-system agent initialization is started" and won't boot.
> 
> It does that on two different 13700k CPUs. With or without a discrete 4090 installed. On the newest BIOS. With or without any SSDs installed. With default BIOS settings or with increased VSSA/memory controller voltages. It does it with both one or two DDR5 modules installed and regardless of their slot positions.


Hold on. If it doesn't post, how did you try different voltages? Or is XMP just not working?


----------



## yzonker

motivman said:


> no just use a bad thermal pad, like the stock ekwb pads. 10c higher temps compared to using gelid extreme pads.


I considered doing this. Although I think it might help when benchmarking, it wouldn't be very effective for daily use. This is because the mem chips will still cold soak at idle. Then the mem will lose stability on the desktop and artifact and/or black screen.


----------



## cerealkeller

What's the best choice for a 600W BIOS for the MSI Gaming Trio? Are there instructions on here somewhere? I've done this before, but not since I had a 2080 Ti. Is there any risk of popping fuses like on the 3090s?


----------



## cerealkeller

neteng101 said:


> Finally got to comparing MSI bios options - Gaming Trio vs. Suprim X (Air) vs. 600w Suprim X Liquid... at 480w power limit. The liquid gave the highest score but X (Air) was just a few points so its margin of error they're basically the same - because the Gaming Trio is not factory overclocked I had to tweak the core and couldn't replicate the same OC as on the Suprim bioses. Hardware Unboxed tested 2 of the cards and found the Gaming Trio's bios is set to run fans much slower - I can confirm - the Gaming bios runs fans around 1500rpm, saw 2000rpm on the X Air... the 600w liquid bios doesn't run fans at the same speed, with Fan 1 going to 2200rpm, fan 2 a bit less.
> 
> Overall, unless you really want to push it, the 520w Suprim X (Air) bios seems to be the best option for a Gaming Trio - exceeding the Trio's own bios. Fan speeds on the Trio bios is way too slow. Unless you're going to tinker with voltage overclock @1.1V, you won't really need the 600w power limit either.


So you can confirm that both Suprim cards' BIOSes work on the Trio? My water block is coming soon, so I'd go with the 600W BIOS personally, without any fan concerns. Is there any concern with popping fuses like there was on the 3090s, that you're aware of?


----------



## KingEngineRevUp

motivman said:


> no just use a bad thermal pad, like the stock ekwb pads. 10c higher temps compared to using gelid extreme pads.


I was thinking we buy thermal pads off Digi-Key and get one with a 0.25 or 0.5 W/(m·K) rating lol
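Joking aside, plain 1-D conduction shows why pad conductivity swings VRAM heat flow so much. Here's a rough Python sketch; all numbers are illustrative assumptions (a ~14 mm x 12 mm GDDR6X package, a 1 mm pad, 20 C across it, 12 W/(m·K) for a quality pad vs 0.5 for a deliberately bad one), not measurements:

```python
# 1-D steady-state conduction through a thermal pad: Q = k * A * dT / t
# All numbers are illustrative, not measured.
def pad_heat_watts(k_w_per_mk, area_m2, delta_t_k, thickness_m):
    return k_w_per_mk * area_m2 / thickness_m * delta_t_k

area = 0.014 * 0.012   # one GDDR6X package, ~14 mm x 12 mm
t = 0.001              # 1 mm pad
dT = 20.0              # 20 K across the pad

good = pad_heat_watts(12.0, area, dT, t)   # quality pad
bad = pad_heat_watts(0.5, area, dT, t)     # deliberately lousy pad

print(f"good pad: {good:.1f} W, bad pad: {bad:.1f} W, ratio: {good / bad:.0f}x")
```

So a low-k pad (or leaving the film on) throttles heat flow roughly in proportion to conductivity; whether that's enough to keep GDDR6X warm at idle is another question.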


----------



## lolhaxz

What are people seeing for power draw? My core seems to be a hungry boy.. hard to get a good impression even reading many of the posts.

Doesn't really seem worthwhile (for my sample) going beyond 1000-1050mv

Some testing in Cyberpunk....

2min - Averages:
34FPS ~ 302W TBP @ 875mv - 2505MHz core (2426MHz effective)
36FPS ~ 327W TBP @ 900mv - 2565MHz core (2519MHz effective)
40FPS ~ 438W TBP @ 1000mv - 2880MHz core (2831MHz effective)
40FPS ~ 494W TBP @ 1050mv - 3000MHz core (2856MHz effective) --- I suspect MSVDD is too low here, needs higher voltage for better effective clocks.

4FPS for almost ~150W, yeah, nah.

+1433MHz Memory

Cyberpunk settings: no DLSS, 4K/RT all enabled, every single setting at max or psycho

Seems most GPUs are pulling a good 80+W less than mine, but otherwise an average sample, and it will do the job.
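To put those operating points on a common scale, a quick throwaway Python pass over the averages above makes the efficiency trade-off explicit (numbers copied straight from the post; "FPS per kW" is just fps divided by board power):

```python
# Efficiency of each measured operating point: (FPS, total board power in W),
# values taken from the Cyberpunk averages quoted above.
points = {
    "875mv":  (34, 302),
    "900mv":  (36, 327),
    "1000mv": (40, 438),
    "1050mv": (40, 494),
}

for label, (fps, watts) in points.items():
    print(f"{label}: {fps / watts * 1000:.1f} FPS per kW")
```

The last step (1000mv to 1050mv) buys no average FPS at all for +56 W, which is the "yeah, nah" in numbers.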


----------



## J7SC

yzonker said:


> I considered doing this. Although I think it might help when benchmarking, it wouldn't be very effective for daily use. This is because the mem chips will still cold soak at idle. Then the mem will lose stability on the desktop and artifact and/or black screen.


...I wonder whether this VRAM problem isn't really a problem unless one's life centers solely on benching...24 Gbps (my setting) is pretty good for most things I do. FYI, to get higher VRAM clocks for benching, I run a few cycles of memtest-vulkan right before. I can also get higher & stable VRAM clocks when running the Win desktop resolution at 4096x2160 @ 120 Hz with HDR, though that can obviously impact other parts of the score, depending on the specific bench. Finally, I already mentioned that I have a heater element from an old LN2 pot that likely fits - a project for the late winter, perhaps. That would have the advantage of being able to turn the heat on/off and/or regulate it via a potentiometer.



lolhaxz said:


> What are people seeing for power draw? My core seems to be a hungry boy.. hard to get a good impression even reading many of the posts.
> 
> Doesn't really seem worthwhile (for my sample) going beyond 1000-1050mv
> 
> Some testing in Cyberpunk....
> 
> View attachment 2585071
> 
> 
> View attachment 2585072
> 
> 
> View attachment 2585073
> 
> 
> View attachment 2585074
> 
> 
> 2min - Averages:
> 34FPS ~ 302W TBP @ 875mv - 2505MHz core (2426MHz effective)
> 36FPS ~ 327W TBP @ 900mv - 2565MHz core (2519MHz effective)
> 40FPS ~ 438W TBP @ 1000mv - 2880MHz core (2831MHz effective)
> 40FPS ~ 494W TBP @ 1050mv - 3000MHz core (2856MHz effective) --- I suspect MSVDD is too low here, needs higher voltage for better effective clocks.
> 
> 4FPS for almost ~150W, yeah, nah.
> 
> +1433MHz Memory
> 
> Cyberpunk Settings: no DLSS, 4K/RT all enabled, every single setting at max or physco
> 
> Seems most GPU's are pulling a good 80+W less than mine, but otherwise, average and will do the job


For gaming, these 4090s are extremely efficient...for example, in FS2020 / DLSS3 + F.I., I leave the GPU on default and still am blown away by the fps. That typically translates to around 380 W - 390 W.


----------



## neteng101

cerealkeller said:


> So you can confirm that both of the Suprim cards BIOS works on the Trio?


Yup - the liquid bios works but fans don't run at same speed, the air Suprim bios runs fans at same speed. And the Trio bios is junk cause it runs the fans super slow.

Edit - once you have your water block you won't care about fan speeds so yeah just go for the 600w bios.


----------



## changboy

With the right Afterburner it's working lol. I can put my memory at +950 for 1431 MHz in GPU-Z; with Aorus Engine it was 1413 MHz, so not a big step.
I played Cyberpunk for 2 hours with my memory at +950 MHz and all was fine. The core at 3040 MHz.


----------



## KingEngineRevUp

J7SC said:


> ...I wonder whether this VRAM problem isn't really a problem unless one's life centers solely on benching...24 Gbps (my setting) is pretty good for most things I do. FYI, to get higher VRAM clocks for benching, I run a few cycles of memtest-vulkan right before. I can also get higher & stable VRAM clocks when running the Win desktop resolution at 4096x2160 @ 120 Hz with HDR, though that can obviously impact other parts of the score, depending on the specific bench. Finally, I already mentioned that I have a heater element from an old LN2 pot that likely fits - a project for the late winter, perhaps. That would have the advantage of being able to turn the heat on/off and/or regulate it via a potentiometer.
> 
> 
> 
> For gaming, these 4090s are extremely efficient...for example, in FS2020 / DLSS3 + F.I., I leave the GPU on default and still am blown away by the fps. That typically translates to around 380 W - 390 W.


It's not a problem, it runs within spec at the temperature ranges that were promised.

It's only a problem for a *turbo nerd*


----------



## Arizor

KingEngineRevUp said:


> It's not a problem, it runs within spec at the temperature ranges that were promised.
> 
> It's only a problem for a *turbo nerd*


----------



## UndaDawg

Own a Zotac 4090 AMP Airo. Was getting +220 on core, +1500 on memory, voltage slider, power and temp all the way up. GPU crushed anything I threw at it. That is, until I went from a 3-monitor setup to a 2-monitor setup. I don't know what happened, but now the memory overclock won't go past +200. I tried new cables, rolling back the driver, reinstalling Windows 10, using just my original monitor by itself, and switching between the 3 different DP ports. I took the card out, unplugged MB power, removed the battery for 1/2 hr, then reinstalled. Flashed the MB with a new BIOS, flashed the GPU with a new BIOS. Still won't go past +200. GPU runs off a 12VHPWR cable. Does anyone have any idea what caused this, and can it be fixed?

FYI Old monitor setup: 2x 1080p (1 at 144Hz, 1 at 240Hz) and 1x 1440p at 240Hz
New monitor setup: 2x 1440p at 240Hz
12900K
Corsair DDR5 5600
Asus Z690 Gaming-E
1000W MSI MPG-A ATX 3.0


----------



## UndaDawg

Here is my crash report from OCCT if someone can help me decipher the problem that would be appreciated.


----------



## Nd4spdvn

lolhaxz said:


> What are people seeing for power draw? My core seems to be a hungry boy.. hard to get a good impression even reading many of the posts.
> 
> Doesn't really seem worthwhile (for my sample) going beyond 1000-1050mv
> 
> Some testing in Cyberpunk....


I'm seeing between 470-490W in Cyberpunk on my 4090 Suprim X, air-cooled. This is at 3030-3045 MHz core (effective in the range of 2840-2920 MHz) and 1.1V, and it's hitting both the voltage and power limiters with the official 520W BIOS.

Interesting piece of software you use, Ampere Tools v0.1, I had not heard of it. Can you please share more details and where to get it?


----------



## ianann

So when I fire up OCCT or Furmark maxed out at 133% with a power draw of about 600W, the GPU runs at a deltaT of about 26K, with the hotspot hovering another 13K on top of that. Memory stays at a delta of a nearly chilling 8K. My best guess is it's just the block, which isn't able to pull ~600W of heat off the core quickly enough.

Do you guys think that's fine at about 140 l/h flow rate, or too high? OCCT, though, is seriously nothing near a "normal" bench / real-world workload.
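For what it's worth, a back-of-the-envelope check suggests 140 l/h is plenty: even assuming plain water and that the full 600 W is dumped into the loop (both assumptions), the coolant itself only warms a few kelvin crossing the block, so the 26 K core delta is dominated by the die-to-block interface, not the flow:

```python
# Coolant temperature rise across the block: dT = P / (m_dot * c_p)
# Assumes plain water and that all 600 W ends up in the loop.
P = 600.0                       # heat load, W
flow_l_per_h = 140.0
c_p = 4186.0                    # specific heat of water, J/(kg*K)

m_dot = flow_l_per_h / 3600.0   # kg/s (1 l of water ~ 1 kg)
dT = P / (m_dot * c_p)

print(f"water warms ~{dT:.1f} K across the block at {flow_l_per_h:.0f} l/h")
```

A ~3.7 K water-side rise against a 26 K core delta means more flow would barely move the core temperature.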


----------



## J7SC

ianann said:


> So when I fire up OCCT or Furmark with max Setting on 133% and a Power draw of about 600W, GPU runs in a deltaT of about 26K with the Hotspot hovering at a plus of 13K on top of that. Memory stays at a delta of a nearly chilling 8K. My best guess is, its just the block which isn't able to pull the heat of about 600W from the core quickly enough.
> 
> Do you guys think that's fine at about 140 l/h flow rate or to high? OCCT though is seriously nothing near "normal" bench/ real-world workload.


...forgot about OCCT's GPU and VRAM tests - wow, first time I have seen 590 W on my water-cooled Giga-G-OC, and that is obviously w/o power for fans and RGB. With that came the highest temps I have ever seen: GPU at +23 C over ambient, hotspot at +33 C over ambient, and VRAM at only +10 C over ambient. Peak GPU was 3135 MHz, VRAM at 1502, max PL and voltage on the sliders.

In a typical bench scenario such as 3DM PR, deltas are much lower apart from VRAM which (thankfully) gets into the mid 40 C (absolute, not delta). I think your cooling system is fine and I wouldn't worry about it. FYI, I am using 2x D5s (75% speed setting) and 1320x63 rads w/push+pull Arctic P12s.


----------



## Krzych04650

lolhaxz said:


> 2min - Averages:
> 34FPS ~ 302W TBP @ 875mv - 2505MHz core (2426MHz effective)
> 36FPS ~ 327W TBP @ 900mv - 2565MHz core (2519MHz effective)
> 40FPS ~ 438W TBP @ 1000mv - 2880MHz core (2831MHz effective)
> 40FPS ~ 494W TBP @ 1050mv - 3000MHz core (2856MHz effective) --- I suspect MSVDD is too low here, needs higher voltage for better effective clocks.
> 
> 4FPS for almost ~150W, yeah, nah.


This is a perfect example of why talking about the difference in FPS instead of percentage is so silly. The 4 FPS difference in this case is 11%. For a more realistic game scenario, it is like going from 100 FPS to 90. Also, the power draw difference is 111W, not 150, and the 875mv setting that draws 136W less is a massive 15% slower, so you would have 85 FPS instead of 100. It is basically going one tier down at this point, even if the efficiency is improved.

Full 1100mv should also be able to reach 2900+ effective clocks and maybe over 41 FPS in your test, but Cyberpunk really hits the clocks and power massively, so it may not be achievable, especially on air. That is where the real diminishing returns hit and we basically only do it to have 3000 MHz on the OSD, but the difference between a reasonable tune like that 1050mv setting and a 350W UV is still substantial, around 10%.
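For anyone who wants to redo the arithmetic, the relative numbers above fall straight out of the measured FPS/power values (copied from the earlier post), using the 1000mv point as the 100% baseline:

```python
# Relative performance of the measured voltage points, with the
# 1000mv result (40 FPS, 438 W) as the 100% baseline.
baseline_fps, baseline_w = 40, 438

points = {"875mv": (34, 302), "900mv": (36, 327), "1050mv": (40, 494)}

for label, (fps, watts) in points.items():
    rel = fps / baseline_fps * 100      # percent of baseline FPS
    dw = watts - baseline_w             # watts vs baseline
    print(f"{label}: {rel:.0f}% of baseline FPS, {dw:+d} W vs baseline")
```

That reproduces the 90%/-111 W and 85%/-136 W figures quoted above; the "4 FPS" headline hides the 10-15% swing.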


----------



## yzonker

J7SC said:


> ...I wonder whether this VRAM problem isn't really a problem unless one's life centers solely on benching...24 Gbps (my setting) is pretty good for most things I do. FYI, to get higher VRAM clocks for benching, I run a few cycles of memtest-vulkan right before. I can also get higher & stable VRAM clocks when running the Win desktop resolution at 4096x2160 @ 120 Hz with HDR, though that can obviously impact other parts of the score, depending on the specific bench. Finally, I already mentioned that I have a heater element from an old LN2 pot that likely fits - a project for the late winter, perhaps. That would have the advantage of being able to turn the heat on/off and/or regulate it via a potentiometer.
> 
> 
> 
> For gaming, these 4090s are extremely efficient...for example, in FS2020 / DLSS3 + F.I., I leave the GPU on default and still am blown away by the fps. That typically translates to around 380 W - 390 W.


Ha, that is what I started out with (running mem test right before a run). Using a heater on the backplate works a bit better, but is limited to 8-10C in actual VRAM temp increase since it's going through the backplate, relatively thick pads, and the PCB.


----------



## ianann

J7SC said:


> ...forgot about OCCT's GPU and VRAM tests - wow, first time I have seen 590 W on my water-cooled Giga-G-OC, and that is obviously w/o power for fans and RGB. With that came the highest temps I have ever seen; GPU and +23 C over ambient, Hotspot at +33 C over ambient, and VRAM at only +10 C over ambient. Peak GPU was 3135 MHz, VRAM at 1502, max PL and voltage on the sliders.
> 
> In a typical bench scenario such as 3DM PR, deltas are much lower apart from VRAM which (thankfully) gets into the mid 40 C (absolute, not delta). I think your cooling system is fine and I wouldn't worry about it. FYI, I am using 2x D5s (75% speed setting) and 1320x63 rads w/push+pull Arctic P12s.


That's what I was thinking, thanks for your confirmation/opinion and results. The workload OCCT puts on the GPU is utterly insane. Guess if you really want to check for stability on the GPU and PSU, that's the way to go. _Inductor coils go brrrrrr_


----------



## FIIZiK_

Anyone tried flashing the BIOS of a PNY 4090 XLR8 (non-OC)? I've got it on water but it's limited to 450W. (It's not my PSU; I had the FE from a friend running at the full 600W.)


----------



## Jay-G30

FIIZiK_ said:


> Anyone tried flashing the bios of a PNY 4090 XLR8 (non oc). I got it on water but is limited at 450w. (it's not my PSU, I had the FE from a friend running at full 600w)


You will need to flash another BIOS onto the card to increase it over 450W. I can't see a PNY one yet, but I have used a Gigabyte 600W BIOS on a Zotac Trinity OC I had previously and it worked perfectly:









TechPowerUp


Extensive repository of graphics card BIOS image files. Our database covers submissions categorized by GPU vendor, type, and board partner variant.




www.techpowerup.com


----------



## neteng101

ianann said:


> Workload OCCT puts on the GPU is utterly insane. Guess if you really want to check for stability on GPU and PSU, that's the way to go. _Inductor Coils go brrrrrr_


Try the Power test in OCCT - my UPS instantly complains it can't deal with the load. I wouldn't run a burn in test, but a quick test will let you know if your PSU/system can sustain a crazy peak for a period of time. Seeing peak ~516W on the SuprimX bios I'm running (520W TDP) and ~320W on the 13700k. I'm literally scared to run that as a test given my PSU, but those Super Flower units built for EVGA are stout.


----------



## KingEngineRevUp

ianann said:


> So when I fire up OCCT or Furmark with max Setting on 133% and a Power draw of about 600W, GPU runs in a deltaT of about 26K with the Hotspot hovering at a plus of 13K on top of that. Memory stays at a delta of a nearly chilling 8K. My best guess is, its just the block which isn't able to pull the heat of about 600W from the core quickly enough.
> 
> Do you guys think that's fine at about 140 l/h flow rate or to high? OCCT though is seriously nothing near "normal" bench/ real-world workload.


It's not just that.

Previously, the 3090 would put only about 50-60% of its power draw into the die.

The 4090 puts about 80-90% of its power draw into the die.

Multiply your power draw by 1.6; that's roughly what a 3090 would be running at.

So at 600W, it's like running a 3090 at 960W.

So it's not a surprise the delta is that high. If you had a 16C delta at 600W on a 3090, it'll now be a 25.6C delta on a 4090 at 600W, which mathematically seems reasonable.
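The x1.6 rule of thumb above can be sanity-checked from the quoted fractions. A quick sketch (treating the 0.85 and 0.53 die fractions as assumptions picked from the 80-90% and 50-60% ranges, not measured values):

```python
# How much 3090 board power would push the same heat into its die
# as a 600 W 4090 does. Die fractions are assumptions within the
# ranges quoted above, not measurements.
ADA_DIE_FRACTION = 0.85      # ~80-90% of board power reaches the 4090 die
AMPERE_DIE_FRACTION = 0.53   # ~50-60% on the 3090

board_w = 600.0
die_w = board_w * ADA_DIE_FRACTION             # ~510 W into the 4090 die
equiv_3090_board_w = die_w / AMPERE_DIE_FRACTION

print(f"{board_w:.0f} W 4090 ~ {equiv_3090_board_w:.0f} W 3090 "
      f"(factor {equiv_3090_board_w / board_w:.2f})")
```

Pick 0.55 instead of 0.53 for the 3090 and the factor drops to ~1.55, so treat 1.6 as a rough range rather than a constant.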


----------



## FIIZiK_

Jay-G30 said:


> You will need to download another bios for the card to increase it over 450w , i can't see any PNY one yet but have used a Gigabyte 600w bios on a Zotac Trinity OC i had previously and worked perfectly
> 
> 
> 
> 
> 
> 
> 
> 
> 
> TechPowerUp
> 
> 
> Extensive repository of graphics card BIOS image files. Our database covers submissions categorized by GPU vendor, type, and board partner variant.
> 
> 
> 
> 
> www.techpowerup.com


I know I can flash another bios to unlock full 600w, I just didn't want to brick the GPU (especially since the loop is fully finished and the gpu is waterblocked).
That's why I was asking for the PNY OC bios, since they have the same PCB.

Has anyone tried any bios on the PNY non-oc XLR8 GPU?

EDIT: Apparently I cannot even back up the current one:


----------



## Jay-G30

FIIZiK_ said:


> I know I can flash another bios to unlock full 600w, I just didn't want to brick the GPU (especially since the loop is fully finished and the gpu is waterblocked).
> That's why I was asking for the PNY OC bios, since they have the same PCB.
> 
> Has anyone tried any bios on the PNY non-oc XLR8 GPU?


Unless the PC shuts down during the BIOS flash, I can't see you having any issues flashing another 4090 AIB BIOS onto your card, especially considering it is under a waterblock and you won't have to worry about the fans playing up due to different fan firmware, etc.


----------



## FIIZiK_

Jay-G30 said:


> unless the pc shut down during a bios flash i cant see you having any issues flashing another 4090 aib bios on to your card especially considering it is in a waterblock and will not have to worry about the fans playing up due to different fan firmware etc


So that means that any AIB bios would work? Like the STRIX one?


----------



## Jay-G30

FIIZiK_ said:


> So that means that any AIB bios would work? Like the STRIX one?


Correct, although with the Strix BIOS you lose one of your DisplayPorts, as the Strix comes with 2 HDMI and 3 DisplayPorts. A Gigabyte one will work just as well and won't lose a DisplayPort. You need the latest version of GPU-Z to save your BIOS that way, or you can just use NVFlash to do the same thing.


----------



## FIIZiK_

Jay-G30 said:


> Correct although with the strix bios you lose one of your display ports as the strix comes with 2 hdmi and 3 display ports . Gigabyte one will work just as well and wont lose display ports . You need the latest version of GPU-Z to save your bios that way or you can just use NVflash to do the same thing


Thanks for your help.
And last question, with the 4090fe I was drawing around 500-520watts with 125% power limit, 2950mhz core and 12300mhz vram.
With the pny, with 3050mhz core and 11500mhz vram I am drawing circa 420-430watts.

There is not much info online about the PNY power delivery system, but do you think with a bios flash and then more hz on the vram, it could support an extra 100watts bringing it to the level of the FE (in terms of power consumption)?


----------



## Jay-G30

FIIZiK_ said:


> Thanks for your help.
> And last question, with the 4090fe I was drawing around 500-520watts with 125% power limit, 2950mhz core and 12300mhz vram.
> With the pny, with 3050mhz core and 11500mhz vram I am drawing circa 420-430watts.
> 
> There is not much info online about the PNY power delivery system, but do you think with a bios flash and then more hz on the vram, it could support an extra 100watts bringing it to the level of the FE (in terms of power consumption)?


A BIOS flash will allow the card to exceed the 450W it has now; it may give you the ability to push the core clocks a little higher, but it won't affect the amount you can overclock the VRAM. You say the PNY card is watercooled? If so, it probably won't reach as high a memory overclock, due to the VRAM being colder, which on these cards leads to instability at higher mem OC.


----------



## AKBrian

FIIZiK_ said:


> I know I can flash another bios to unlock full 600w, I just didn't want to brick the GPU (especially since the loop is fully finished and the gpu is waterblocked).
> That's why I was asking for the PNY OC bios, since they have the same PCB.
> 
> Has anyone tried any bios on the PNY non-oc XLR8 GPU?
> 
> EDIT: Apparently I cannot even backup the current one:
> View attachment 2585186


You'll need to update GPU-Z in order to save and submit 4090/4080 VBIOS files. It was added in 2.51.0.


----------



## FIIZiK_

AKBrian said:


> You'll need to update GPU-Z in order to save and submit 4090/4080 VBIOS files. It was added in 2.51.0.


Yes, updated it and submitted my official pny bios.



Jay-G30 said:


> A bios flash will allow the card to exceed 450w it has now , it may give you the ability to push the core clocks a little higher but wont effect the amount you can overclock the vram . you say the pny card is watercooled ? if so it probably wont reach as high on a memory overclock due to the vram being colder which leads to instability at higher higher mem oc


Yes, I don't expect the memory to get a higher oc because of the bios. I'm doing the bios for the power limit unlock.

The question was more like: if the VRAM can go higher in terms of frequency, it will need more power (on the FE, an extra 800 MHz increases power consumption by 80W-ish). Do you think my VRMs/power delivery will be fine?

The PNY has worse power delivery/VRMs than the 4090 FE.


----------



## ianann

KingEngineRevUp said:


> It's not just that.
> 
> Previously, the 3090 would draw only about 50-60% of it's power draw into the die.
> 
> The 4090 draws about 80-90% of it's power draw into the die.


I know this has been around for a while; what is the reason for that? Where did the other 40-50% go on the 3090?


----------



## Jay-G30

FIIZiK_ said:


> Yes, updated it and submitted my official pny bios.
> 
> 
> 
> Yes, I don't expect the memory to get a higher oc because of the bios. I'm doing the bios for the power limit unlock.
> 
> The question was more like "if the vram can get higher in terms of frequency, it will need more power (on the FE, 800mhz extra increases power consumption by 80watts ish), do you think my vrms/power delivery will be fine?"
> 
> The pny has worse power delivery/vrm than the 4090 fe.


If the card is watercooled and the block covers the VRMs etc., then yes, the card will be fine with a 600W BIOS. If it is on air it would still be fine, just hotter. Once all is said and done, you will see very, very little gain going from 450W to 600W unless you are power limited, and even then it will only be a few fps, give or take.


----------



## Sheyster

Just wanted to share my experience with the ASUS V2 Strix BIOS (500w default PL) in a gaming session. For reference, I was previously using the default GB-G-OC performance BIOS. I had a feeling I needed a little more headroom than the default GB 450w power limit would allow, but I wanted something that I could just "power on and go", so the ASUS Strix BIOS made sense due to the increased default PL.

In a 2 hour combined MW2 ground war and WZ2 session in 4K, it hit 98% power limit (491w). I used the default core, mem and fan settings, GPU temp maxed out at 64C.


----------



## mirkendargen

ianann said:


> I know this has been around for a while; what is the reason for that? Where did the other 40-50% go on the 3090?


The first gen of GDDR6X was way less power efficient, it alone took like 100W at load.


----------



## J7SC

Sheyster said:


> Just wanted to share my experience with the ASUS V2 Strix BIOS (500w default PL) in a gaming session. For reference, I was previously using the default GB-G-OC performance BIOS. I had a feeling I needed a little more headroom than the default GB 450w power limit would allow, but I wanted something that I could just "power on and go", so the ASUS Strix BIOS made sense due to the increased default PL.
> 
> In a 2 hour combined MW2 ground war and WZ2 session in 4K, it hit 98% power limit (491w). I used the default core, mem and fan settings, GPU temp maxed out at 64C.


 ...have you tried that Asus V2 Strix bios for a bit of benching ? Specifically, how do the VRAM results compare to the Gigabyte G-OC ? Some vendors trade a bit of bandwidth for tighter timings.


----------



## motivman

Update on my 4090 Gigabyte OC after repaste with TFX on the core and Gelid Extreme on the memory, big improvement from before with hotspot temps (98C to 86C), but I still think hotspot should be about 5C to 6C cooler. Anyways, I will just wait for EK waterblock at this point.


----------



## ianann

mirkendargen said:


> The first gen of GDDR6X was way less power efficient, it alone took like 100W at load.


Ah I see, thank you! 
Do you have any links regarding that topic?


----------



## KingEngineRevUp

ianann said:


> I know this has been around for a while, what is the reason for that? Where did the 50-40% go on the 3090?


I actually don't have a clue. That's beyond my expertise. I would like to know myself.


----------



## neteng101

motivman said:


> big improvement from before with hotspot temps (98C to 86C)


98C is crazy, hotspot in the 80s is the most I've seen most people report.


----------



## Petrarca

FIIZiK_ said:


> Thanks for your help.
> And last question, with the 4090fe I was drawing around 500-520watts with 125% power limit, 2950mhz core and 12300mhz vram.
> With the pny, with 3050mhz core and 11500mhz vram I am drawing circa 420-430watts.
> 
> There is not much info online about the PNY power delivery system, but do you think with a bios flash and then more hz on the vram, it could support an extra 100watts bringing it to the level of the FE (in terms of power consumption)?


PNY has one of the worst VRMs: 14+3 phases at 50a each.


----------



## bsch3r

what and where is this hot spot exactly? A temp probe located somewhere on the pcb or the hottest spot inside the gpu die?


----------



## yzonker

bsch3r said:


> what and where is this hot spot exactly? A temp probe located somewhere on the pcb or the hottest spot inside the gpu die?


Hottest sensor in the die and possibly the VRM (30 series included the VRM).


----------



## Sheyster

J7SC said:


> ...have you tried that Asus V2 Strix bios for a bit of benching ? Specifically, how do the VRAM results compare to the Gigabyte G-OC ? Some vendors trade a bit of bandwidth for tighter timings.


Not yet, I've been too busy having fun in various games!  I will test again soon.


----------



## WayWayUp

Just got my 4090 Founders the other day. Super happy with the results.
I was very worried about my bin since I was coming from a 3090 that was a super ultra golden bin. Its memory was weak at +1100, but the core often reached and held 2295 constant in games on water, which is unheard of.

I know I'll never get that lucky with a bin again, as that card was like 1 in 100k, but this one turned out good, definitely above average, and at least the memory doesn't suck!

Now I need a block... waiting until January though, in case of a 4090 Ti announcement. Tested my luck in Fire Strike Ultra as well; I think I can crack the top 10.


----------



## WayWayUp

I just checked and I'm running in PCIe 4.0 x8 mode.

I'm going to take the M.2 SSD out and move it to a DIMM.2 Gen 4 card.

The good news, at least, is that I should gain a few hundred points in benchmarks and another 1-3% in games when I'm done.
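As an aside, you can confirm the link state without rebooting into the BIOS: `nvidia-smi` can report the current PCIe generation and width. A minimal sketch (the query flags are real; the fallback value is only an illustrative stand-in for a machine without an NVIDIA driver):

```shell
# Check the GPU's current PCIe link; an M.2 drive sharing lanes can drop a
# Gen4 x16 slot to x8. Falls back to a sample value if nvidia-smi is absent.
if ! link=$(nvidia-smi --query-gpu=pcie.link.gen.current,pcie.link.width.current \
      --format=csv,noheader 2>/dev/null) || [ -z "$link" ]; then
  link="4, 8"   # stand-in sample: Gen4 at x8, i.e. half the slot's bandwidth
fi
gen=${link%%,*}                         # text before the comma
width=$(echo "${link#*,}" | tr -d ' ')  # text after the comma, spaces stripped
echo "PCIe Gen${gen} x${width}"
```

Note the link may also downtrain to Gen1 at idle; run the query under load for a meaningful reading.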


----------



## J7SC

Sheyster said:


> Not yet, I've been too busy having fun in various games!  I will test again soon.


...I know what you mean - I got this 4090 primarily for my fav games and sims. Just had an hour with FS2020 4K HDR Ultra max everything / DLSS3 / F.I. ...the GPU easily reaches 3 giggles at 1.050 V (around 400 W in FS2020 w/ my settings in spoiler), though it doesn't really need any oc for it. GSync / OLED and DLSS3/F.I. continues to blow my mind...









FS2020 game clocks



Spoiler



~25 C ambient


----------



## cerealkeller

neteng101 said:


> Yup - the liquid bios works but fans don't run at same speed, the air Suprim bios runs fans at same speed. And the Trio bios is junk cause it runs the fans super slow.
> 
> Edit - once you have your water block you won't care about fan speeds so yeah just go for the 600w bios.


Thanks for the assist. Flashed the Suprim X BIOS while I wait for my block, and everything went well. It's crazy how much juice this thing pulls; playing God of War 2018 at 8K, it maxes it out like nothing.


----------



## ESRCJ

Does anyone know which parameters need to be changed to force resizable BAR in Nvidia Profile Inspector on the 4090 currently? I noticed the same settings are no longer present in Nvidia Profile Inspector.


----------



## yzonker

ESRCJ said:


> Does anyone know which parameters need to be changed to force resizable BAR in Nvidia Profile Inspector on the 4090 currently? I noticed the same settings are no longer present in Nvidia Profile Inspector.


I think in the newest version they intend for you to use these settings in the Common section,


----------



## bmagnien

Is it worth swapping the stock bios on the gigaoc to the strix bios? Is there a meaningful improvement in performance? Perhaps from tighter vram timings? And if so - which exact strix bios? Thanks


----------



## newls1

yzonker said:


> I think in the newest version they intend for you to use these settings in the Common section,
> 
> View attachment 2585454


is it possible to force rebar globally?


----------



## yzonker

newls1 said:


> is it possible to force rebar globally?


The screenshot I showed has the same 3 settings I think we used in the older versions. Just set them like below,
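Profile Inspector settings aside, a quick way to verify that Resizable BAR actually took effect is to look at the BAR1 size the driver reports: with ReBAR working on a 4090 it should span the full 24 GiB of VRAM rather than the legacy 256 MiB window. A hedged sketch (the `nvidia-smi` query is real; the fallback number is an illustrative stand-in):

```shell
# Report the total BAR1 aperture in MiB; 256 means ReBAR is off, while a
# value matching the full VRAM size means it is active.
if ! bar1=$(nvidia-smi -q -d MEMORY 2>/dev/null \
      | awk '/BAR1 Memory Usage/{f=1} f && /Total/{print $(NF-1); exit}') \
    || [ -z "$bar1" ]; then
  bar1=24576   # MiB; illustrative stand-in for a ReBAR-enabled 4090
fi
echo "BAR1 total: ${bar1} MiB"
```

This only confirms the platform-level setting; the per-app forcing in Profile Inspector still decides whether a given game uses it.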


----------



## newls1

yzonker said:


> The screenshot I showed has the same 3 settings I think we used in the older versions. Just set them like below,
> 
> View attachment 2585490


I'm not very familiar with that app; I've only used it to force rebar in 3DMark, which I learned in this thread. So are the steps you describe for forcing rebar on a per-app basis, or is it a global thing?


----------



## Panchovix

bmagnien said:


> Is it worth swapping the stock bios on the gigaoc to the strix bios? Is there a meaningful improvement in performance? Perhaps from tighter vram timings? And if so - which exact strix bios? Thanks


Honestly not sure; maybe ASUS has tighter timings in their VBIOS? We can't know atm without an "Ada BIOS Editor" (with editing disabled of course, but one that lets you see the info), like ABE (Ampere BIOS Editor) or TBE (Turing BIOS Editor). For now the only way is to test the same speed with different vbios and see which is faster lol


----------



## J7SC

bmagnien said:


> Is it worth swapping the stock bios on the gigaoc to the strix bios? Is there a meaningful improvement in performance? Perhaps from tighter vram timings? And if so - which exact strix bios? Thanks


...wondered about the same thing yesterday, but no rush... still learning the little tricks for keeping water-cooled VRAM above 45 C without heating up the core too much.


----------



## keikei

Guys, I'm gaming @ 4k in BF 2042. I have a 3700X with usage of 70-90%. I know the game isn't the best optimized; I had similar numbers with my prior 7800XT. Is the CPU the bottleneck?


----------



## motivman

bmagnien said:


> Is it worth swapping the stock bios on the gigaoc to the strix bios? Is there a meaningful improvement in performance? Perhaps from tighter vram timings? And if so - which exact strix bios? Thanks


With the Strix bios, my scores were slightly higher, but power usage was also higher (595W in Time Spy Extreme)... I went back to the regular Gigabyte bios until I get a waterblock.

*Rename the bios file to .rom before flashing.*
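For anyone doing this for the first time, the cross-flash itself is done with NVIDIA's nvflash tool. A hedged sketch of the usual sequence (the filenames are placeholders, not files from this thread; run it from an elevated prompt, and the snippet is a no-op on a machine without nvflash64):

```shell
# Typical vBIOS cross-flash steps with nvflash; guarded so nothing runs
# unless the tool is actually on PATH.
BIOS=Strix4090.rom              # placeholder name; the file must end in .rom
if command -v nvflash64 >/dev/null 2>&1; then
  nvflash64 --save backup.rom   # 1) always back up the current vBIOS first
  nvflash64 --protectoff        # 2) disable the EEPROM write protection
  nvflash64 -6 "$BIOS"          # 3) flash; -6 overrides the subsystem-ID mismatch check
fi
```

If anything goes wrong, booting from the iGPU and re-flashing `backup.rom` with the same tool is the usual recovery path.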


----------



## Nizzen

keikei said:


> Guys, I'm gaming @ 4k in BF 2042. I have a 3700X with usage of 70-90%. I know the game isn't the best optimized; I had similar numbers with my prior 7800XT. Is the CPU the bottleneck?


Game is very optimized now 

I think it's safe to say you are cpu bound and memory performance bound 

Set resolution to 1080p and if you don't have 250-320 fps, you are VERY cpu bound 

*PCMR*

*CPU*: AMD RYZEN 3700X, 3.6 ghz (base), 4.4 ghz (boost), 8c/16t | *Motherboard*: ASROCK TAICHI X370 | *GPU*: MSI Gaming Trio RTX 4090 | *RAM*: G.SKILL 32GB Ripjaws V Series DDR4 PC4-25600 3200MHz |


----------



## motivman

Panchovix said:


> Honestly not sure, maybe ASUS has tighter timings on their VBIOS? We can't know atm without an "ADA Bios Editor" (with editing disabled of course, but it lets you see info), like ABE (Ampere BIOS Editor) or TBE (Turing BIOS Editor), the only way is to test the same speed with different vbios and see which is faster lol


Asus bios scores higher and uses more power on my 4090 gaming OC


----------



## MrTOOSHORT

motivman said:


> Asus bios scores higher and uses more power on my 4090 gaming OC


I need to try that bios. Did some 3dmarks yesterday scoring pretty good on the stock giga OC bios.


----------



## yzonker

Firestrike is interesting on the 13900k. Apparently Windows 10/11 doesn't schedule the threads correctly, so the trick is to disable some e-cores, but not nearly all of them. Looking at the top scores, various numbers are disabled there too.

I had 12 enabled for this run. 









I scored 66 231 in Fire Strike


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## J7SC

motivman said:


> Asus bios scores higher and uses more power on my 4090 gaming OC


Maybe I'll give the Asus a try late next week. Absolute power consumption isn't a problem for the stock Giga-OC (OCCT proved that ), but I'm hoping that the VRAM is treated a bit more aggressively by the Strix V2. Getting over +1500 MHz on the MSI AB VRAM slider when the memory is below the mid-40s C is difficult unless I 'pre-heat' it with certain apps (doable). In the upper 40s C to 50 C, it can manage around +1530. At the same time, I like to keep the core below 43 C peak... so one kind of works against the other.



yzonker said:


> Firestrike is interesting on the 13900k. Apparently Windows 10/11 doesn't schedule the threads correctly, so the trick is to disable some e-cores, but not nearly all of them. Looking at the top scores, various numbers are disabled there too.
> 
> I had 12 enabled for this run.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 66 231 in Fire Strike
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


Nice ! I've yet to try a single run of the regular TimeSpy w/ the 4090...my 6900XT w/custom vbios would go berserk with TimeSpy re. peaks of 1000+ fps; that scares me...


----------



## yzonker

@J7SC Firestrike is, not surprisingly, extremely CPU dependent now with the 4090. Disabling some e-cores even has a significant impact on the graphics score. 

I've been mostly working on a good CPU config for it. Once I get that sorted out I may make a little more effort (chiller, backplate heater, etc...). Still my initial effort got me to 10th in the HOF so that's encouraging. 

And yea AMD kicks butt in FS and TS. It'll be interesting to see what the 7900 can do.


----------



## J7SC

yzonker said:


> @J7SC Firestrike is, not surprisingly, extremely CPU dependent now with the 4090. Disabling some e-cores even has a significant impact on the graphics score.
> 
> I've been mostly working on a good CPU config for it. Once I get that sorted out I may make a little more effort (chiller, backplate heater, etc...). Still my initial effort got me to 10th in the HOF so that's encouraging.
> 
> And yea AMD kicks butt in FS and TS. It'll be interesting to see what the 7900 can do.


For TS and FS on the 6900XT, I disabled one of the two CCDs of the CPU and clocked the remaining one up (spoiler re. the fps)...but I still have to run either FS or TS on the 4090. Part of my hesitation is the early Amazon New World fiasco of blowing up some folks' 3090s while exceeding 1000 fps in the menu phase. That was limited to only a few cards / specific models, but I much prefer 4K or 1440+DLSS benches anyhow, especially with a 5950X, even a good-clocking one... Of course, I saw the first references to the 13900KS (available ~5 weeks from now), and that plus 8 GHz A-die DDR5 would be a nice update and companion for the 4090...even the ~5 GHz 56-core Intel HEDTs (W790 chipset) are getting more play time in the rumour mills now...

...but as @Arizor confirmed to me, it is all a slippery slope 


Spoiler














I am getting a kick out of your 'chiller, backplate heater etc.' comment (as you know, I'm contemplating s.th. similar with a heater from an old LN2 pot). Then again, that might be easier than always trying to hit that very narrow window of more heat for the VRAM but less for the core w/good watercooling, much less sub-ambient...


----------



## 4ThreeX

Hey guys. Sorry for the newbie question, but are there any dangers at all to flashing a VBIOS from a different brand? In this case I wanted to flash a TUF BIOS to my 4090 iChill X3, to see if it'd bring down the idle voltage/power draw a bit. The TUF I had before this one (returned due to heavy coil whine) had my PC idling at 56W, while with this iChill X3 it idles at 63W, and generally just uses a bit more power. I've noticed it sits around 10mV higher minimum at all times than the TUF did.

So if all it realistically changes is the voltage curve, then that'd be perfect, but if there's some actual danger of ruining the card in any way, then it's probably not worth bothering with on such an expensive card for just a few watts.

I do have an iGPU if that helps, in case I'd have to reflash it, but the card does not have dual BIOS. I would not be raising the power limit at any time.

I also understand that this is mostly a pointless venture for such little gain, it just bugs me a little bit, that's all.


----------



## J7SC

4ThreeX said:


> Hey guys. Sorry for the newbie question, but are there any dangers at all to flashing a VBIOS from a different brand? In this case I wanted to flash a TUF BIOS to my 4090 iChill X3, to see if it'd bring down the idle voltage/power draw a bit. The TUF I had before this one (returned due to heavy coil whine) had my PC idling at 56W, while with this iChill X3 it idles at 63W, and generally just uses a bit more power. I've noticed it sits around 10mV higher minimum at all times than the TUF did.
> 
> So if all it realistically changes is the voltage curve, then that'd be perfect, but if there's some actual danger of ruining the card in any way, then it's probably not worth bothering with on such an expensive card for just a few watts.
> 
> I do have an iGPU if that helps, in case I'd have to reflash it, but the card does not have dual BIOS. I would not be raising the power limit at any time.
> 
> I also understand that this is mostly a pointless venture for such little gain, it just bugs me a little bit, that's all.


There always is _some_ risk when flashing a bios, but it is small as long as you follow the correct procedures and input commands. I am just not sure whether the net difference of 7 W is worth it; besides, each core has its own internal power table, depending on bin, as the bios is only part of the story. Another thing to consider is the I/O (ie. DP, HDMI); that can be affected if the TUF and iChill X3 have different I/O configs. The same holds for fan speeds.


----------



## xrb936

Guys, Asus is going to sell a PCB-only version of the 4090 for watercooling users. Any thoughts? I am assuming the GPUs will be specially binned.


----------



## ianann

xrb936 said:


> Guys, Asus is going to sell a PCB-only version of the 4090 for watercooling users. Any thoughts? I am assuming the GPUs will be specially binned.


What? Link or it doesn't happen!


----------



## yzonker

Looks like HWbot is continuing to move toward ECC on. 






Mandatory Systeminfo 5.56 Update and new benchmark rules for RTX40 series


Read the full article @ HWBOT




hwbot.org


----------



## 4ThreeX

J7SC said:


> There always is _some_ risk when flashing a bios, but it is small as long as you follow the correct procedures and input commands. I am just not sure whether the net difference of 7 W is worth it; besides, each core has its own internal power table, depending on bin, as the bios is only part of the story. Another thing to consider is the I/O (ie. DP, HDMI); that can be affected if the TUF and iChill X3 have different I/O configs. The same holds for fan speeds.


Thank you. I'll just leave it well alone then.


----------



## bmagnien

yzonker said:


> Looks like HWbot is continuing to move toward ECC on.
> 
> 
> 
> 
> 
> 
> Mandatory Systeminfo 5.56 Update and new benchmark rules for RTX40 series
> 
> 
> Read the full article @ HWBOT
> 
> 
> 
> 
> hwbot.org


I wonder if there’s any difference in vram temps with ECC on or off. Or if the temp / performance relationship changes at all with ECC on


----------



## yzonker

bmagnien said:


> I wonder if there’s any difference in vram temps with ECC on or off. Or if the temp / performance relationship changes at all with ECC on


Seems unlikely, but last time I tried I couldn't enable ECC on my card so obviously haven't tested it. Maybe it works now that I have the Strix bios flashed. Try it tonight maybe.


----------



## dboom

bmagnien said:


> I wonder if there’s any difference in vram temps with ECC on or off. Or if the temp / performance relationship changes at all with ECC on


At least 1000 points lower in 3DMark.
ECC ON: I scored 26 679 in Port Royal
ECC OFF: I scored 27 702 in Port Royal
Same in others.
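For anyone wanting to A/B this themselves, the NVCP ECC checkbox maps to a driver setting that `nvidia-smi` can also read and set (a mode change is only applied after a reboot). A minimal sketch, guarded so it is a harmless no-op on a machine without an NVIDIA driver:

```shell
# Read the current ECC mode; the fallback string is a stand-in for systems
# without an NVIDIA driver, not output captured from a real card.
if ! mode=$(nvidia-smi --query-gpu=ecc.mode.current --format=csv,noheader 2>/dev/null) \
    || [ -z "$mode" ]; then
  mode="Disabled"   # illustrative stand-in
fi
echo "Current ECC mode: $mode"
# To change it (the request takes effect at the next reboot):
#   nvidia-smi -e 1   # ECC on
#   nvidia-smi -e 0   # ECC off
```

Handy for verifying that a benchmark run really had ECC in the state you think it did before submitting.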


----------



## KedarWolf

yzonker said:


> Looks like HWbot is continuing to move toward ECC on.
> 
> 
> 
> 
> 
> 
> Mandatory Systeminfo 5.56 Update and new benchmark rules for RTX40 series
> 
> 
> Read the full article @ HWBOT
> 
> 
> 
> 
> hwbot.org


ECC On is to stop people that use the artefacts bug to get insane scores that should be invalid.


----------



## J7SC

yzonker said:


> Looks like HWbot is continuing to move toward ECC on.
> 
> 
> 
> 
> 
> 
> Mandatory Systeminfo 5.56 Update and new benchmark rules for RTX40 series
> 
> 
> Read the full article @ HWBOT
> 
> 
> 
> 
> hwbot.org


I got a kick out of HWBot's 'article', as there really was no additional information whatsoever. As discussed before, I think this ECC thing has become a farce at HWBot (and 3DMark) due to management caving in to folks who screamed very loudly, for their own not entirely altruistic reasons. Nobody wants to compete against bugged scores, but HWBot has plenty of those anyhow. What concerns me is not so much the effect on RTX4K owners _competing against each other_, but the fact that they now require a different setting for just one family of cards, and not for other GPUs (including by the same vendor). It introduces a questionable skew they are likely going to regret.

ECC introduces extra latencies and requires extra cycles whether artefacts are present or not, hence the big drop folks have reported. At the least, vendors could come up with an updated vbios and/or NVidia with an updated driver. What HWBot/3DM _should have done_ is introduce (for example, in SystemInfo / 3DM) an artefact checker - they can be quite small in terms of footprint, like the ancient one below (which still works re. artefacts). Alternatively, they could have licensed and custom-tuned such an artefact checker.


----------



## WayWayUp

I'm not sure e-cores have that much impact in FS.
I haven't done the standard run, as I just got this rig going, but I've done a couple of Fire Strike Ultra runs.

Haven't tuned the e-cores yet, just left them on auto.
Your physics scores seem a bit low?









This was one of the lower scores too; I have some closer to 61k.


----------



## yzonker

WayWayUp said:


> Im not sure e cores have that much impact in FS.
> I havent done standard as i just got this rig going but ive done a couple fire strike ultra runs
> 
> havent tuned e cores yet just left them on auto.
> Your physics scores seem a bit low?
> 
> View attachment 2585672
> 
> this was one of the lower scores too i have some closer to 61k


Huge impact in FS (the 1080p version). GT1 and GT2 are heavily CPU dependent, and it apparently runs them on the e-cores with them all enabled. Weirdly, disabling just 3 or 4 of them greatly boosts the score.

This is also why your combined score is nerfed. It should be 20k+. Disable 4 e-cores and run it again.

Here's what I got back when I first installed my 4090, 









I scored 57 063 in Fire Strike


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





And here's what I got yesterday with 4 e-cores disabled, 









I scored 66 231 in Fire Strike


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## WayWayUp

very interesting, i may try it out and see what comes up


yzonker said:


> This is also why your combined score is nerfed. It should be 20k+. Disable 4 e-cores and run it again.


Please note that the score I posted is for Fire Strike Ultra. Nobody in the world has scored 20k combined, even with a 13900K running 7.5 GHz on LN2.

There is a top-10 score on the HOF with only a 14,786 combined score.

In fact, my scores are good enough for top 10 in the world; I just need to manually tweak the curve so that it boosts higher in the first test when using less than 1.1v, but I haven't had time to mess around yet.


----------



## yzonker

WayWayUp said:


> very interesting, i may try it out and see what comes up
> 
> 
> Please note that the score I posted is for Fire Strike Ultra. Nobody in the world has scored 20k combined, even with a 13900K running 7.5 GHz on LN2.
> 
> There is a top-10 score on the HOF with only a 14,786 combined score.
> 
> In fact, my scores are good enough for top 10 in the world; I just need to manually tweak the curve so that it boosts higher in the first test when using less than 1.1v, but I haven't had time to mess around yet.


Yea looks like it's just FSE that scores about the same in the combined test as FS. Because of that I think I had assumed all 3 were similar. FSU does look lower for everyone.


----------



## Gking62

For the Strix 4090 OC, I'm looking for a diagram of the PCB that highlights the VRMs, coils, hotspot, etc. Let me know, thanks!


----------



## Nizzen

J7SC said:


> I got a kick out of HWBot's 'article', as there really was no additional information whatsoever. As discussed before, I think this ECC thing has become a farce at HWBot (and 3DMark) due to management caving in to folks who screamed very loudly, for their own not entirely altruistic reasons. Nobody wants to compete against bugged scores, but HWBot has plenty of those anyhow. What concerns me is not so much the effect on RTX4K owners _competing against each other_, but the fact that they now require a different setting for just one family of cards, and not for other GPUs (including by the same vendor). It introduces a questionable skew they are likely going to regret.
> 
> ECC introduces extra latencies and requires extra cycles whether artefacts are present or not, hence the big drop folks have reported. At the least, vendors could come up with an updated vbios and/or NVidia with an updated driver. What HWBot/3DM _should have done_ is introduce (for example, in SystemInfo / 3DM) an artefact checker - they can be quite small in terms of footprint, like the ancient one below (which still works re. artefacts). Alternatively, they could have licensed and custom-tuned such an artefact checker.
> View attachment 2585670


You can't turn off ecc on 30 series nvidia?


----------



## J7SC

Nizzen said:


> You can't turn off ecc on 30 series nvidia?


...certainly not on my dual 2080 tis. My 3090 series is currently not mounted & awaiting re-deployment. That said, what about all the previous results various folks subbed w/ 30 series - rerun everything ? What about my 6900XT ? Then there's the question of upcoming AMD 7900 XT/X that supposedly compete with some of the RTX4K...

IMO, built-in artefacts checker would be far better !


----------



## Nizzen

J7SC said:


> ...certainly not on my dual 2080 tis. My 3090 series is currently not mounted & awaiting re-deployment. That said, what about all the previous results various folks subbed w/ 30 series - rerun everything ? What about my 6900XT ? Then there's the question of upcoming AMD 7900 XT/X that supposedly compete with some of the RTX4K...
> 
> IMO, built-in artefacts checker would be far better !


Isn't 30 series ecc on only? Then no need to rerun?


----------



## neteng101

Nizzen said:


> Isn't 30 series ecc on only? Then no need to rerun?


I believe its a lesser form of ECC that can't be turned off, but is mostly hitless to performance (no performance loss). That's likely on 4090s as well without ECC too... too high a memory OC will result in lower scores before finally crashing if you push it further.

This ECC toggle makes no sense for benchmarking - checking for rounding error levels of computation is excessive and unnecessary. Pretty sure even if you can't see it - there are errors in various validated runs of other GPUs that don't have an ECC mode toggle.


----------



## J7SC

Nizzen said:


> Isn't 30 series ecc on only? Then no need to rerun?


...you're missing the point - it's not just the 30 series but _other GPUs_, as I keep pointing out. _Why not_ use a built-in artefact checker applied to _every_ GPU family, instead of introducing a bias ? Also, AFAIK, the ECC toggle was introduced for 'Titan class' cards in case of workstation usage.


----------



## yzonker

All right, tested 8-14 ecores. Whew...

Best came from 13 ecores simply because for whatever reason the combined score is huge then. Kills the graphics score though.









I scored 67 812 in Fire Strike


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





9 ecores was close and I think it might be possible to score about the same with that. Everything in between is all over the place.









I scored 67 353 in Fire Strike


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## Laithan

J7SC said:


> ...certainly not on my dual 2080 tis. My 3090 series is currently not mounted & awaiting re-deployment. That said, what about all the previous results various folks subbed w/ 30 series - rerun everything ? What about my 6900XT ? Then there's the question of upcoming AMD 7900 XT/X that supposedly compete with some of the RTX4K...
> 
> IMO, built-in artefacts checker would be far better !


If enabling ECC was not even an option then what? It really seems that this is a lousy "solution" given that they are depending on NVIDIA's option to do all the work... instead of addressing the problem (properly) themselves.


----------



## yzonker

Nizzen said:


> Isn't 30 series ecc on only? Then no need to rerun?


I never saw any regression on 30 series when running an XOC bios (but of course did on normal factory bios like everyone else). It would always increase in performance or crash. Same for both my 3080ti on the Galax 1kw bios and 3090 on the KP bios. 

I also have seen the same artifacts on those cards but I don't think it ever seemed to boost my score or possibly none of those completed. I don't remember for sure.


----------



## J7SC

...in 'ECC' news of a different kind, how about those 12 channel-DDR5 results for the new Epyc Genoa ? OK, latencies are not that great, but that bandwidth  ...Epyc Genoa 96 Core / 192 Threads (or two of those CPUs) might be fun to run in Cinebench 23 every morning...also, 128 PCIe lanes (up to 5.0 spec) of the Epyc Genoa could mean a whole nest full of 4090s.


----------



## yzonker

J7SC said:


> ...in 'ECC' news of a different kind, how about those 12 channel-DDR5 results for the new Epyc Genoa ? OK, latencies are not that great, but that bandwidth  ...Epyc Genoa 96 Core / 192 Threads (or two of those CPUs) might be fun to run in Cinebench 23 every morning...also, 128 PCIe lanes (up to 5.0 spec) of the Epyc Genoa could mean a whole nest full of 4090s.


Just needs a little memory tuning. How hard could it be...


----------



## yzonker

There's this one too. Graphics score looks a little weak though. Physics score is carrying me.









I scored 27 993 in Fire Strike Ultra


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## neteng101

yzonker said:


> I never saw any regression on 30 series when running an XOC bios (but of course did on normal factory bios like everyone else). It would always increase in performance or crash.


Wish I could say the same - there was a definite regression beyond +1000 memory on my 3080Ti with the Galax bios. +900/+1000 was about dead even with margin of error differences in scores. It took something much higher to crash though, around +1300/+1400, been a while so I forget the exact point. I was air cooled though.

I see regressions on the 4090 beyond +1500, crashing point is past +1700, no ECC enabled in NVCP.

I believe there's some form of limited memory error correction that was introduced in Ampere and carried on between gens, but its not the full blown ECC functionality toggle switch. And water cooled cards seem to always have more headroom, until you hit that crash and burn point.


----------



## yzonker

neteng101 said:


> Wish I could say the same - there was a definite regression beyond +1000 memory on my 3080Ti with the Galax bios. +900/+1000 was about dead even with margin of error differences in scores. It took something much higher to crash though, around +1300/+1400, been a while so I forget the exact point. I was air cooled though.
> 
> I see regressions on the 4090 beyond +1500, crashing point is past +1700, no ECC enabled in NVCP.
> 
> I believe there's some form of limited memory error correction that was introduced in Ampere and carried on between gens, but its not the full blown ECC functionality toggle switch. And water cooled cards seem to always have more headroom, until you hit that crash and burn point.


I think I have seen some plateauing on my 4090, but not actual regression. Maybe same for my 3080ti now that you mentioned it. But never regression. And only within a small range, like 100. IIRC, my 3080ti would pretty much top out around +1300 and score similarly up to +1400 before crashing. 

So maybe something, but not nearly as pronounced as say the original EVGA 3080ti bios which would regress. There's a difference anyway. That I'm sure of.


----------



## J7SC

From what I recall, +1275 to +1325 were (are) the _most efficient_ settings on VRAM for my 3090 Strix, with a slight shift from the stock bios to the 520 KPE. I am planning to redeploy that 3090 Strix and the 6900XT (both water-cooled w/extra heatsinks on the back) into a backup work+play setup running a Threadripper and two monitors, or else a 3950X setup...either one w/Win 10 Pro and Ubuntu dual-boot...

One day, the 4090 will go on a similar journey...unless I overdo it it with the GDD6X preheating  


Spoiler












leser installation







www.youtube.com


----------



## motivman

PhuCCo said:


> Can you post a picture of your card when you take it apart? Seeing the compression marks on the pads and the thermal paste spread on the cooler would maybe help show the issue.
> 
> The oddball sized pads are always a headache. You either try to add a buffer to thinner pads to try and get the extra thickness, or you try to compress thicker pads in hopes that they compress enough. You could use 1mm pads and add some thermal paste on top of the pads to make the material a bit thicker. Or just use some thermal putty in place of the 1.2mm pads. EVGA has thermal putty on their website that you can buy in a syringe and apparently it's pretty good stuff. I haven't personally tested it, but I am going to grab some and try it. You just extrude it out and it fills in all of the gaps.


How is your experience with the Bykski block for the Gaming OC? I am about to order it, but looking at the instructions online, it looks like it only covers less than half of the components with thermal pads compared to the stock cooler. Did you use the stock pads that came with the block, and did you use any pads on the backplate? I was reading on Reddit that one user said they got way better temps using the stock pads that came with the air cooler.


----------



## motivman

mirkendargen said:


> I didn't measure mine, but eyeballing them compared to the 1.5mm Arctic pads I used the Bykski ones were *slightly* thicker. Now I don't know if the Arctic ones were exactly 1.5mm either, but they definitely made contact fine when I used them.


Did you use 1.5mm Arctic TP-3? Did you cover just the components shown in the Bykski online manual? Did you use any thermal pads on the backplate? I am getting impatient waiting for the EK or Phanteks block, so I'm thinking about ordering the Bykski block for my 4090 Gaming OC.


----------



## J7SC

@motivman ...temps w/ stock air and the Bykski block below. FYI, unlike the stock cooler, the Bykski block doesn't use any thermal pads for the chokes, but they don't really need it in the first place as it is the caps that get hot. For those, you may want to use either the stock pads that came with the Bykski (~ 1.5mm) or s.th. very slightly thicker but softer as @mirkendargen pointed out. FYI, I used thermal putty for the VRAM and on the back making contact with the backplate and its extra heatsink I mounted. VRAM on my card always ran very cool (even in stock at just ~ 20 C to 25 C delta over ambient) but thermal putty lowered it by another 8 C or so...


----------



## Panchovix

Just noticed there was a new VBIOS for the ASUS TUF non-OC lol
TIL I couldn't enable ECC (didn't test)


----------



## motivman

J7SC said:


> @motivman ...temps w/ stock air and the Bykski block below. FYI, unlike the stock cooler, the Bykski block doesn't use any thermal pads for the chokes, but they don't really need it in the first place as it is the caps that get hot. For those, you may want to use either the stock pads that came with the Bykski (~ 1.5mm) or s.th. very slightly thicker but softer as @mirkendargen pointed out. FYI, I used thermal putty for the VRAM and on the back making contact with the backplate and its extra heatsink I mounted. VRAM on my card always ran very cool (even in stock at just ~ 20 C to 25 C delta over ambient) but thermal putty lowered it by another 8 C or so...
> View attachment 2585821


Wow, great temps with the Bykski block!!! I found it on Amazon, going to order it. What is "s.th."? I was thinking about using the leftover 1.5mm Arctic TP-3 from when I repadded the air cooler instead of the stock pads.









Amazon.com: Bykski Copper GPU Water Cooling Block GPU Waterblock Graphics Card Water Cooling Block for Gigabyte Aorus GeForce RTX 4090 Master Gaming (5V ARGB RBW Aura Effect LED Lights GPU Block with Backplate) : Electronics




----------



## mirkendargen

motivman said:


> Did you use 1.5mm ARCTIC TP-3? did you cover just the components shown in the bykski online manual? did you use any thermal pads on the backplate? I am getting impatient waiting for the EK or phanteks block, so thinking about ordering the bykski block for my 4090 gaming OC


I put them on everything the stock cooler did plus a bunch on the backplate. You can't hurt anything by using extra thermal pads, particularly super soft ones that won't interfere with contact.

If I were doing it over again, I'd honestly probably use something worse on the VRAM (or leave one side of the plastic backing on) because I have quite cold coolant in the winter lol.


----------



## Sheyster

Panchovix said:


> Just noticed there was a new VBIOS for the ASUS TUF non-OC lol
> TIL I couldn't enable ECC (didn't test)
> View attachment 2585824


They've also released the V2.1 update for the Strix as well.


----------



## motivman

mirkendargen said:


> I put them on everything the stock cooler did plus a bunch on the backplate. You can't hurt anything by using extra thermal pads, particularly super soft ones that won't interfere with contact.
> 
> If I were doing it over again, I'd honestly probably use something worse on the VRAM (or leave one side of the plastic backing on) because I have quite cold coolant in the winter lol.
> View attachment 2585842


Awesome. So just to make sure I am getting this right: you used 1.5mm Arctic TP-3 on everything that was covered by the stock air cooler, and it compressed with the block just fine? Those temps are fantastic. What thermal paste did you use for the core? I am thinking of going with KPX....


----------



## mirkendargen

motivman said:


> Awesome. so just to reiterate, to make sure I am getting this right, you used 1.5mm artic TP-3 on everything that was covered by stock air cooler and it compressed with the block just fine? those temps are fantastic. what thermal paste did you use for the core? I am thinking of going with KPX....


That is correct. I used Honeywell PTM7950 on the core. Those max temps aren't from an especially heavy load: I fired up Quake 2 RTX to get them, so the maxes are under a ~550W load, while the current readings are idle.









The problem I'm having is that under lighter 3D loads my VRAM only hits 24C, and I had to reduce it from +1400 to +1200 to avoid artifacts, even though +1400 is totally fine at heavy-load temperatures.


----------



## motivman

mirkendargen said:


> That is correct. I used Honeywell PTM7950 on the core. Those temps didn't have a heavy load on the max temps, I fired up Quake 2 RTX to get these, so these maxes are under a 550W load, the current temp is idle.
> View attachment 2585849
> 
> 
> The problem I'm having is that under lighter 3d loads my VRAM only hits 24C, and I had to reduce it from +1400 to +1200 to not get artifacts when +1400 is totally fine at heavy load temperature.


How are your temps so low? Are you using a chiller or something?


----------






## motivman

J7SC said:


> @motivman ...temps w/ stock air and the Bykski block below. FYI, unlike the stock cooler, the Bykski block doesn't use any thermal pads for the chokes, but they don't really need it in the first place as it is the caps that get hot. For those, you may want to use either the stock pads that came with the Bykski (~ 1.5mm) or s.th. very slightly thicker but softer as @mirkendargen pointed out. FYI, I used thermal putty for the VRAM and on the back making contact with the backplate and its extra heatsink I mounted. VRAM on my card always ran very cool (even in stock at just ~ 20 C to 25 C delta over ambient) but thermal putty lowered it by another 8 C or so...
> View attachment 2585821


So with your installation, you used thermal pads on only the chokes like Bykski suggested, and no issues at all?


----------



## mirkendargen

motivman said:


> how are your temps so low? are you using a chiller or something?


Winter, rad in the basement.


----------



## motivman

mirkendargen said:


> Winter, rad in the basement.


nice. I am really thinking of implementing a chiller into my loop, but not sure how... haha.


----------



## mirkendargen

motivman said:


> nice. I am really thinking of implementing a chiller into my loop, but not sure how... haha.


Active Aqua aquarium chillers off Amazon seem like the most straightforward way.


----------



## dr/owned

motivman said:


> nice. I am really thinking of implementing a chiller into my loop, but not sure how... haha.


It's really not worth it. A lot of cost, both up front and in electricity, and you don't get temps low enough to matter anyway. A giant radiator located away from the case is the meta.


----------



## J7SC

...basically, you want the core just cold enough not to drop any (or more than one) speed bin (peak temp at ~43 C or less) while keeping the VRAM warmer (43 C or more) for ~ +1500 or so...can be done, but tricky prep.


----------



## J7SC

motivman said:


> so with your installation, you used thermal pads on only the chokes like Bykski suggested, and no issues at all?
> View attachment 2585852


...only on the mosfets (also per Bykski's diagram above)


----------



## dr/owned

J7SC said:


> ...only on the mosfets (also per Bykski's diagram above)


Alphacool also doesn't bother cooling anything but the mosfets. I think the AIBs go too crazy stock, putting thermal pads on everything including capacitors and inductors.


----------



## mirkendargen

dr/owned said:


> Alphacool also doesn't bother cooling anything but the mosfets. I think stock the AIB's go too crazy putting thermal pads on everything including capacitors and inductors.
> 
> View attachment 2585870


It doesn't matter for cooling, but soft pads on inductors and caps can help dampen coil whine. That's why I do it.


----------



## PhuCCo

motivman said:


> How is your experience with the bykski block for the gaming OC? I am about to order it, but looking at the instructions online, it looks like it only covers less than half of the components with thermal pads compared to the stock cooler? did you use the stock pads that came with the block, and did you use any pads on the backplate? I was reading on reddit, one user said the got way better temps using stock pads that came with the air cooler.


I'm sorry that I didn't see your message. 

The Bykski block is pretty awesome. The stock Gaming OC had thermal pads on absolutely everything, but most of them aren't necessary. I don't see how that user used the stock air cooler pads, as those are 1mm-1.2mm from my measurements. 

You can use the stock Bykski pads that come with the block. They're 1.8mm thick, but they have the consistency of chewing gum (literally), so they easily compress to below 1.5mm. You could use Arctic TP-3 1.5mm pads like some others have posted; that's what I used. They aren't anywhere near as soft, but they don't have to be. I am curious about longevity though, as I played with them a bit and it didn't take long for the pads to start crumbling and losing oil, whereas the Bykski pads seem indestructible. 

The block doesn't actively cool the inductors or caps on the board, but the tolerances between the inductors and block are so close that I just put a dab of putty on each choke to have some contact. It isn't at all necessary, I just figured I would use the surface area since the tolerances are so close. I don't have an exact measurement of the gap, so I don't know if a 0.5mm pad would work or still be too thick. Just put some pads on the backplate side behind them.

For the backplate side you need 1.5mm and 3mm sizes: 3mm for the areas where the pad touches the PCB and backplate (behind VRAM, behind VRM, behind front-side caps, behind the power connector if you want), and 1.5mm on the rear caps since they stick up off the board. I stacked two 1.5mm pads to get to 3mm and the contact is still pretty good. Everything on the board has contact and the PCB has zero warp to it.


----------



## motivman

Awesome, thanks so much for the help guys. Can't wait to put this card on water. As long as I can run +1500 memory with cold temps, I will be happy.


----------



## PhuCCo

I haven't personally run into the issues with cold temperatures when overclocking the memory, but I believe it. I would maybe purposefully use poopy 1.5mm thermal pads on the memory like others have suggested.


----------



## mirkendargen

PhuCCo said:


> I haven't personally ran into the issues with cold temperatures when overclocking the memory, but I believe it. I would maybe purposefully use poopy 1.5mm thermal pads on the memory like others have suggested.


I honestly think the way to go might be to leave the plastic backing on the block side of the VRAM pads.... Maybe you have 60C VRAM temps, but who cares? That's fine.


----------



## Jay-G30

The trick everyone is missing is NO Thermal pads  , Vram temps back into the 110deg like in the good old days of mining😂


----------



## motivman

mirkendargen said:


> I honestly think the way to go might be to leave the plastic backing on the block side of the VRAM pads.... Maybe you have 60C VRAM temps, but who cares? That's fine.


lol, I don't know about that, might see VRAM temps like the 3090's with that method. I think I will experiment with the EK stock thermal pads, but not sure those compress like the Arctic TP-3 pads. I have those in all sizes though, due to years of buying EK waterblocks, haha.


----------



## mirkendargen

motivman said:


> lol, I don't know about that, might see vram temps like the 3090's with that method. I think I will experiment with the EK stock thermal pads, but not sure those compress like the artic tp3 pads. I have those in all sizes though, due to years of buying EK waterblocks, haha.


When my MoRa gets here in a month or so and I need to drain my loop anyway, I might try it for the hell of it. Or maybe half the backing on or something.


----------



## theilya

Any gains with upgrading to 600w bios for Zotac AMP?


----------



## Spin Cykle

Good day gentlemen! New 4090 Gaming OC owner checking in. My card seems to be very average: the core will boost to 3030MHz stable and +1900 on the mems in Port Royal, on air at 100% fan. I caved and bought the Bykski block on BF for $120. Doing some reading, I've also bought Arctic TP-3 1.5mm pads. Does anyone have a recommendation for thermal putty? Cheers!


----------



## Blameless

mirkendargen said:


> It doesn't matter for cooling, but soft pads on inductors and caps can help dampen coil whine. That's why I do it.


Colder inductors often have worse coil whine. Not sure why pads on capacitors would matter one way or another though.



Spin Cykle said:


> Does anyone have a recommendation for thermal putty? Cheers!


TG-PP-10.


----------



## Sheyster

Here is the new ASUS Strix V2.1 OC BIOS, Power limit 100%=500w, 120%=600w

ASUS changelog:

"Fix ECC couldn’t be enable issue."

*Rename the file extension to .bin before flashing.*
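If you'd rather script the rename, a minimal sketch looks like this (the filename here is a placeholder, not the real ASUS download name, and the commented NVFlash flags are the usual TechPowerUp ones, so double-check against your copy before flashing):

```python
# Rename a downloaded vBIOS from .rom to .bin before flashing.
# The filename is a placeholder standing in for the actual ASUS download.
from pathlib import Path

rom = Path("RTX4090_STRIX_V21.rom")
rom.write_bytes(b"")                    # stand-in for the downloaded file
rom.rename(rom.with_suffix(".bin"))     # the actual rename step
print(Path("RTX4090_STRIX_V21.bin").exists())   # True

# Then flash from an elevated prompt with TechPowerUp's NVFlash, e.g.:
#   nvflash64 --protectoff
#   nvflash64 -6 RTX4090_STRIX_V21.bin
```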


----------



## J7SC

Sheyster said:


> Here is the new ASUS Strix V2.1 OC BIOS, Power limit 100%=500w, 120%=600w
> 
> ASUS changelog:
> 
> "Fix ECC couldn’t be enable issue."
> 
> *Rename the file extension to .bin before flashing.*


Noice ! Quick question: rename .bin, not .rom ?


----------



## Sheyster

J7SC said:


> Noice ! Quick question: rename .bin, not .rom ?


I think both will work! I always use .bin...


----------



## J7SC

"FYI"


----------



## Alemancio

Kinda odd my Gigabyte Gaming OC will *only take +500 on memory*. Any idea what it could be? A dud? The temps are low.


----------



## KingEngineRevUp

dr/owned said:


> I think stock the AIB's go too crazy putting thermal pads on everything including capacitors and inductors.


Perhaps they do it to reduce coil whine?


----------



## kx11




----------



## rahkmae

Strix with EK block and Thermal Grizzly Minus Pad 8


----------



## Benni231990

Can anybody upload the new Neptune OC BIOS from Gamers Nexus?


----------



## ttnuagmada

Spin Cykle said:


> Good day gentlemen! New 4090 Gaming OC owner checking in. My card seems to be very average. Core will boost to 3030mhz core stable and +1900 mems in Port Royal On air and 100% fan. I caved and. Bought the Bykski block on BF, $120. Doing some reading, I’ve also bought Arctic TP-3 1.5mm pads. Does anyone have a recommendation for thermal putty? Cheers!


I always go the liquid metal route.


----------



## Nunzi

rahkmae said:


> strix with ekw block and grizzli minus pad 8
> View attachment 2585997


Loving my EK block!


----------



## J7SC

FYI, measuring for the right thermal pad height...
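Once you have a gap measurement, picking the nominal pad size can be sketched with a rough rule of thumb (my own assumption, not a Bykski or Arctic spec): choose a pad roughly 20-30% thicker than the gap so it compresses into full contact without excessive mounting pressure.

```python
# Rule-of-thumb sketch (assumption, not a manufacturer spec): pick a pad
# ~20-30% thicker than the measured component-to-block gap.
def pad_thickness_mm(gap_mm: float, compression: float = 0.25) -> float:
    """Nominal pad thickness for a measured gap, assuming ~25% compression."""
    return round(gap_mm / (1.0 - compression), 2)

print(pad_thickness_mm(1.1))   # -> 1.47, so a 1.5 mm pad is the nearest stock size
```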


----------



## keikei

I went with a 5900X as my current chip was getting choked playing BF 2042. Big sale on right now @ amazon. It went for $340.


----------



## ianann

J7SC said:


> FYI, measuring for the right thermal pad height...


I don't get why you'd use soft pads on the RAM (and on capacitors at all), while it supposedly 'doesn't matter' that hard pads go on the inductor coils (why pad those at all?) and the MOSFETs. I mean, pressure is pressure: you need roughly a 0.4mm squeeze on those ICs with hard pads, yet you 'should use' soft pads on the even smaller gaps over the RAM. This doesn't really make sense to me.


----------



## Sheyster

Benni231990 said:


> Can anybody upload the New Neptune OC BIOS from gamer Nexus?


If you have the first one already flashed you can update it with this:

https://endownload.colorful.cn/EnDownload/GraphicsCard/2022/RTX4090_NEPTUNE_BIOS.rar

You might be able to find the new bios file in the Techpowerup vbios database, check the unverified ones as well. The versions you're looking for are:

RTX4090 Neptune Original frequency: *95.02.18.80.E3*, Overclocking: *95.02.18.80.E4*


----------



## MrB123

Spin Cykle said:


> Good day gentlemen! New 4090 Gaming OC owner checking in. My card seems to be very average. Core will boost to 3030mhz core stable and +1900 mems in Port Royal On air and 100% fan. I caved and. Bought the Bykski block on BF, $120. Doing some reading, I’ve also bought Arctic TP-3 1.5mm pads. Does anyone have a recommendation for thermal putty? Cheers!


What are your temps? My Giga OC runs hot, over 90C on the hotspot.


----------



## WayWayUp

MSI Afterburner can't do a curve above 3GHz?


----------



## coelacanth

WayWayUp said:


> msi afterburner cant do a curve above 3Ghz?


You have to edit the config file. Search the thread.


----------



## jootn2kx

coelacanth said:


> You have to edit the config file. Search the thread.


Could you post the link? I was wondering the same, but can't really find it in the search.


----------



## Riadon

jootn2kx said:


> Could you post the link I was wondering the same, but can't really find it in the search


I don't even know where the search function is, so I looked in the install folder: it's MSIAfterburner.cfg, and you need to change VFCurveEditorMaxFrequency.

I changed VFCurveEditorMaxVoltage as well, to 1100, since the cards are locked there anyway.
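If you'd rather script the edit than hunt for the keys by hand, a minimal sketch could look like this (key names are the ones above; the sample text, function name, and chosen limits are mine, and you should close Afterburner before touching the file):

```python
import re

# Sketch: raise Afterburner's V/F curve editor limits by rewriting the two
# keys in MSIAfterburner.cfg. Key names are from the post; the sample string
# and new limits are illustrative assumptions.
def raise_curve_limits(cfg_text: str, max_mhz: int = 3200, max_mv: int = 1100) -> str:
    cfg_text = re.sub(r"(VFCurveEditorMaxFrequency\s*=\s*)\d+",
                      lambda m: m.group(1) + str(max_mhz), cfg_text)
    cfg_text = re.sub(r"(VFCurveEditorMaxVoltage\s*=\s*)\d+",
                      lambda m: m.group(1) + str(max_mv), cfg_text)
    return cfg_text

sample = "VFCurveEditorMaxFrequency = 3000\nVFCurveEditorMaxVoltage = 1093"
print(raise_curve_limits(sample))
```

In practice you would read the real file with `Path(...).read_text()`, pass it through, and write it back.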


----------



## Betroz

Is it really worth it going with a 500/600W BIOS card like the Asus Strix OC over a 450/450W one like the Gainward GeForce RTX 4090 Phantom or Asus TUF non-OC?
One thing that could be worth it for me, is lower temps on the memory and hotspot due to a better designed cooler. But man the Strix card is TOO big


----------



## Nizzen

Betroz said:


> Is it really worth it going with a 500/600W BIOS card like the Asus Strix OC over a 450/450W one like the Gainward GeForce RTX 4090 Phantom or Asus TUF non-OC?
> One thing that could be worth it for me, is lower temps on the memory and hotspot due to a better designed cooler. But man the Strix card is TOO big


Not worth


----------



## ttnuagmada

Betroz said:


> Is it really worth it going with a 500/600W BIOS card like the Asus Strix OC over a 450/450W one like the Gainward GeForce RTX 4090 Phantom or Asus TUF non-OC?
> One thing that could be worth it for me, is lower temps on the memory and hotspot due to a better designed cooler. But man the Strix card is TOO big


You won't really be power limited at stock settings, but if you plan to OC it might be worth it.


----------



## mirkendargen

Betroz said:


> Is it really worth it going with a 500/600W BIOS card like the Asus Strix OC over a 450/450W one like the Gainward GeForce RTX 4090 Phantom or Asus TUF non-OC?
> One thing that could be worth it for me, is lower temps on the memory and hotspot due to a better designed cooler. But man the Strix card is TOO big


Really just depends what 5-10% performance is worth to you.


----------



## Krzych04650

Betroz said:


> Is it really worth it going with a 500/600W BIOS card like the Asus Strix OC over a 450/450W one like the Gainward GeForce RTX 4090 Phantom or Asus TUF non-OC?
> One thing that could be worth it for me, is lower temps on the memory and hotspot due to a better designed cooler. But man the Strix card is TOO big





mirkendargen said:


> Really just depends what 5-10% performance is worth to you.


It is not even 10%, it is like 2%, at most. It may seem like higher power limit does something significant because it stabilizes clocks and allows for that stable 3000 MHz at all times, but if you then power limit the same overclock to 450W, it will only lose anywhere from 0.5% to 2% performance.









This is using the same settings, just different power limits: +130/+1250, target clock 3060 MHz. I am using Ryse because it is the most power-hungry game I could find, way more hungry than something like Port Royal, which stops reporting power limit at around 520W; here it still slams the power limit at 560W, which is the highest it goes with the 630W BIOS. So this is a worst-case scenario, similar to Metro Exodus, which can also do this if pushed to the max at high resolution. Most games do not draw this much, and the difference becomes even less.
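To put the efficiency side of that in numbers, here is some illustrative arithmetic (the absolute fps values are made up; only the ~2% gap comes from the comparison above):

```python
# Illustrative only: ~2% fps loss capping 560 W down to 450 W, per the
# comparison above. The absolute fps numbers are hypothetical.
def perf_per_watt(fps: float, watts: float) -> float:
    return fps / watts

fps_560, fps_450 = 100.0, 98.0            # hypothetical fps, ~2% apart
e560 = perf_per_watt(fps_560, 560.0)
e450 = perf_per_watt(fps_450, 450.0)
print(round(e450 / e560 - 1.0, 2))        # 0.22: ~22% better fps-per-watt at 450 W
```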


----------



## lawson67

Betroz said:


> Is it really worth it going with a 500/600W BIOS card like the Asus Strix OC over a 450/450W one like the Gainward GeForce RTX 4090 Phantom or Asus TUF non-OC?
> One thing that could be worth it for me, is lower temps on the memory and hotspot due to a better designed cooler. But man the Strix card is TOO big


Not worth it. Going from 450W to 500W brings 1 extra FPS in the Tomb Raider benchmark; going up to +1500MHz on the VRAM brings about 5 FPS or more (I'd have to check it again). Good VRAM is what you want on these cards. I don't even bother running over 450W, it's a complete waste of power.


----------



## Artjsalina5

Too bad my card can only do +1300


----------



## motivman

MrB123 said:


> Whats your temps? my Giga OC runs hot over 90c on the hotspot


I thought I was the only one, lol...


----------



## Artjsalina5

Anyone here with experience with Honeywell PTM7950... would it be worth using on a waterblock? It seems the phase change occurs at a higher temperature than a waterblock would reach.


----------



## mirkendargen

Artjsalina5 said:


> Anyone that has experience with Honeywell PTM7950 .... Would it be worth to use on a waterblock? It seems the phase change occurs higher than the temps the waterblock would bring.


Works fine on mine at temperatures below 45C, although I have gotten it above 45C to initially melt it for good contact (turned the fans off on my rad for awhile to let my coolant heat up).


----------



## cheddardonkey

I didn't see any other posts on it, so here's a look at Corsair's FE block. I know it's not the best block, but I'm not sure I want the best cooling for this card. Found a short-time deal at a crazy good price, so I took the gamble over an EKWB or Bykski. It is heavier than I expected and looks to have some construction improvements over the 30-series blocks. Still, hoping it performs well enough... and RGB, meh, don't care, I like more black.


----------



## Riadon

Artjsalina5 said:


> Too bad my card can only do +1300


I can only do +1350, unfortunate because +1600 or more is no issue if the vram is warm. I was cruising through games until I hit the Dying Light 2 main menu where I will instantly crash at 1400 or higher.


----------



## kayawish24

pat182 said:


> hope its gonna fix my black screen issue


I have the same PC and GPU as you, and I get a black screen every time the PC is idle. Any fix you found, except setting prefer maximum performance mode?


----------



## motivman

kayawish24 said:


> i have same PC and GPU like you have black screen every time when PC is idle.Any fix you found exceot that prefer performance mode ?


Is your memory overclocked? If it is, it may be overclocked too high: when the VRAM gets too cold, black screen.


----------



## WayWayUp

thanks for the info about tweaking afterburner config file
I’ve had some success

















I scored 27 803 in Fire Strike Ultra


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com





Now I have the #15 score
I think I can improve into the top 10 with more tweaking, but obviously can't compete with LN2.

I've pretty much given up on Port Royal. Everyone borks the memory to rig their scores. Plus, I'm on the leaderboards already and lost interest in keeping running that. For me it's just FSU and TSE.


----------



## schoolofmonkey

Betroz said:


> Is it really worth it going with a 500/600W BIOS card like the Asus Strix OC over a 450/450W one like the Gainward GeForce RTX 4090 Phantom or Asus TUF non-OC?
> One thing that could be worth it for me, is lower temps on the memory and hotspot due to a better designed cooler. But man the Strix card is TOO big


I'd thought about it with my Galax 4090 SG, but it already has a 510W limit. I can hit 3GHz in Time Spy Extreme, but it isn't really that stable.
I've just whacked on +150MHz core, +500MHz VRAM and it's happy: boosts to 2950MHz, stays cool at 70C, draws 495W. So I don't think it's really worth it overall for gaming; if you're chasing 3DMark numbers, then maybe.


----------



## bmagnien

Sheyster said:


> Here is the new ASUS Strix V2.1 OC BIOS, Power limit 100%=500w, 120%=600w
> 
> ASUS changelog:
> 
> "Fix ECC couldn’t be enable issue."
> 
> *Rename the file extension to .bin before flashing.*


This is giving me by far the largest power draw I've seen thus far, compared to the stock Giga OC 600W BIOS and the Colorful Neptune 630W BIOS: 605 watts! Will get benching to see if this, paired with potentially tighter VRAM timings, yields any tangible performance increase.


----------



## J7SC

^Nice! I'm still on the Giga OC stock vbios and should try that new Asus one sooner or later.

For those contemplating water-cooling (I'm on the Bykski block), below are three temp comps...one each for FS 2020 and Cyberpunk '77 with a mild OC (1.05v, 115% out of 133%, 24.5 C ambient). Then I tried a Firestrike Ultra run with an almost full OC and to my surprise hit 14th in the HoF; I hadn't run FS Ultra in years. Same ambient as the game numbers, and no special bench prep. Not bad for an 'old' 5950X. The system (i.e. fans) is totally silent, a great plus with water-cooling. I wonder what this would do with the new Asus vbios and a bit of bench prep.


----------



## LtMatt

Krzych04650 said:


> It is not even 10%, it is like 2%, at most. It may seem like higher power limit does something significant because it stabilizes clocks and allows for that stable 3000 MHz at all times, but if you then power limit the same overclock to 450W, it will only lose anywhere from 0.5% to 2% performance.
> View attachment 2586097
> 
> 
> This is using the same settings just different power limits, +130/+1250, target clock 3060 MHz. I am using Ryse because it is the most power-hungry game I could find, way more hungry than something like Port Royal that stops reporting power limit at around 520W, here it sill slams power limit at 560W, which is the highest it goes with 630W BIOS, so this is worst case scenario, similar to Metro Exodus which can also do this if pushed to the max at high resolution. Most games do not draw this much and the difference becomes even less.


Where did you get a 630W BIOS? I thought the highest was 600W. Edit: oh, the Colorful BIOS.


----------



## Betroz

schoolofmonkey said:


> so I don't think it's really worth it overall for gaming, if you're chasing 3DMark numbers, then maybe..


No I am not. I care about gaming performance only. I play mostly BF2042 on my LG 3840x1600 160 Hz monitor, and my old 3080Ti is not fast enough to push over 160 fps at all times.

I will probably go for the cheapest Asus TUF card, if I can get hold of one before the retailer phases it out. From what I can gather, they only want to sell the TUF OC version... shocker...


----------



## schoolofmonkey

Betroz said:


> No I am not. I care about gaming performance only. I play mostly BF2042 on my LG 3840x1600 160 Hz monitor, and my old 3080Ti is not fast enough to push over 160 fps at all times.
> 
> I will probably go for the cheapest Asus TUF card - if I can get hold of one before the retailer fases it out. From what I can gather, they only want to sell the TUF OC version... shocker...


Yeah, even this Galax 4090 kills the EVGA 3090 Hybrid I previously had; surprisingly it's even quieter than the Hybrid.

This was just a straight swap going from the 3090 to the 4090:


----------



## KingEngineRevUp

mirkendargen said:


> Really just depends what 5-10% performance is worth to you.


No way, a good chunk comes from memory OC, and that doesn't require as much power as the core at higher voltage.

5-6% of that 5-10% is from memory OC on average.


----------



## rahkmae

Riadon said:


> I can only do +1350, unfortunate because +1600 or more is no issue if the vram is warm. I was cruising through games until I hit the Dying Light 2 main menu where I will instantly crash at 1400 or higher.


I can also do +1350, and the EK waterblock didn't change anything.


----------



## yzonker

J7SC said:


> ^nice ! I'm still on the Giga-OC stock vbios and should try that new Asus sooner or later.
> 
> For those contemplating water-cooling (I'm on the Bykski block), below are three temp comps...one each for FS 2020 and Cyberpunk '77 with mild oc (1.05v, 115% out of 133%, 24.5 C ambient). Then I tried a Firestrike Ultra run with almost full oc and to my surpise hit 14th in HoF - I hadn't run FSUltra in years. Same ambient as per the game numbers, and no special bench prep. Not bad for an 'old' 5950X. The system (ie. fans) is totally silent; a great plus with water-cooling. I wonder what this would do with the new Asus vbios and a bit of bench prep.
> View attachment 2586147


That's interesting. The e-cores are still hurting the Intel scores, just like in 1080p FS, but not as severely. I ran this with all e-cores disabled, which gets much closer on the graphics score.

All e-cores disabled:










All e-cores enabled:


----------



## foresttree1

Hi,


I have an iGame RTX 4090 Vulcan OC paired with a 13700KF and 5600MHz DDR5 RAM. So far I am able to get +160 on the core and +1600MHz on the memory. But my Port Royal score seems a bit low compared to others with similar clocks: I score around 27100 consistently, which is over 1000 points below the 28000+ most people get. I tried flashing the Asus BIOS but got nearly identical scores. Any ideas as to why? Thanks

Port Royal


----------



## rahkmae

foresttree1 said:


> Hi,
> 
> 
> I have iGame RTX 4090 Vulcan OC paired with a 13700KF and 5600mhz DDR5 ram. So far I am able to get +160 on the core and +1600mhz on the memory. But my port royal score seems a bit low compared to other with similar clock. I scored around 27100 consistently which seem lower than most which score 28000+ which is over 1000+ points. Any idea on why its much lower. I tried flashing Asus bios but I got nearly identical scores too. Any ideas as to why? Thanks
> 
> Port Royal


Better than my score: with an Asus Strix and EK block I scored 26 273 in Port Royal.

I think my old Z490 board's PCIe is the bottleneck.


----------



## foresttree1

rahkmae said:


> Better when my score Asus strix and ekw I scored 26 273 in Port Royal
> 
> I think that mb z490 old pic-e bottleneck brake


Weird. Is there a reason clocks so close together can have such a big variance in score? This user has lower clocks than me but over 1000 more points in Port Royal.


----------



## MrB123

motivman said:


> I thought I was the only one, lol...


Something is not 100% with my Giga card. Is it really as simple as a badly assembled card, or is it something worse?
It has the sticker on the screws so I can't open it (I may return it to the store). I'm really not happy about the temps.
Or is it like this with all 600W cards?

I have my card on my desk without a case and the temps are still way higher than on my other 4090 (Galax/KFA2 500W).


----------



## alasdairvfr

foresttree1 said:


> Weird. Is there a reason clock so close can have such big variance in score? Like this user has lower clocks than me but over 1000 more points Port Royal


There are a number of reasons; forcing reBAR can make a 300-500 point difference. System memory latency and speed/timings also play a factor here.

I had an issue where my PR score with similar clocks (core over 3000 and memory over 1500) was barely touching 27000; now it's 28300 or so. Steps included buying a new motherboard with PCIe 4.0, which makes a difference, so check your PCIe version and number of lanes. You want to be at 4.0 x16 (I did an X370-to-X570 upgrade). My SSD was also small/full, so I did a clean W10 install (not to get a better bench, just overdue), redid my PBO (you are on Intel, so the equivalent CPU settings), redid my memory tune from scratch, and made sure I don't have any bloat in my startup.

Incidentally I remounted my AIO to the CPU with fresh paste, so possibly eked out a sliver of performance there too.

Doing all this was worth about 1300 points in PR for me. My CPU/memory are performing better than before as well. So while PR is a GPU bench more than a CPU/memory one, a less tuned system could be costing you a few %, along with all the other things I mentioned (bloat, PCIe issues, cooling of non-GPU components).
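A quick way to rule out the PCIe part of that checklist is to query the live link state with nvidia-smi and compare it against the 4.0 x16 target. The helper below is an illustrative sketch; the sample strings stand in for real output of `nvidia-smi --query-gpu=pcie.link.gen.current,pcie.link.width.current --format=csv,noheader`.

```python
# Sanity-check the PCIe link a 4090 is actually training at.
# The sample strings mimic the output of:
#   nvidia-smi --query-gpu=pcie.link.gen.current,pcie.link.width.current --format=csv,noheader
def link_ok(csv_line: str, want_gen: int = 4, want_width: int = 16) -> bool:
    """True if the reported link meets the target generation and width."""
    gen, width = (int(field) for field in csv_line.split(","))
    return gen >= want_gen and width >= want_width

print(link_ok("4, 16"))  # True: full 4.0 x16, nothing lost here
print(link_ok("3, 16"))  # False: PCIe 3.0 board, worth a few hundred PR points
print(link_ok("4, 8"))   # False: link trained at x8, check the slot/riser
```

One caveat: the reported link gen drops at idle for power saving, so run the query while the GPU is under load.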


----------



## alasdairvfr

MrB123 said:


> something is not 100% with my Giga card . Is it realy that easy that its just a bad assembled card or ... is it somthing worse
> It have the sticker on the screws so i cant open it (i may return it to the store). im really not to happy about the temps
> or is it like this with all 600W cards?
> 
> i have my card on my desk with out a case and the temps is still way higher than my other 4090 (glaxa/KFA2 500w)


2 or 3 other users have reported poor temps on the Giga OC card. The general consensus is that it's a very good product with occasional QC concerns. If you don't want to remount it yourself over warranty concerns, I think with a swap for another Giga OC you likely won't see this issue again. User J7SC posted a photo of his where the thermal pads were totally off the mark; same high hotspot issue.

My Giga OC is fine (for now): hotspot in the mid 80s under torture tests (FurMark/Kombustor), but under normal full load it stays in the high 70s.


----------



## foresttree1

alasdairvfr said:


> There are a number of reasons, forcing reBAR can be 3-500 points difference. Also system memory latency and speed/timings will play a factor here.
> 
> I had an issue where my PR score with similar clocks (core over 3000 and memory over 1500) was almost not even touching 27000, now it's 28300 or so. Steps included buying a new motherboard with PCIe4 which makes a difference so check your pcie version and number of lanes. You want to be 4.0x16 (I did x370-x570 upgrade), ssd (not to get a better bench but my ssd was small/full) so clean W10 install, redid my PBO (you are on intel so CPU settings), redid my memory from scratch, ensured I don't have any bloat/bullshit in my startup.
> 
> Incidentally I obviously remounted my AIO to the cpu with some fresh paste so possibly eeked a sliver of performance there too.
> 
> Doing all this was about 1300 points in PR for me. My cpu/memory are performing better than before as well. So while PR is a GPU bench more than a CPU/memory one, having a less tuned system could be costing you a few %, along with all the other things I mentioned (bloat, PCIe issues, cooling of non-GPU components)


Hmmm, very interesting information. I have a Z790 board so it's PCIe 4.0 ready, but I have not actually done any memory or CPU tuning yet as I am still waiting for my contact frame to arrive. Seems like an interesting experiment, as I was under the impression that CPU and memory did not matter in Port Royal. Thanks for the info. I will try what you suggested and see what comes out of it.


----------



## Jay-G30

rahkmae said:


> Better when my score Asus strix and ekw I scored 26 273 in Port Royal
> 
> I think that mb z490 old pic-e bottleneck brake


Enable XMP and run again; your RAM is reporting 2100 MHz, which is low...

Z490 is not the issue here.


----------



## WayWayUp

yzonker said:


> That's interesting. The e-cores are still hurting the Intel scores, just like the 1080p FS, but not as severe. I ran this with all e-cores disabled which is much closer for the graphics score.
> 
> All e-cores disable:
> 
> View attachment 2586204
> 
> 
> All e-cores,
> 
> View attachment 2586205


I was wondering why my graphics score was lower while I'm running clocks as high as or higher than the people above me on the leaderboards.

I wonder what's the best combo of e-cores for this benchmark.

But then I see my average clock is only 3102 and yours is 3121, so I guess I need more stability, and that will come when I move my air card to water.

So I'm not gonna overthink the e-cores thing.


----------



## yzonker

WayWayUp said:


> i was wondering why my graphics score was lower while im running clocks as high or higher than ppl above me in the leaderboards
> 
> i wonder whats the best combo of ecores for this benchmark
> 
> either way i expect a jump once i move from my air card to water
> 
> View attachment 2586240
> 
> 
> 
> but then i see my average clock is only 3102 and yours is 3121. so i guess i need more stability and that will come with water block


Probably what I ran, 13 e-cores. But the only way to know for sure is to step through them one by one, which is what I did for the original FS runs. Problem is, the benchmark becomes less consistent as the number of e-cores increases, so I get quite a lot of run-to-run variation depending on the e-core configuration.

You can see what number of e-cores people ran in the CPU section.

Edit: actually might be 9.









I scored 28 647 in Fire Strike Ultra
Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com


----------



## WayWayUp

Don’t trust slinky’s scores.
Anyway, take a look at the #4 score.

It’s not just that his graphics score is higher…
It’s WAY higher.
He is getting an LN2 graphics result using ordinary clock speeds.


----------



## Jay-G30

WayWayUp said:


> Don’t trust slinky’s scores
> Anyways take a look at the #4 score
> 
> View attachment 2586244
> 
> It’s not just that his graphics score is higher…
> It’s WAY higher
> He is getting a LN2 graphics result using ordinary clock speeds


Artifacting? It's not just Port Royal it happens in; I have seen it on many, many benchmarks, so I'd imagine Fire Strike is no different. I have seen it happen in Time Spy Extreme on graphics test 1 and boost the score at the end. It is pretty easy to tell which runs are genuine and which aren't by checking the fps numbers and comparing to a similar setup.


----------



## WayWayUp

I was thinking the same, since his combined score is so normal.
It’s really dumb, and it’s why I don’t take 3DMark scores seriously this generation.

I already ditched Port Royal; that’s the only reason I’m doing Fire Strike. I was watching live streams of LN2 benchmarking with people posting their best scores at only 29k...

Meanwhile I have a borked score higher than that lol.
I will use the scores just as a reference, but getting a top spot on the leaderboards used to mean something. It was more satisfying.
Now you’re competing with bugged-out, heavily boosted scores.


----------



## 8472

What's the best benchmark/game for testing vram oc stability?


----------



## SilenMar

MrB123 said:


> something is not 100% with my Giga card . Is it realy that easy that its just a bad assembled card or ... is it somthing worse
> It have the sticker on the screws so i cant open it (i may return it to the store). im really not to happy about the temps
> or is it like this with all 600W cards?
> 
> i have my card on my desk with out a case and the temps is still way higher than my other 4090 (glaxa/KFA2 500w)


Removing the sticker won't void the warranty. You can even carefully peel it off with tweezers and then stick it back on.
But the thermal pads are probably brittle enough that you have to replace them after every disassembly. Thermal pad thicknesses come in 0.25 mm increments, so that's inconvenient.
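Since replacement pads only come in 0.25 mm steps, picking sizes for a repad comes down to rounding each measured gap. A tiny illustrative helper (the example gaps are made up, not measured from any 4090):

```python
# Snap a measured gap to the nearest available thermal pad thickness.
# Pads are typically sold in 0.25 mm increments (0.5, 0.75, 1.0, ...).
def nearest_pad(gap_mm: float, step_mm: float = 0.25) -> float:
    """Round a measured gap to the closest stocked pad thickness."""
    return round(gap_mm / step_mm) * step_mm

print(nearest_pad(1.1))  # 1.0 mm pad
print(nearest_pad(1.9))  # 2.0 mm pad
print(nearest_pad(0.6))  # 0.5 mm pad
```

In practice many people size up rather than down when a gap lands between two thicknesses, since slight compression beats an air gap.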


----------



## Frosted racquet

8472 said:


> What's the best benchmark/game for testing vram oc stability?











GitHub - GpuZelenograd/memtest_vulkan: Vulkan compute tool for testing video memory stability
github.com


----------



## yzonker

WayWayUp said:


> I was thinking the same since his combined score is so normal
> It’s really dumb and why I don’t take 3dmark score’s seriously this generation
> 
> I already ditched port royal that’s the only reason I’m doing fire strike, I was watching live streams of ln2 benchmarking with ppl posting their best scores only at 29k..
> 
> meanwhile I have a borked score higher than that lol
> I will use the scores as just reference, but getting a top spot on the leaderboards used to mean something. It was more satisfying
> Now your competing both bugged out heavily boosted scores


Yea some of those runs are artifacted, but @J7SC showed that the graphics score can be quite a lot higher if it isn't being nerfed by some threads apparently running on Intel e-cores. It is primarily GT1 that is being affected, maybe because it runs at significantly higher framerates.

That's why I made that run this morning with e-cores disabled which was a 700pt increase over my previous run with all 16 e-cores enabled. 

So with FSU it's tricky to really determine the validity of someone's graphics score. It appears 28-29k graphics is definitely doable on the right platform and/or configuration.


----------



## WayWayUp

Look at graphics test 1: he has *194.56 fps*.

I just want to point out for reference that OGS, the #1 score in the world, running a custom-modified, volt-modded, power-unlocked Galax HOF card on LN2 @ 3705 MHz with a 13900K @ 7.5 GHz, only has *167.39 fps* in graphics test 1.

I actually beat this dude in graphics test 2 as well as physics, but here he is sitting on the #4 score.

Something so blatantly obvious should be reported.


----------



## Sheyster

Big drop at Best Buy this morning, multiple models all sold out very quickly. I tried to grab an FE, got in "line", phone verified and nothing, out of stock. 

I have a friend who wants a card but does not OC it. I figure I'll sell him the FE or my GB OC depending on which is better OC-wise. I'll keep the better clocking card.


----------



## Sheyster

Betroz said:


> No I am not. I care about gaming performance only. I play mostly BF2042 on my LG 3840x1600 160 Hz monitor, and my old 3080Ti is not fast enough to push over 160 fps at all times.
> 
> I will probably go for the cheapest Asus TUF card - if I can get hold of one before the retailer fases it out. From what I can gather, they only want to sell the TUF OC version... shocker...


Yeah, it's crazy considering the TUF OC isn't binned. You're paying for marketing and a tiny core clock bump which is basically worthless. 

Buy the regular TUF and flash the Strix v2.1 BIOS I posted. This will get you a slightly higher default core bump and a 500w default power limit. If ASUS wanted to differentiate the two TUF cards a little more, they should have lowered the max power limit on the regular TUF. It also has a 600w max PL BIOS.


----------



## Jay-G30

WayWayUp said:


> View attachment 2586251
> 
> 
> look at graphics test 1. he has *194.56fps*
> 
> I just want to point out for reference...OGS, the #1 score in the world who is running a custom modified and volt moded power unlocked Galax HOF card with Ln2 @ 3705 MHZ and an 13900k @ 7.5Ghz only has *167.39 FPS* in Graphics test 1
> 
> i actually beat this dude in graphic test 2 as well as physics, but here he is sitting with the #4 score
> 
> something so blatantly obvious should be reported


100% a bugged score; no way in hell it is getting that much of a boost in GT1 only.

For reference, here is mine using almost identical clocks (mine a touch higher) on the core and memory, and the temp is identical at 42 deg...

The 13900K is not giving him a 42 fps jump in GT1 over a 10900K.









I scored 25 221 in Fire Strike Ultra
Intel Core i9-10900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
www.3dmark.com


----------



## J7SC

yzonker said:


> That's interesting. The e-cores are still hurting the Intel scores, just like the 1080p FS, but not as severe. I ran this with all e-cores disabled which is much closer for the graphics score.
> 
> All e-cores disable:
> 
> View attachment 2586204
> 
> 
> All e-cores,
> 
> View attachment 2586205


....you're still one ahead of me in TSE Graphics on the HoF, which btw is fine by me as I mostly chase my own previous best...still, I have to do something about the VRAM temps and have some ideas I shared earlier...problem is the build is 'complex' due to its work+play nature (below). Maybe I'll wait to mod the GPU VRAM cooling for when I get a new CPU (13900KS? Ryzen 7000X3D?) and pull it all apart again () . The 5950X is still doing OK for now in several benches, especially with the super-tight DDR4.


----------



## Krzych04650

Some more power scaling 350W vs 450W vs 630W. Same curve for all with 3000 MHz target and +1250 memory. Resolution 5120x2160 or higher, max settings, RT where possible, SGSSAA where possible.
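One way to read runs like these is to normalize each power limit against the 450W result and compute frames per watt. The numbers below are hypothetical placeholders, not the actual figures from the attachments; they only show the shape of the calculation:

```python
# Hypothetical avg-fps results at three power limits (made-up numbers,
# just to quantify the diminishing returns of power scaling).
results = {350: 92.0, 450: 100.0, 630: 103.0}  # watts -> avg fps

base_fps = results[450]
for watts, fps in sorted(results.items()):
    rel = 100 * fps / base_fps  # performance relative to the 450W run
    eff = fps / watts           # frames per watt
    print(f"{watts}W: {rel:5.1f}% of 450W perf, {eff:.3f} fps/W")
```

With numbers shaped like these, the 40% jump from 450W to 630W buys only a few percent of performance while efficiency drops by roughly a quarter, which matches the "small difference" takeaway in the replies.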


----------



## Sheyster

Krzych04650 said:


> Some more power scaling 350W vs 450W vs 630W. Same curve for all with 3000 MHz target and +1250 memory. Resolution 5120x2160 or higher, max settings, RT where possible, SGSSAA where possible.
> View attachment 2586298
> 
> View attachment 2586296
> 
> View attachment 2586287
> 
> View attachment 2586288
> 
> View attachment 2586289
> 
> View attachment 2586290
> View attachment 2586291
> View attachment 2586302
> View attachment 2586303
> View attachment 2586314
> 
> View attachment 2586305


Good stuff. I usually don't power limit but I am a big fan of undervolting. I have not attempted to undervolt my GB OC yet, but my 13700KF is undervolted with all cores locked.


----------



## GRABibus

For those who own a Gigabyte Gaming OC on the *stock air cooler* with the stock BIOS "95.02.18.00.C1": what are your two max fan speeds in GPU-Z when you set them to 100%?

Mine are ~3000 rpm for the first one and ~2600 rpm for the second:

Thank you for the feedback


----------



## alasdairvfr

GRABibus said:


> For those who own a Gigabyte Gaming OC on* stock air cooler* with stock Bios "95.02.18.00.C1" , what are your both max fans speed in GPU-Z when you set them at 100% ?
> 
> 
> Mines are ~3000rpm for the first one and ~2600rpm for the second :
> 
> View attachment 2586319
> 
> 
> Thank you for feedback


This is mine:


----------



## GRABibus

alasdairvfr said:


> This is mine:
> 
> View attachment 2586323
> 
> 
> View attachment 2586324


Thanks


----------



## MrB123

alasdairvfr said:


> 2 or 3 other users have reported having poor temps on the Giga OC card. The general consensus is they are a very good product with occasional QC concerns. If you don't want to remount for warranty concerns, I think a swap out for another Giga OC you won't likely have this issue. User J7SC posted a photo of his where thermal pads were totally off the mark. Same high hotspot issue.
> 
> My Giga OC is perfect (for now) hotspot in the mid 80s under torture test (Furmark/Kombuster) but under normal full load stays in the high 70s.


I just checked the screws in the center of the GPU; one of them I could turn almost a full turn, and two others were also a bit loose. Really sloppy of Gigabyte not to tighten the screws. The last screw has the sticker on it, so I will check it later once I can remove the sticker.
Now the hotspot is in the mid 80s C instead of the mid 90s C 




SilenMar said:


> Removing the sticker won't void the warranty. You can even carefully peel it off completely with tweezers then stick it back.
> But the thermal pads are probably brittle enough that you have to replace them after every disassembly. The thickness of the thermal pads have 0.25mm increments so that's inconvenient.


Thanks for the tip


----------



## WayWayUp

anybody have delta data for bykski vs ekwb founders block?


----------



## Tideman

Alemancio said:


> Kinda odd my Gigabyte Gaming OC will *only take +500 on memory*. Any idea what it could be? A dud? The temps are low.


Actually kinda glad to find I'm not the only one. 

Mine can only do +700. Complete dud.

I'll take it over the horrible coil whine on my previous 4090 TUF though.


----------



## Nico67

Sheyster said:


> Yeah, it's crazy considering the TUF OC isn't binned. You're paying for marketing and a tiny core clock bump which is basically worthless.
> 
> Buy the regular TUF and flash the Strix v2.1 BIOS I posted. This will get you a slightly higher default core bump and a 500w default power limit. If ASUS wanted to differentiate the two TUF cards a little more, they should have lowered the max power limit on the regular TUF. It also has a 600w max PL BIOS.


I think the TUF non-OC was more of an MSRP marketing ploy. We likely won't see many until demand drops back to normal, as they will just put them all in TUF OC boxes and make extra $$$.


----------



## Sheyster

Tideman said:


> Actually kinda glad to find I'm not the only one.
> 
> Mine can only do +700. Complete dud.
> 
> I'll take it over the horrible coil whine on my previous 4090 TUF though.


I thought I was unlucky with a +1400 limit, I guess I should feel lucky it isn't worse. VRAM is a lottery too I guess.


----------



## Panchovix

Nico67 said:


> I think the TUF non OC was more of a MSRP marketing ploy. Likely won't see many until demand drops back to normal as they will just put them all in TUF OC boxes and make extra $$$.





Sheyster said:


> I thought I was unlucky with a +1400 limit, I guess I should feel lucky it isn't worse. VRAM is a lottery too I guess.


I got my TUF non-OC at MSRP, but damn, it is bad: 3030 MHz core clocks in games and just +1100 on VRAM. It probably isn't binned at all, and the coil whine is horrible. +1500 on VRAM is probably the average overclock though.




Krzych04650 said:


> Some more power scaling 350W vs 450W vs 630W. Same curve for all with 3000 MHz target and +1250 memory. Resolution 5120x2160 or higher, max settings, RT where possible, SGSSAA where possible.


Damn nice comparison, the difference between 450W and 630W is so small, even at that higher resolution.


----------



## coelacanth

I just installed a new Zotac RTX 4090 Trinity OC with the latest driver (527.37). I am getting a lot of stuttering in all the games I've tried so far: Horizon Zero Dawn, Cyberpunk 2077, Far Cry 6, and Assassin's Creed Odyssey. I have an LG CX OLED and enabled G-Sync in the driver and the secret menu on the TV says G-Sync is on. A quick google search seems like a lot of people are having problems with stuttering with 4090s and LG OLEDs. The 4090 replaced a 3080 Ti where everything was buttery smooth, so the problem seems to be with the 4090. Everything else in my system is the same as before.

Anyone else experience this or have a fix?


----------



## lawson67

coelacanth said:


> I just installed a new Zotac RTX 4090 Trinity OC with the latest driver (527.37). I am getting a lot of stuttering in all the games I've tried so far: Horizon Zero Dawn, Cyberpunk 2077, Far Cry 6, and Assassin's Creed Odyssey. I have an LG CX OLED and enabled G-Sync in the driver and the secret menu on the TV says G-Sync is on. A quick google search seems like a lot of people are having problems with stuttering with 4090s and LG OLEDs. The 4090 replaced a 3080 Ti where everything was buttery smooth, so the problem seems to be with the 4090. Everything else in my system is the same as before.
> 
> Anyone else experience this or have a fix?


Yes, I have the same stuttering with my LG CX OLED 65", and yes, it seems worse on the new drivers. The stuttering comes and goes: it's really smooth, then it randomly comes back again. I just updated to the new drivers; I might DDU them and do a fresh install. I have G-Sync on with Vertical Sync set to Fast, and I limited frames to 118 and tried 117. Frustrating.


----------



## coelacanth

lawson67 said:


> Yes i have the same stuttering with my LG CX Oled 65" , and yes seem worse on the new drivers, stuttering seems to come then go and its really smooth then it randomly comes back again, i just updated the new drivers i might DDU them and and do a fresh install, i have GSync on with Vertical Sync set to fast, limited frames to 118 and tried 117, frustrating


I did DDU and fresh install when I put the card in and it's stutter city. I'll try some of the other things you mentioned. I checked the TV firmware and it's up to date.


----------



## Sheyster

lawson67 said:


> i have GSync on with *Vertical Sync set to fast*


I was confused about the best setting for G-Sync + V-Sync so I went directly to jorimt of Blurbusters.com. He said to set V-Sync ON, not FAST. More latency is introduced with it set to FAST apparently. He's the authority on this stuff.


----------



## lawson67

Sheyster said:


> I was confused about the best setting for G-Sync + V-Sync so I went directly to jorimt of Blurbusters.com. He said to set V-Sync ON, not FAST. More latency is introduced with it set to FAST apparently. He's the authority on this stuff.


Strange, because the Nvidia control panel says V-Sync On will give worse latency than setting it to Fast.


----------



## Sheyster

lawson67 said:


> Strange cos Nvidia control panel says V-sync on will give worse latency then set to fast
> 
> View attachment 2586382











G-SYNC 101: G-SYNC vs. Fast Sync | Blur Busters
blurbusters.com

Towards the bottom there is a graph where he explains why not to use FAST (which he refers to as "Fast Sync"; I verified that with him).


----------



## ttnuagmada

I'm using a C1 OLED and I've had no stuttering problems, using G-Sync + V-Sync with a 117 fps cap. I'm hypersensitive to stutter/microstutter, so I'm positive I'd notice if something was up.


----------



## Nd4spdvn

Works great here too with a CX OLED, with no stuttering. G-Sync + V-Sync ON and 116fps cap. Was true also for when the initial 4090 drivers did not properly engage G-Sync and the TV stayed in VRR mode (early 4090 days).
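The 116/117 fps caps mentioned above aren't arbitrary: Blur Busters' G-SYNC 101 testing recommends capping at least ~3 fps below the display's max refresh so frametime spikes don't push you out of the VRR range into V-Sync behavior. As a minimal sketch of that rule of thumb:

```python
# Frame cap that keeps G-Sync engaged on a VRR display: stay a small
# margin under the panel's max refresh (per Blur Busters' guidance).
def gsync_cap(refresh_hz: int, margin_fps: int = 3) -> int:
    """Suggested in-driver fps limit for a given max refresh rate."""
    return refresh_hz - margin_fps

print(gsync_cap(120))     # 117, the cap used on the 120 Hz OLEDs above
print(gsync_cap(120, 4))  # 116, a slightly more conservative cap
print(gsync_cap(240))     # 237 for a 240 Hz monitor
```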


----------



## Betroz

Panchovix said:


> I got my TUF non-OC at MSRP, but damn it is bad, 3030Mhz core clocks in games and just +1100 on VRAM. Probably it isn't binned at all, also coil whine is horrible. +1500VRAM is probably the average overclock though.


3030 MHz core is not bad. Yes, +1100 on VRAM is maybe below average. If you want better than this, a Strix OC should theoretically clock better, but even then there is no guarantee of getting 3100+ on the core and +1700 on memory.


----------



## keikei

You guys think a 5900X is a sufficient chip for this card? Massive sale on amazon last weekend.


----------



## Betroz

keikei said:


> You guys think a 5900X is a sufficient chip for this card? Massive sale on amazon last weekend.


It depends on what games you are playing and at what resolution and in-game settings. At 1080p/1440p Low settings, then your CPU will be a bottleneck


----------



## xrb936

ianann said:


> What? Link or it doesn't happen!


I personally talked to the head manager of Asus's Mainland China division. It will be released in Q1 2023, hopefully.


----------



## keikei

Betroz said:


> It depends on what games you are playing and at what resolution and in-game settings. At 1080p/1440p Low settings, then your CPU will be a bottleneck


What about 4k?


----------



## MrTOOSHORT

keikei said:


> What about 4k?


Even a 13900K bottlenecks a 4090 at 4K. But you'll do OK. Still worth it.


----------



## Betroz

MrTOOSHORT said:


> even a 13900k bottlenecks a 4090 at 4k. But you’ll do ok. Still worth it.


With 1% and 0.1% lows maybe, but avg fps? Which game? CS:GO?


----------



## Benni231990

MrTOOSHORT said:


> even a 13900k bottlenecks a 4090 at 4k.


This is not true; at 4K you are absolutely at a GPU limit.


----------



## alasdairvfr

Benni231990 said:


> This is not true in 4k you are absolute in a GPU limit


Depends on the game. Try playing MS Flight in 4K; that's probably the best example. While I'd say the CPU isn't usually a huge bottleneck, going to a faster CPU will yield more frames in most cases. Even going from 13th-gen Intel to something faster, you will probably see frame increases in a lot of GPU-heavy games at 4K.


----------



## Sheyster

coelacanth said:


> I did DDU and fresh install when I put the card in and it's stutter city. I'll try some of the other things you mentioned. I checked the TV firmware and it's up to date.


Seems like it's working for some CX/C1 users. In any case I'm going to skip this driver. Everything gaming-wise works fine with the previous one.


----------



## WayWayUp

The best gains on 13th gen will come from tweaking memory.
That's the meta.
Something like 8000 CL36 with tweaked subtimings will be much faster than a 13900K with a plain-Jane 6000 DDR5 kit, even at 4K with 100% GPU utilization.


----------



## rahkmae

Is 60-61 a normal temperature in PR with an EK block and a Strix? I don't think it is; the backplate doesn't even get warm.


----------



## QuatroKiller

In the 2080TI Owner's Group OP post there is a TON of value add information beyond a list of every single card available. all the Q&A, undervolt/overvolt info, "which card to buy" things like that.

Will that type of write up join the 4090 OP post?

thanks (currently a 2080TI owner praying for a 4090 price cut in the next 6-9 months).


----------



## J7SC

alasdairvfr said:


> Depends on the game. Trky playing MS Flight in 4k. That's probably the best example; while I'd say the CPU isn't usually a huge bottleneck, going to a faster CPU will yield more frames in most cases. Even going from 13th gen intel to faster you will probably see frame increases in a lot of [email protected]% games at 4k


FS 2020 4K is my fav app and it typically gets well over half my game time budget in a given month. FS2020 used to be horrible re. CPU optimization (it would hammer one thread almost continuously at near 100%) but a gazillion patches later, it started to behave better. My 5950X still regularly maxes at 5050 to 5075 MHz on one core, but the latest patches combined with the improved DX12 beta for FS2020 and DLSS3 / Frame Insertion have literally transformed the whole thing !  ...At 4K max everything (including detail on 400/400) w/ DLSS3 and F.I. with DLSS on 'quality', it gets close to a consistent 120 fps even in trickier scenery such as flying low over a metro area.


----------



## motivman

rahkmae said:


> Is this a normal temperature 60-61 in PR with EKW and Strix? I think it not, the backplate don't even get warm?


might as well stick with air cooling with those temps...


----------



## alasdairvfr

J7SC said:


> FS 2020 4K is my fav app and it typically gets well over half my game time budget in a given month. FS2020 used to be horrible re. CPU optimization (it would hammer one thread almost continuously at near 100%) but a gazillion patches later, it started to behave better. My 5950X still regularly maxes at 5050 to 5075 MHz on one core, but the latest patches combined with the improved DX12 beta for FS2020 and DLSS3 / Frame Insertion have literally transformed the whole thing !  ...At 4K max everything (including detail on 400/400) w/ DLSS3 and F.I. with DLSS on 'quality', it gets close to a consistent 120 fps even in trickier scenery such as flying low over a metro area.


I found, though, that while my performance went from basically unplayable at 4K high settings to a VERY decent 100-120 fps like you say, when I flew around Manhattan yesterday I was at around 60 fps doing low-altitude flying around the buildings. This is still really good considering the insane number of objects in the area; have you tried it? Normally I fly around home or places I'm familiar with from my pilot days and get very good frames, but Manhattan really is a compute hog for me.

Either way I went from almost never flying to flying a few times a week as a really great way to unwind because of the amazing work they have done optimizing this thing.


----------



## mirkendargen

rahkmae said:


> Is this a normal temperature 60-61 in PR with EKW and Strix? I think it not, the backplate don't even get warm?


No one can tell you without knowing your coolant temp.


----------



## coelacanth


Nd4spdvn said:


> Works great here too with a CX OLED, with no stuttering. G-Sync + V-Sync ON and 116fps cap. Was true also for when the initial 4090 drivers did not properly engage G-Sync and the TV stayed in VRR mode (early 4090 days).


Tried V-Sync On and V-Sync Fast; V-Sync On seems to work a bit better, but there are still a lot of stutters. I'll try frame capping next, but the stutters are happening below 120 FPS anyway, so I'm not hopeful.

I usually don't buy brand-new hardware because I don't want to be a beta tester. I bought new hardware and now I'm a beta tester. At this point, for me the 3080 Ti is a much better gaming experience than the 4090. I don't have time to search the web trying to fix what Nvidia broke and test all sorts of things that may or may not help.


----------



## GraphicsWhore

rahkmae said:


> Is this a normal temperature 60-61 in PR with EKW and Strix? I think it not, the backplate don't even get warm?


No. Something is def wrong there. Perhaps poor contact?


----------



## LtMatt

Betroz said:


> With 1% and 0.1% lows maybe, but avg fps? Which game? CS:GO?


The Callisto Protocol at 4K Ultra settings with RT enabled is demanding on the CPU with a 4090.
The Callisto Protocol Benchmark | 7950X + 4090 GameRock | 2160P Ultra Settings - YouTube


----------



## J7SC

alasdairvfr said:


> I found though while my performance went from basically unplayable 4k high settings to a VERY decent 100-120 fps like you say - I flew around Manhattan yesterday and was at around 60 fps doing low altitude flying around the buildings. This is still really good considering the insane amount of objects in the area, have you tried this? Normally I fly around home or places that I'm familiar with from my pilot days and get very good frames, but Manhattan really is a compute hog for me.
> 
> Either way I went from almost never flying to flying a few times a week as a really great way to unwind because of the amazing work they have done optimizing this thing.


Flying too low is my middle name  ...if I had a pilot license, I would have lost it by now...
I put some screenies (including NY) in the spoiler...the last screenie btw (120 fps via RTX 3090 going into space) obviously has less ground detail. A couple of quick points: before the 4090 and the DLSS3 / Frame Insertion patches for FS2020, the built-in FS2020 fps counter and the LG C1 fps counter were basically matching (ie. w/RTX 3090), but not anymore. I haven't downloaded the updated FS2020 SDK yet, so I don't know if that is fixed, but the older one doesn't seem to pick up the F.I. I have seen 120 fps at 4K max detail flying low in Manhattan w/DLSS3 & F.I.

Also, I run a huge (200 GB) local rolling cache which helps a lot, as does a 1 Gbps up/down connection. Finally, I get better results with Resizable BAR forced on in FS2020. AFAIK it is the CPU that does a lot of the ground detail calculations, but from what I've read, DLSS3 / F.I. offloads a big chunk of that to the RTX 40-series GPU. Compared to the original FS2020, which was a bit of a mess re. optimizations, it has come a long way, even with the DX12 beta at 4K that I now play exclusively.



Spoiler


----------



## Krzych04650

LOL. This is just a regular 3840x1600 with no downsampling, but still


----------



## Shocchiz

I jumped on the 4090 wagon and I'm really happy with it.
I found a Gainward/Palit 4090, so not the best VRMs.
I was wondering: what is the highest safe BIOS power limit the card can handle? I guess 600W is too much, right?
What will happen if I flash a BIOS with too high a power limit? Could I damage the PCB?
Thanks for sharing any useful info, I'm worried about frying the card...


----------



## yzonker

J7SC said:


> Flying-too- low is my middle-name  ...if I would have a pilot license, I would have lost it by now...
> I put some screenies (including NY) in the spoiler...the last screenie btw (120 fps via RTX 3090 going into space), obviously has less ground detail). A couple of quick points. Before the 4090 and DLSS3 / Frame Insertion patches for FS2020, the built-in fps counter at FS2020 and the LG C1 fps counter were basically matching (ie. w/RTX 3090)- but not anymore. I haven't downloaded the updated FS2020 SDK yet so don't know if that is fixed, but the older one doesn't seem to pick up the F.I. I have seen 120 fps on 4K max detail flying low in Manhattan w/DLSS3&F.I.
> 
> Also, I run a huge (200 GB) local rolling cache which helps a lot, as does a 1 Gbps up/down connection. Finally, I get better results with resizable_BAR forced on in FS2020. AFAIK, it is the CPU that does a lot of the ground detail calculations, but from what I've read, DLSS3 / F.I. offloads a big chunk of that to the RTX4K GPU. Compared to the original FS2020 which was a bit of a mess re. optimizations, it has come a long way, even with the DX12 beta 4K I now play exclusively.
> 
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2586467
> 
> View attachment 2586468
> 
> View attachment 2586469
> 
> View attachment 2586470
> 
> View attachment 2586471
> 
> View attachment 2586472
> 
> View attachment 2586473
> View attachment 2586474
> 
> View attachment 2586476
> 
> View attachment 2586475


I guess you don't do any VR? Last time I tried it on my 3090 it was still a hot mess. Framerates were a lot better but still juddery. I need to fire it up on my current rig though. I've upgraded the CPU and GPU since last I tested it.

And is that a drone in your pics? Is that built in or an add on? I bought a couple of additional planes at one point.


----------



## J7SC

yzonker said:


> I guess you don't do any VR? Last time I tried it on my 3090 it was still a hot mess. Framerates were a lot better but still juddery. I need to fire it up on my current rig though. I've upgraded the CPU and GPU since last I tested it.
> 
> And is that a drone in your pics? Is that built in or an add on? I bought a couple of additional planes at one point.


...no VR, it tends to make me feel a bit queasy - besides, I sit really close to that 48 inch OLED...the way I tend to 'fly', that's all the immersion I can take 

The 'Volocopter' is sort of an electric helicopter (two seats) in real life produced in Germany for air-taxi fleet orders...in FS2020, it has been around since about two patches ago. Note though that even on unlimited fuel settings, it will eventually run out of juice in FS2020. The Volocopter is perfect for exploring all the nooks and crannies - be that in a cityscape at ground level, or a mountain range.


----------



## yzonker

J7SC said:


> ...no VR, it tends to make me feel a bit queasy - besides, I sit really close to that 48 inch OLED...the way I tend to 'fly', that's all the immersion I can take
> 
> The 'Volocopter' is sort of an electric helicopter (two seats) in real life produced in Germany for air-taxi fleet orders...in FS2020, it has been around since about two patches ago. Note though that even on unlimited fuel settings, it will eventually run out of juice in FS2020. The Volocopter is perfect for exploring all the nooks and crannies - be that in a cityscape at ground level, or a mountain range.


Yea the big monitor gets it closer to VR immersion but still really isn't close. I don't get sick, but I do find the comfort level to be a problem for longer sessions. Really only want an hour or 2 at most per session. 

I'll have to get it updated this weekend and look for the Volocopter. It would be great for exploring, which is what I do mostly anyway. Like @alasdairvfr mentioned, I have a tendency to fly places I've been to, etc... to check out the view from the sky and see how well the game renders the area.


----------



## Betroz

LtMatt said:


> The Callisto Protocol, 4K Ultra Settings with RT enabled with is demanding on the CPU with a 4090.
> The Callisto Protocol Benchmark | 7950X + 4090 GameRock | 2160P Ultra Settings - YouTube


Or just a poorly optimized game. From what I saw in the clip, that should be GPU limited.


----------



## yzonker

Betroz said:


> Or just a poorly optimized game. From what I saw in the clip, that should be GPU limited.


Yea there's a few negative comments. 









The Callisto Protocol™ on Steam


Survive to escape the horrors of Callisto and uncover the dark secrets of Jupiter’s dead moon.




store.steampowered.com


----------



## J7SC

yzonker said:


> Yea the big monitor gets it closer to VR immersion but still really isn't close. I don't get sick, but I do find the comfort level to be a problem for longer sessions. Really only want an hour or 2 at most per session.
> 
> I'll have to get it updated this weekend and look for the Volocopter. It would be great for exploring, which is what I do mostly anyway. Like @alasdairvfr mentioned, I have a tendency to fly places I've been to, etc... to check out the view from the sky and see how well the game renders the area.


Not sure if it makes a difference, but I have the 'Premium Deluxe' version of FS2020 which may/may not have some extra aircraft included. The latest patch also added a regular small helicopter that I haven't even tried yet, but it could also be good to explore local nooks and crannies...I also tend to gravitate towards places I know and have been to (below on a flight back to my home on a route from the East I know well). Still, going to new places is also a lot of fun, especially with a 4090 setup.


----------



## tryout1

Panchovix said:


> I got my TUF non-OC at MSRP, but damn it is bad, 3030Mhz core clocks in games and just +1100 on VRAM. Probably it isn't binned at all, also coil whine is horrible. +1500VRAM is probably the average overclock though.
> 
> 
> 
> Damn nice comparison, the difference between 450W and 630W is so small, even at that higher resolution.


Yes, I know how you feel, but you lose some and you win some - my core still maxes out at 2900MHz at 1.1V, but the VRAM does better. Don't forget though, it's still a 4090, and it's probably just a 1-2 fps difference between good settings and max OC.


----------



## keikei

LtMatt said:


> The Callisto Protocol, 4K Ultra Settings with RT enabled with is demanding on the CPU with a 4090.
> The Callisto Protocol Benchmark | 7950X + 4090 GameRock | 2160P Ultra Settings - YouTube


I suspect the devs will further work on the game, if they ever want to get out of that black hole steam review (mostly negative). I still do want to play the game.


----------



## yzonker

keikei said:


> I suspect the devs will further work on the game, if they ever want to get out of that black hole steam review (mostly negative). I still do want to play the game.


It's like so many games these days. Wait six months and then play it.


----------



## mouacyk

yzonker said:


> It's like so many games these days. Wait six months and then play it.


Free 25% fps from -drm


----------



## J7SC

yzonker said:


> It's like so many games these days. Wait six months and then play it.


After all the patches for FS2020, I wanted to play some Cyberpunk 2077 (hadn't played for about a week)...but first things first...an *8.3 GB* patch🥴 ...works great though:


----------



## chispy

Anyone else having problems with pre-order water blocks from the EK shop?
I pre-ordered a water block for my MSI Gaming 4090 over a month ago at the EK shop. When I placed the order it said "Estimated delivery date early November," and I thought ok, fine, I can wait a little. Then the estimate changed to shipping late November, and I thought hmmm... this is going to take a while! Then it changed to "estimated early December," then to late December. I checked today and it now says early January - nope, heck no, I'm not waiting months for this overpriced water block from EK. My already-paid pre-order now shows an estimated delivery date of January 12, 2023  . What gives???

I opened a ticket with customer service about a week ago asking when my block would ship, and yep, you guessed correctly: I have not received an answer back from EK support. Nada, zero, zilch. I canceled the pre-order ASAP - the most important things I expect from a company are customer support and communication, and EK failed. This company goes on my blacklist from now on; EK keeps going downhill as time goes by. Zero customer support and their jacked-up prices have done it for me: almost $400 US shipped for their block for the 4090 Gaming Trio. **Do not trust EK pre-order dates.**

Lucky me, I found a Bykski block for the MSI 4090 Gaming on eBay, really cheap at $139, minus a 10% eBay coupon (-$13.90), plus $25 for expedited shipping and tax; total $167.36 shipped from China, with a firm arrival date of December 16, and it has already shipped. I have had great experience with Bykski water blocks - I had one for my RTX 3070 and one for my RX 6900, and they performed great, on par with or even better than EK blocks in temps and quality. From now on I'll stick with Bykski  for the price/performance ratio and quality, plus they have great US-based customer support that I've used before.


** If anyone needs a water block for the MSI RTX 4090 Gaming Trio, Gaming X Trio, or Suprim - they all share the same PCB - this Bykski block (or the one from Barrow) will fit, and it's in stock and ready to ship on eBay. Links below:









Bykski GPU Water Block for MSI GeForce RTX4090 Suprim / GAMING X TRIO 24G | eBay


Find many great new & used options and get the best deals for Bykski GPU Water Block for MSI GeForce RTX4090 Suprim / GAMING X TRIO 24G at the best online prices at eBay! Free shipping for many products!



www.ebay.com













Barrow GPU Water Cooling Block for MSI RTX 4090 Suprim X 24G/GAMING X TRIO 24G | eBay


Find many great new & used options and get the best deals for Barrow GPU Water Cooling Block for MSI RTX 4090 Suprim X 24G/GAMING X TRIO 24G at the best online prices at eBay! Free shipping for many products!



www.ebay.com


----------



## B1gD4ddy

my 450w inno3d x3 oc is utter trash.
only does +200/+1000.


----------



## Betroz

B1gD4ddy said:


> my 450w inno3d x3 oc is utter trash.
> only does +200/+1000.


.....but the performance is still great yeah?  
Not all cards will do crazy high clocks, especially a 450W locked one.


----------



## bigfootnz

B1gD4ddy said:


> my 450w inno3d x3 oc is utter trash.
> only does +200/+1000.


I don't think there is such a thing as a trash 4090. My PNY 4090 XLR8 is just the same as yours, and if you are not chasing 3DMark top scores you will barely notice the difference between a 450W and a 600W 4090.

For example, my 4090 runs BF5 at [email protected] (with some settings dialed back just to have the best visibility for multiplayer) at just 115W. Most of the games I've tried didn't manage to draw even close to 400W. At 4K I'll probably be closer to 450W, but even that should be enough.


----------



## a_Criminai

hmatt1981 said:


> This is going to be a stupid question no doubt.
> 
> But am I able to install the Zotac Trinitiy OC bios on a normal Trinity card, as they both look the same with one just having a boost.


Yes. You can also install the Zotac Extreme bios.


----------



## Krzych04650

Done some more power scaling testing, I have 30 games in total now, so I think that is enough. I will put the screenshots in the spoiler because 30 is a lot of screenshots. Rules are the same as before +130/+1250 1100mV overclock with 3000 MHz target, 350W vs 450W vs 630W on Neptune BIOS. Resolution from 5120x2160 21:9 up to 8K 21:9 depending on the game, RT/DLSS where possible, MSAA/SGSSAA where possible, so generally it doesn't get any harder than this.


Spoiler: 350W vs 450W vs 630W



I've also made a summary in Excel with an efficiency rating and the average power draw on the 630W BIOS, since it doesn't actually draw that much and varies a lot from game to game - the true average is only 499W.









Looks like 450W is almost entirely sufficient even for OC. This could easily have been a 350W card at stock and avoided all the power draw drama. I find it really hilarious that the card that was rumored to be 600W or even 800W, and came with comically gigantic coolers because of that, is actually the pinnacle of power efficiency to the point of ridiculousness - it is literally twice as efficient as Ampere, even slightly better than what Pascal did.

EDIT
Added Quake RTX and fixed some mistakes in the table
Added Path of Exile
UPDATE for 666W Galax BIOS here
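For anyone wanting to sanity-check the summary table, the relative-performance and efficiency numbers can be reproduced along these lines. The figures below are placeholders for illustration, not the actual measured results:

```python
# Toy recreation of the power-scaling summary. The FPS numbers here are
# made up, NOT the measured results from the table above: average FPS at
# each power limit, normalized to the 630W BIOS run, plus a simple
# perf-per-watt efficiency rating.

# avg_fps[game][power_limit_watts] -> measured average FPS (hypothetical)
avg_fps = {
    "LOTRO":        {350: 83.0, 450: 95.0, 630: 100.0},
    "Quake II RTX": {350: 92.5, 450: 98.5, 630: 100.0},
    "Cyberpunk":    {350: 96.0, 450: 99.5, 630: 100.0},
}

def relative_perf(game, watts, baseline=630):
    """FPS at `watts` as a fraction of the baseline power-limit run."""
    return avg_fps[game][watts] / avg_fps[game][baseline]

def perf_per_watt(game, watts, baseline=630):
    """Efficiency rating: relative performance divided by relative power."""
    return relative_perf(game, watts, baseline) / (watts / baseline)

for game in avg_fps:
    print(f"{game}: 350W keeps {relative_perf(game, 350):.1%} of 630W perf, "
          f"{perf_per_watt(game, 350):.2f}x the efficiency")
```

Note that dividing by the nominal power limit (as above) slightly understates efficiency for the 630W BIOS, since the actual average draw was closer to 499W.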


----------



## Shocchiz

Krzych04650 said:


> Done some more power scaling testing, I have 28 games in total now, so I think that is enough. I will put the screenshots in the spoiler because 28 is a lot of screenshots. Rules are the same as before +130/+1250 1100mV overclock with 3000 MHz target, 350W vs 450W vs 630W on Neptune BIOS. Resolution from 5120x2160 21:9 up to 8K 21:9 depending on the game, RT/DLSS where possible, MSAA/SGSSAA where possible, so generally it doesn't get any harder than this


Legendary post, thanks *a lot*.

I suggest trying Quake 2 RTX too; it's the most power-hungry game I've found. _Maybe_ a 600W power limit will show an improvement.


----------



## Riadon

I wonder why LOTR of all games has such a high power draw


----------



## Krzych04650

Riadon said:


> I wonder why LOTR of all games has such a high power draw


Yea, it is quite interesting. It is with 8xMSAA+4xSGSSAA, but so are NieR Automata and Mirror's Edge, and they are not nearly this heavy. This is the only game I've found so far that would actually need the full 630W to not power throttle. Unfortunately there is something else at play with those cards, because it is not possible to actually hit that 630W under any circumstances, and the power limit seems to change depending on the load - different games report the power limit at different wattages, anywhere from 510W to 570W, which I haven't seen before.

LOTRO is generally an interesting case. It runs absolutely horrendously - it is one of the worst-performing games in the whole world - but it scales with absolutely everything: GPU, CPU, and RAM. Anything you throw at it, it just eats. There are places where even a fully tuned 13900K drops into the low 50s, and you can absolutely murder GPUs with SGSSAA in dense wooded areas, so it has basically endless scaling, and any gains are meaningful since the baseline is still so low.

I am sure this is not the only MMO that works like this. If you were used to mainstream CPU reviews made in modern multithreaded games, concluding that the CPU doesn't matter because everything runs at hundreds of FPS anyway, you would fall out of your chair after seeing some of these. The fact that it was New World specifically that was burning 3090s is not a coincidence either - that game is completely off the hinges, with massive stutters, freezes, and spikes. You can even hear what it does to the card's power delivery by how the coil whine constantly stops and resumes; playing it feels like the whole thing is about to explode, you may as well run a power virus.



Shocchiz said:


> Legendary post, thanks *a lot*.
> 
> I suggest trying Quake 2 RTX too, it's the most power hungry game I found, _maybe _a 600W power limit will show an improvement.


Added to the list. It has the second biggest drop at 350W, down to 92.5%, so definitely power hungry.


----------



## MrTOOSHORT

Did some benching tonight/this morning...

Broke into the top ten in the Firestrikes:

NVIDIA GeForce RTX 4090 video card benchmark result - Intel Core i9-13900K Processor,EVGA Corp. Z690 DARK KINGPIN (3dmark.com)

NVIDIA GeForce RTX 4090 video card benchmark result - Intel Core i9-13900K Processor,EVGA Corp. Z690 DARK KINGPIN (3dmark.com)

NVIDIA GeForce RTX 4090 video card benchmark result - Intel Core i9-13900K Processor,EVGA Corp. Z690 DARK KINGPIN (3dmark.com)


----------



## tryout1

Ok, maybe I found something out. It seems my 4090 is still below average in terms of core OC, but I was just playing CP2077 for about an hour, stable (when my card is unstable it normally crashes within seconds), at 1.1V and 2970-2955MHz, at about 59°C with the fans pinned at 100%. It looks to me like my core is very temp sensitive: with stock fan settings, only an offset of +105 is stable at the same 1.1V, which is about ~2900MHz. Has anybody else noticed the same on their card?
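This pattern is consistent with GPU Boost dropping clock bins as the core warms. A rough toy model - the bin size, step, and threshold below are illustrative assumptions, since NVIDIA doesn't publish the exact tables - shows why the same voltage sustains a lower clock on stock fans:

```python
# Rough model of GPU Boost thermal binning. BIN_MHZ, STEP_C and BASE_C
# are assumed values for illustration only, not NVIDIA-documented numbers:
# the sustained clock at a given voltage steps down as the core heats up,
# so an offset that is stable at ~59C on 100% fans can be too aggressive
# once the core climbs into the 70s on a stock fan curve.

BIN_MHZ = 15      # assumed size of one boost bin
STEP_C = 5        # assumed degrees per lost bin
BASE_C = 55       # assumed temperature where binning starts

def effective_clock(target_mhz, temp_c):
    """Sustained clock after thermal binning at the given core temp."""
    bins_lost = max(0, (temp_c - BASE_C) // STEP_C)
    return target_mhz - bins_lost * BIN_MHZ

print(effective_clock(2970, 59))  # 2970 (no bins lost yet at water-like temps)
print(effective_clock(2970, 74))  # 2925 (three bins lost at air-cooler temps)
```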


----------



## WayWayUp

chispy said:


> Anyone else it's having problems with the pre-order water blocks from the shop at EK ?
> I pre-order one water block for my msi gaming 4090 over a month ago at the ek shop and when i did ordered the block it stated " Estimated Delivery date early November " and i said ok fine i can wait a little , then that estimated date changed to shipping late November and i said hmmm... this is going to take a while ! , again they changed the date to estimated early December " and it changed again to late december , i checked today and it now it says early January , i said to myself , nop , heck no i'm not waiting months for this overpriced water block from EK , and now on my already paid for pre-order under my account it says estimated delivery date January 12 , 2023  , what gives a .... ???


I almost bought from EKWB, but yesterday I saw that Performance-PCs had plenty in stock. They already shipped - I will get it on Tuesday. I always avoid buying direct from EKWB if I can.



B1gD4ddy said:


> my 450w inno3d x3 oc is utter trash.
> only does +200/+1000.


That's not the best memory OC, but honestly this thread will make you jaded. 
I thought I had just a slightly above average card when I first played around with it, but it turns out it's really good. I posted a top 25 Time Spy Extreme score the other day on air cooling.

What clocks do you get with +200? If it's over 3K you can't really call it a trash card.


----------



## Blameless

Riadon said:


> I wonder why LOTR of all games has such a high power draw


A load that is (usually completely coincidentally) balanced to the strengths of the architecture is going to be hot. IQ is usually an extremely poor predictor of power draw.

Hottest game I have ever seen on Ampere (which was a very shader heavy architecture) was Path of Exile with global illumination enabled, for example. Haven't seen that tested on Lovelace, yet.


----------



## Krzych04650

Blameless said:


> A load that is (usually completely coincidentally) balanced to the strengths of the architecture is going to be hot. IQ is usually an extremely poor predictor of power draw.
> 
> Hottest game I have ever seen on Ampere (which was a very shader heavy architecture) was Path of Exile with global illumination enabled, for example. Haven't seen that tested on Lovelace, yet.


Still is, it seems - second highest margins, right after LOTRO. Added to the list.









EDIT Galax 666W BIOS


----------



## alasdairvfr

yzonker said:


> Yea the big monitor gets it closer to VR immersion but still really isn't close. I don't get sick, but I do find the comfort level to be a problem for longer sessions. Really only want an hour or 2 at most per session.
> 
> I'll have to get it updated this weekend and look for the Volocoptor. It would be great for exploring which is what I do mostly anyway. Like @alasdairvfr mentioned, I have a tendency to fly places I've been to, etc... to check out the view from the sky and see how well the game renders the area.


I have a Quest 2 and I actually prefer playing on the monitor. I have also tried an HP Reverb, and even with that higher pixel density it's too blurry for my liking. I once tried one of the Varjo HMDs on a really tricked-out flight sim back in 2020 and that was pretty good, but it's also like 5x the price of a normal HMD.

I've visited some places I used to fly, and the generic buildings on the ground are pretty inaccurate, to say the least. Not that it really matters. Also, the default runway is wrong at a number of airports - not that it matters for the sim, but anyone familiar with the area would notice.


----------



## Morteen199

Hello, I have the Gigabyte 4090 Gaming OC, and I updated to the newest VBIOS (F2) from the Gigabyte site. After the update my card won't boost past 2685MHz; before, it boosted to 2760MHz+. Even when I raise the power target to 133% and the GPU temp target to 88°C, it still won't boost past 2685MHz. When I raise the GPU boost clock it goes higher, but stays static. I reflashed to my old BIOS that I saved - same thing. Tried the Strix BIOS - same thing, except it's 2700MHz and won't boost past that. What did Gigabyte do with the F2 BIOS, and how can I fix it? I want it back the way it was before I updated to F2. :/


----------



## Sheyster

Morteen199 said:


> Hello, i have the 4090 gigabyte gaming oc. and i updated to the newest Vbios on the gigabyte site F2.. After the update my card wont boost past 2685 Mhz before it boosted 2760 Mhz ++ even when i up the power target to 133% and gpu temp target to 88% it stil wont boost past 2685 Mhz when i up the gpu boost clock it goes higer but is static .. reflashed to my old bios that i saved.. same thing ... tried strix bios same thing but then it's 2700mhz and wont boost past.. what did gigabyte do whit the F2 bios? and how can i fix it? want it as it was before i updated to F2 bios on the gigabyte gaming OC site.... :/


Did you reinstall drivers after flashing? This might be a case where you need to use DDU to clean up, then reinstall them.

DDU link:









Display Driver Uninstaller Download version 18.0.5.9


Here you can Download Display Driver Uninstaller, this Display Driver Uninstaller is a driver removal utility that can help you completely uninstall AMD/NVIDIA graphics card drivers and packages from your system, with...




www.guru3d.com


----------



## yzonker

Morteen199 said:


> Hello, i have the 4090 gigabyte gaming oc. and i updated to the newest Vbios on the gigabyte site F2.. After the update my card wont boost past 2685 Mhz before it boosted 2760 Mhz ++ even when i up the power target to 133% and gpu temp target to 88% it stil wont boost past 2685 Mhz when i up the gpu boost clock it goes higer but is static .. reflashed to my old bios that i saved.. same thing ... tried strix bios same thing but then it's 2700mhz and wont boost past.. what did gigabyte do whit the F2 bios? and how can i fix it? want it as it was before i updated to F2 bios on the gigabyte gaming OC site.... :/


What voltage is it at when boosting to 2685? Is it just hitting the voltage limit? If so then GB must have lowered the VF curve on the new bios.
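A lowered V/F curve would explain the symptom exactly: the card hits the same voltage ceiling at a lower clock. A toy illustration (the curve points and 75MHz shift below are invented for illustration, not NVIDIA's actual boost tables):

```python
# Toy illustration of the "lowered V/F curve" hypothesis, NOT NVIDIA's
# actual boost algorithm: if a BIOS update shifts the voltage/frequency
# curve down, the card reaches the same voltage limit at a lower clock,
# and no power/temp target increase will push it past that.

# Hypothetical V/F points: voltage (mV) -> boost clock (MHz)
old_curve = {1000: 2610, 1025: 2655, 1050: 2700, 1075: 2730, 1100: 2760}
new_curve = {mv: mhz - 75 for mv, mhz in old_curve.items()}  # curve lowered

def max_boost(curve, v_limit_mv=1050):
    """Highest clock reachable without exceeding the voltage limit."""
    return max(mhz for mv, mhz in curve.items() if mv <= v_limit_mv)

print(max_boost(old_curve))  # 2700 on the old curve
print(max_boost(new_curve))  # 2625 on the lowered curve, same voltage cap
```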


----------



## Sheyster

yzonker said:


> What voltage is it at when boosting to 2685? Is it just hitting the voltage limit? If so then GB must have lowered the VF curve on the new bios.


He said he also tried the Strix BIOS, so probably not that.


----------



## Morteen199

Sheyster said:


> Did you reinstall drivers after flashing? This might be a case where you need to use DDU to clean up, then reinstall them.
> 
> DDU link:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Display Driver Uninstaller Download version 18.0.5.9
> 
> 
> Here you can Download Display Driver Uninstaller, this Display Driver Uninstaller is a driver removal utility that can help you completely uninstall AMD/NVIDIA graphics card drivers and packages from your system, with...
> 
> 
> 
> 
> www.guru3d.com



Yep, I have tried reinstalling the drivers with DDU... still the same 2685MHz at 1045/1050 mV GPU voltage.


----------



## dr/owned

Need to rant: seriously, I expect better from a German company than what Aquatuning / Alphacool (presumably the same people) are delivering. Ordered their Trio block right out of the gate in the middle of November with a 12-day shipment timeframe. The date has now been reset to another 12 days with no communication about it. And of course they take payment when you place the order instead of when it ships.

Meanwhile I can go spend $120 on a Bykski or Barrow block and they'll get it to me in 2 weeks without any problems.

Do better, Germany...do better. (This is also a play-by-play repeat of what happened at the 3090 launch. Like they just have zero clue how to keep to a production schedule.)


----------



## bmagnien

Krzych04650 said:


> Done some more power scaling testing, I have 30 games in total now, so I think that is enough. I will put the screenshots in the spoiler because 30 is a lot of screenshots. Rules are the same as before +130/+1250 1100mV overclock with 3000 MHz target, 350W vs 450W vs 630W on Neptune BIOS. Resolution from 5120x2160 21:9 up to 8K 21:9 depending on the game, RT/DLSS where possible, MSAA/SGSSAA where possible, so generally it doesn't get any harder than this.
> 
> 
> Spoiler: 350W vs 450W vs 630W
> 
> 
> 
> 
> View attachment 2586581
> 
> View attachment 2586591
> 
> View attachment 2586592
> 
> View attachment 2586588
> 
> View attachment 2586590
> 
> View attachment 2586593
> 
> View attachment 2586587
> 
> View attachment 2586586
> 
> View attachment 2586594
> 
> View attachment 2586582
> 
> View attachment 2586580
> 
> View attachment 2586578
> 
> View attachment 2586576
> 
> View attachment 2586589
> 
> View attachment 2586584
> 
> View attachment 2586574
> 
> View attachment 2586573
> 
> View attachment 2586570
> 
> View attachment 2586585
> 
> View attachment 2586583
> 
> View attachment 2586579
> 
> View attachment 2586569
> 
> View attachment 2586577
> 
> View attachment 2586595
> 
> View attachment 2586575
> 
> View attachment 2586572
> 
> View attachment 2586571
> 
> View attachment 2586568
> 
> View attachment 2586614
> View attachment 2586639
> 
> 
> 
> I've also made a summary in excel with efficiency rating and average power draw for 630W BIOS since it doesn't actually draw that much and varies a lot from game to game, the actual average is actually only 499W.
> View attachment 2586640
> 
> 
> Looks like 450W is almost entirely sufficient even for OC. Could have easily been 350W card stock and avoid all the power draw drama. I find it really hilarious how the card that was rumored to be 600W or even 800W and came with comically gigantic coolers because of that is actually the pinnacle of power efficiency to the point of ridiculousness, it is literally twice as efficient as Ampere, this is even slightly better than what Pascal did.
> 
> EDIT
> Added Quake RTX and fixed some mistakes in the table
> Added Path of Exile


This is an incredibly valuable suite of data - thanks for pulling it together. I would say the 630W Neptune BIOS is not a good representation of a quality high-performance/high-power-draw BIOS, so the results may be getting skewed slightly, not by the power draw but by the quality of the BIOS power table as it's written (based on my own experience and the recent GamersNexus video about their VBIOS issues). The most recent v2 Strix OC BIOS posted a few pages back has given me my highest power draw and my highest stable OC in games of any BIOS I've tried. Rather than proposing you retest everything with that VBIOS to see if the gain is larger, I might try selecting a handful of the lowest performers and seeing if they improve. Also, shouldn't the higher-draw BIOS be clocked higher, to represent the tangible gains one might achieve with draw above 450W? Otherwise you're just adding power while artificially limiting clocks, so the efficiency will appear even worse than it needs to.

Wouldn't the fairest way of doing this be to establish the max OC each power limit can run (350W, 450W, 550W), and then test with that?


----------



## yzonker

Made some improvements. I upgraded the thermal pads on my backplate heaters. That bought me maybe 50MHz. The backplate was as hot as a stock 3090 mining ETH! lol



Spoiler: Heaters























I scored 20 255 in Time Spy Extreme


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





This one is just totally annoying due to being very inconsistent depending on where, I think, the threads fall on the 13900K. Just have to run it a few times and wait for a good score. lol









I scored 68 339 in Fire Strike


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





This may do the same thing to a lesser degree, but I just ran it once.









I scored 48 593 in Fire Strike Extreme


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





FSU is much more consistent at least.









I scored 28 233 in Fire Strike Ultra


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





Still stuck at about the same spot in PR though. Just a tiny increase. I think my card's lack of core clock hurts me here maybe more than some of the others.









I scored 29 040 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## Morteen199

yzonker said:


> What voltage is it at when boosting to 2685? Is it just hitting the voltage limit? If so then GB must have lowered the VF curve on the new bios.


2685mhz 1.1v 1045 /1050 GPU voltage (mV)


----------



## yzonker

Morteen199 said:


> 2685mhz 1.1v 1045 /1050 GPU voltage (mV)


I'm not sure I follow. Post a screenshot of the VF curve in AB with the 1100mv point selected (card at default). Also post a screenshot with the card under load of GPUZ with all of the readings shown (core clock, voltage, etc...) shown.


----------



## Sheyster

coelacanth said:


> I just installed a new Zotac RTX 4090 Trinity OC with the latest driver (527.37). I am getting a lot of stuttering in all the games I've tried so far: Horizon Zero Dawn, Cyberpunk 2077, Far Cry 6, and Assassin's Creed Odyssey. I have an LG CX OLED and enabled G-Sync in the driver and the secret menu on the TV says G-Sync is on. A quick google search seems like a lot of people are having problems with stuttering with 4090s and LG OLEDs. The 4090 replaced a 3080 Ti where everything was buttery smooth, so the problem seems to be with the 4090. Everything else in my system is the same as before.
> 
> Anyone else experience this or have a fix?


FWIW, I installed the latest Windows 11 update (KB5020044) today and the new nVidia driver, no issues at all with the 4090 + LG CX.

This Windows update is the fix for the broken 22H2 cumulative update that caused performance problems with nVidia cards/drivers. Microsoft actually took the previous one down. This is the fixed version that was released a few days ago on Windows Update.

For reference, I frame cap (117) and use G-Sync + V-Sync.


----------



## Morteen199

Morteen199 said:


> Hello, i have the 4090 gigabyte gaming oc. and i updated to the newest Vbios on the gigabyte site F2.. After the update my card wont boost past 2685 Mhz before it boosted 2760 Mhz ++ even when i up the power target to 133% and gpu temp target to 88% it stil wont boost past 2685 Mhz when i up the gpu boost clock it goes higer but is static .. reflashed to my old bios that i saved.. same thing ... tried strix bios same thing but then it's 2700mhz and wont boost past.. what did gigabyte do whit the F2 bios? and how can i fix it? want it as it was before i updated to F2 bios on the gigabyte gaming OC site.... :/





yzonker said:


> I'm not sure I follow. Post a screenshot of the VF curve in AB with the 1100mv point selected (card at default). Also post a screenshot with the card under load of GPUZ with all of the readings shown (core clock, voltage, etc...) shown.





https://gpuz.techpowerup.com/22/12/04/rzk.png


----------



## Laithan

1.100V is the max voltage, so set the curve so that the 1100mV point is sitting at 2900MHz or so
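The offset math behind that suggestion can be sketched like this (a rough illustration only; the 15 MHz step is the usual NVIDIA clock granularity, assumed here, and 2685 MHz is the stock top point reported earlier in the thread):

```python
def core_offset(stock_top_mhz: int, target_top_mhz: int) -> int:
    """Afterburner core offset needed to move the 1100 mV point,
    snapped to the ~15 MHz steps NVIDIA clocks move in."""
    bin_mhz = 15
    return round((target_top_mhz - stock_top_mhz) / bin_mhz) * bin_mhz

print(core_offset(2685, 2900))  # -> 210 (nearest 15 MHz step to +215)
```

In practice you would enter that offset in AB, or drag the 1100mV point on the curve editor to the same spot.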


----------



## Morteen199

Laithan said:


> 1.100v is max voltage so set the curve so that 1100mV is sitting at 2900Mhz or so


Okay... but it happened after I updated to the F2 bios, and after I reflashed back to my old saved bios it was the same thing... so what did Gigabyte do? Lower the power of the card? Can't the bios be fixed to be as it was before? :/


----------



## yzonker

Morteen199 said:


> okei.. but did hapen after i updated to f2 bios and after i Reflased back to my old saved bios same thing.. so what did gigabyte do? lower the power of the card? cant the bios be fixed as it was before? :/


No, it's working correctly based on your GPUZ screenshot. It's just that the default curve is lower than before.

It would be hard to find, but if you search back toward the beginning of this thread you would find some of us comparing the default curve. The unique thing about the 4090 is that the same exact model of a card can have different default curves from one card to the next. My guess is whatever sets that default curve differently from card to card (same model) has been changed on yours and the vBios does not completely control this. 

You may just have to compensate for it by setting a core offset in Afterburner as I don't think we have a way to alter this additional offset I'm referring to other than manually changing it.


----------



## Laithan

Morteen199 said:


> okei.. but did hapen after i updated to f2 bios and after i Reflased back to my old saved bios same thing.. so what did gigabyte do? lower the power of the card? cant the bios be fixed as it was before? :/


MSI AB can sometimes get weird after BIOS flashes.. rare, but this could just be a software issue. Uninstall MSI AB, and the most important part is to answer NO to keeping your settings. Reboot, and after Windows boots again just re-install MSI AB (download from the official website only) and you should be good. Don't forget to unlock the voltage adjustments and max the power, then test again. You can always set the curve manually as well.


----------



## SilenMar

Krzych04650 said:


> Done some more power scaling testing, I have 30 games in total now, so I think that is enough. I will put the screenshots in the spoiler because 30 is a lot of screenshots. Rules are the same as before +130/+1250 1100mV overclock with 3000 MHz target, 350W vs 450W vs 630W on Neptune BIOS. Resolution from 5120x2160 21:9 up to 8K 21:9 depending on the game, RT/DLSS where possible, MSAA/SGSSAA where possible, so generally it doesn't get any harder than this.
> 
> 
> Spoiler: 350W vs 450W vs 630W
> 
> 
> 
> 
> View attachment 2586581
> 
> View attachment 2586591
> 
> View attachment 2586592
> 
> View attachment 2586588
> 
> View attachment 2586590
> 
> View attachment 2586593
> 
> View attachment 2586587
> 
> View attachment 2586586
> 
> View attachment 2586594
> 
> View attachment 2586582
> 
> View attachment 2586580
> 
> View attachment 2586578
> 
> View attachment 2586576
> 
> View attachment 2586589
> 
> View attachment 2586584
> 
> View attachment 2586574
> 
> View attachment 2586573
> 
> View attachment 2586570
> 
> View attachment 2586585
> 
> View attachment 2586583
> 
> View attachment 2586579
> 
> View attachment 2586569
> 
> View attachment 2586577
> 
> View attachment 2586595
> 
> View attachment 2586575
> 
> View attachment 2586572
> 
> View attachment 2586571
> 
> View attachment 2586568
> 
> View attachment 2586614
> View attachment 2586639
> 
> 
> 
> I've also made a summary in excel with efficiency rating and average power draw for 630W BIOS since it doesn't actually draw that much and varies a lot from game to game, the actual average is actually only 499W.
> View attachment 2586640
> 
> 
> Looks like 450W is almost entirely sufficient even for OC. Could have easily been 350W card stock and avoid all the power draw drama. I find it really hilarious how the card that was rumored to be 600W or even 800W and came with comically gigantic coolers because of that is actually the pinnacle of power efficiency to the point of ridiculousness, it is literally twice as efficient as Ampere, this is even slightly better than what Pascal did.
> 
> EDIT
> Added Quake RTX and fixed some mistakes in the table
> Added Path of Exile


You guys really play games?

Keep the card at 1.10V instead of 1.045V and you will easily see 560W-600W at 4K max settings.


----------



## yzonker

SilenMar said:


> You guys really play games?
> 
> Keep the card at 1.10V instead of 1.045V you will have 560W-600W easily with 4K max setting.


@Krzych04650 ran all of those at 1100mv.


----------



## SilenMar

yzonker said:


> @Krzych04650 ran all of those at 1100mv.


There is no average voltage showing up. I don't think the voltage is kept at 1.10V all the time.
Or the CPU is simply the bottleneck.


----------



## Shocchiz

Laithan said:


> MSI can sometimes get weird with BIOS flashes.. rare but this could just be a software issue. Uninstall MSI AB and the most important part is to answer NO to keeping your settings. Reboot and after windows boots again just re-install MSI AB (download from the official website only) and you should be good. Don't forget to unlock the voltage adjustments and max the power and test again. You can always manually set the curve also.


I think you are right; I needed to reset AB while I tried a different bios for my 4090 (I used the AB reset button, didn't reinstall it).
It wasn't the nvidia driver, it was AB; it was stuck at 100% PL.
I can add that I ended up with the Gigabyte bios and updated to the F2 version, no problem at all, so it's not the bios.


----------



## J7SC

Nothing like a nice long weekend gaming session with the 4090  ...some more fun with CP 2077 and FS 2020... FYI, another patch for FS 2020, and it turned out to be the 'NVIDIA Reflex Low Latency mode' option. It is automatically on when DLSS3 _and_ Frame Insertion are active. However, you can toggle the 'Reflex Low Latency mode' on/off if you don't play with Frame Insertion.

FS 2020 was always fascinating, but it was also a bit of a sow's ear re. code optimizations in the early days; lately, it is starting to look a bit more like a silk purse...










...CP 2077 w/RTX Ultra is also delicious eye candy


----------



## WayWayUp

Finally ran timespy








I scored 38 076 in Time Spy
Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com





Decent result. Haven’t touched my card since I got it. Can’t wait for the waterblock to come in!


----------



## th3illusiveman

coelacanth said:


> I just installed a new Zotac RTX 4090 Trinity OC with the latest driver (527.37). I am getting a lot of stuttering in all the games I've tried so far: Horizon Zero Dawn, Cyberpunk 2077, Far Cry 6, and Assassin's Creed Odyssey. I have an LG CX OLED and enabled G-Sync in the driver and the secret menu on the TV says G-Sync is on. A quick google search seems like a lot of people are having problems with stuttering with 4090s and LG OLEDs. The 4090 replaced a 3080 Ti where everything was buttery smooth, so the problem seems to be with the 4090. Everything else in my system is the same as before.
> 
> Anyone else experience this or have a fix?


I limit my FPS to 115 on my 48C1. For some reason, anything higher results in weird stutters even if MSI Afterburner shows a solid FPS line. I think a few FPS of buffer is required for VRR to work well. You could try 110 and work your way up to see how high you can go before stutter.
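That "few FPS of buffer" can be put into numbers. One commonly cited rule of thumb (an assumption here, not from the post) caps at refresh − refresh²/3600, which keeps frame times roughly 0.3 ms above the refresh interval so frames never outrun the VRR window:

```python
def vrr_fps_cap(refresh_hz: float) -> int:
    """Frame cap a few fps under refresh so frame times stay inside
    the VRR range (refresh - refresh^2/3600 rule of thumb)."""
    return int(refresh_hz - refresh_hz * refresh_hz / 3600.0)

for hz in (60, 120, 144):
    print(hz, "Hz ->", vrr_fps_cap(hz))  # 60->59, 120->116, 144->138
```

For a 120 Hz OLED that lands at 116, right in the neighborhood of the 115/117 caps people in this thread are using.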


----------



## Krzych04650

bmagnien said:


> this is an incredibly valuable suite of data - so thanks for pulling this together. I might say that the 630w Neptune bios is not a good representation of a quality high performance/high power draw bios, so the results may be getting skewed slightly not by the power draw, but by the quality of the bios table as it’s written (based on my own personal experience and the recent GamersNexus video about their vbios issues). The most recent v2 strix oc bios posted a few pages back has provided me my highest power draw, and highest most stable OC in games of any bios I’ve tried. Rather than proposing you to retest all with that vbios to see if the gain is more, I might try selecting a handful of the lowest performers and see if they improve.


The 630W BIOS I am using works properly, it allowed for proportionally higher draw compared to 600W BIOSes and it has the NVIDIA UEFI fix already applied, I guess this is why I wasn't losing signal during boot compared to Gigabyte or MSI 600W BIOS. There is nothing wrong with this BIOS as far as I can tell.



bmagnien said:


> Also, shouldn’t the higher draw bios be clocked higher to represent the tangible gains one might be able to achieve with draw above 450w? Otherwise you’re just adding power while artificially limiting clocks so the efficiency will appear even worse than it could be.
> 
> Wouldn’t the most fair way of doing this be to establish the max oc that each power limit can run (350w, 450w, 550w), and then test with that?


That is exactly what I did, and the main post mentions that explicitly. The maximum stable overclock is found first, +130/+1250 for this card in particular, which results in a 3060 MHz target clock that is really 3000-3015 MHz actual under load, and only then is the card power limited. The same curve is used for all three power targets, so it already is the maximum stable OC for all of them.

If there are cases with 450W and 630W having the same clock, then it means that 450W was enough to hit the full 1100mv 3000 MHz overclock. There is no way to have higher voltage than 1100mv.

I did try pushing the 350W one a bit higher, since lower temperatures could theoretically allow for a slightly better clock, but it isn't always stable, and adding another 15MHz on a card that runs at 3000 MHz is really splitting hairs; it would not change the scores by even 0.1%.



SilenMar said:


> You guys really play games?
> 
> Keep the card at 1.10V instead of 1.045V you will have 560W-600W easily with 4K max setting.


You guys really read the posts you are replying to? The card is at 1100mv and, again, the main post mentions that explicitly; it is even written in the excel table. There are screenshots provided for each game. If some games do not run at 1100mv, it is because of whatever other phantom limiters these cards have, since you cannot actually reach 600 or 630W in reality and the card reports the power limit at varying wattages depending on the load. Out of 30 games there are only 4 that did that and ran slightly lower than 1100mv. I would prefer these cards to behave normally and actually be able to hit that 630W power limit, but for some reason they cannot, and all the BIOSes I've tried have this behavior, and I've tried like 5 different ones.
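For anyone wanting to reproduce the kind of efficiency rating used in that excel table, a minimal sketch of the metric (the per-limit fps/power samples below are hypothetical placeholders for illustration, not the measured values from the table):

```python
def efficiency(avg_fps: float, avg_power_w: float) -> float:
    """Performance per watt: the basis of a perf/W efficiency rating."""
    return avg_fps / avg_power_w

# Hypothetical samples per power limit: (avg_fps, measured avg watts).
# Note the 630 W entry draws well under its limit, as discussed above.
runs = {350: (92.0, 350.0), 450: (101.0, 448.0), 630: (103.0, 505.0)}
for limit_w, (fps, watts) in sorted(runs.items()):
    print(f"{limit_w}W limit: {efficiency(fps, watts):.3f} fps/W")
```

Averaging the measured wattage column rather than the limit column is what surfaces findings like the "630W BIOS actually averages ~499W" observation.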


----------



## rahkmae

GraphicsWhore said:


> No. Something is def wrong there. Perhaps poor contact?










Removed and reinstalled it all again; I think there was nothing wrong with the contact... maybe a little too much thermal paste. I put less on now and nothing changed, same temps. I don't know what the problem is. 3 radiators (2 x 360 and 1 x 280), EKWB D5 pump, everything at max, and full OC temps are 61-64°C.

Valley test, +195 GPU and +1300 memory.

And PR with the pump at 4000 RPM and all fans at 1000 RPM; now the temperatures are even worse than before...


----------



## KingEngineRevUp

Card is on water. I can game for hours, run 3dmark stress test, etc. But at idle, or light load like watching YouTube, sometimes my screen will green screen. Happens mostly when watching YouTube.

I have +1500 on my memory. I'm assuming the green screening is due to memory OC and memory being too chilled? I know sometimes the screen artifacts and then will green screen.

Memory temps are in the high 30s or mid 40s when watching YouTube videos.

I'm assuming I'm going to have to dial the memory back because of this? Seems like memory temps are too low?


----------



## lawson67

KingEngineRevUp said:


> Card is on water. I can game for hours, run 3dmark stress test, etc. But at idle, or light load like watching YouTube, sometimes my screen will green screen. Happens mostly when watching YouTube.
> 
> I have +1500 on my memory. I'm assuming the green screening is due to memory OC and memory being too chilled? I know sometimes the screen artifacts and then will green screen.
> 
> Memory temps are in the high 30s or mid 40s when watching YouTube videos.
> 
> I'm assuming I'm going to have to dial the memory back because of this? Seems like memory temps are too low?


My memory is at 38C on air while typing this with a +1500 offset, so I don't think it's a temp problem


----------



## yzonker

lawson67 said:


> My memory is at 38c on air while typing this with a +1500 offset so i don't think its a temp problem


That depends on the quality of the VRAM. I have to go over +1700 to get a black screen/artifacts on the desktop. But my card will definitely exhibit the same behavior if I push the mem too high. Although mine will black screen, not green.


----------



## yzonker

J7SC said:


> Nothing like a nice long weekend gaming session with the 4090  ...some more fun with CP 2077 and FS 2020... FYI, another patch for FS 2020, and it turned out to be the 'NVIDIA Reflex Low Latency mode' option. It is automatically on when DLSS3 _and_ Frame Insertion are active. However, you can toggle the 'Reflex Low Latency mode' on/off if you don't play with Frame Insertion.
> 
> FS 2020 was always fascinating, but it was also a bit of a sow's ear re. code optimizations in the early days; lately, it is starting to look a bit more like a silk purse...
> View attachment 2586751
> 
> 
> 
> ...CP 2077 w/RTX Ultra is also delicious eye candy
> View attachment 2586752


I did get FS2020 installed and running yesterday. It is really impressive with DLSS 3 turned on. But I did notice that even with my 13900k, I was still seeing main thread limits flying around NY/Manhattan. I was still getting 100fps thanks to DLSS 3 though.

Nevertheless, I see a 7950x3D in your future.... 😎 That'll definitely be the CPU to have for FS2020 anyway. I'd have to hook up my 5800x3D box again to test, but I think it may still be a little faster than my 13900k in that game.


----------



## lawson67

yzonker said:


> That depends on the quality of the VRAM. I have to go over +1700 to get a black screen/artifacts on the desktop. But my card will definitely exhibit the same behavior if I push the mem too high. Although mine will black screen, not green.


Yeah, I was lucky with my Zotac Extreme VRAM: I can run the Vulkan VRAM test and don't get any errors until +1950, and I've never seen a single artifact or black screen in any benchmark or on the desktop, even with +2000 on VRAM. There are no security stickers on any of the screws, which is a good thing in Europe since we normally can't strip down cards and repaste without risking the warranty. I've stripped mine down and put LM on the core, so I should never need to repaste again. Very pleased with the Zotac AMP Extreme; the core overclocks well too. I run it 24/7 at +220 core / +1500 VRAM but can bench at +270 or more. I haven't even tried higher, and I'm not bothered about adding a higher-wattage BIOS as it's simply not worth it on these cards.


----------



## yzonker

lawson67 said:


> Yeah i was lucky with my Zotac Extreme Vram, i can use the vulkan Vram test and i don't get any errors until +1950, ive never seen one artifact or black screen in any benchmark or on desktop even with +2000 on Vram, there's no security stickers on any of the screws which a good thing in Europe as we cant normal strip down the cards and repaste without risking warranty, ive striped mine down and put LM on the core so should not need to repaste ever again, very pleased with Zotac Amp extreme


Well sadly my card could go +1800 or more with the air cooler and not black screen. The fan stop on the air cooler helps a lot to keep the mem warm enough to not exhibit that behavior. 

Wasn't quite as good as yours, but I think the mem tester didn't throw errors until +1900 or so.
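The search these posts describe, walking the memory offset up until the VRAM tester starts reporting errors, can be sketched as follows. `passes_memtest` is a hypothetical stand-in for actually running a tester (like the Vulkan VRAM test mentioned above) at each offset; the 1900 threshold is just a placeholder:

```python
def passes_memtest(offset_mhz: int) -> bool:
    """Placeholder for running a VRAM tester at +offset and checking
    for errors. Real cards fail somewhere card-specific."""
    return offset_mhz <= 1900  # illustration only

def max_stable_offset(limit_mhz: int = 2000, step_mhz: int = 50) -> int:
    """Walk the offset up in steps until the tester reports errors."""
    best = 0
    for offset in range(0, limit_mhz + 1, step_mhz):
        if not passes_memtest(offset):
            break
        best = offset
    return best

print(max_stable_offset())  # -> 1900 with the placeholder threshold
```

Worth remembering from this exchange: an error-free tester result is not the whole story, since idle/low-load behavior (black or green screens at the desktop) can fail before the tester does.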


----------



## yzonker

WayWayUp said:


> Finally ran timespy
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 38 076 in Time Spy
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> decent result.Haven’t touched my card since I got it. Can’t wait for the waterblock to come in!


That's one benchmark that really loves DDR5. I can't get close to your CPU score with my DDR4 board. Barely breaks 26k. Oddly most of the other 3dmark cpu benchmarks don't seem to be so bandwidth starved.


----------



## SilenMar

Krzych04650 said:


> You guy really read posts you are replying to? The card is at 1100mv and again, the main post mentions that explicitly, and it is even written on the excel table. There are even screenshots provided for each game. If there are some games that do not run at 1100mv then it is because of whatever other phantom limiters those cards have, since you cannot actually reach that 600 and 630W in reality and the card reports power limit at varying wattages depending on the load. Out of 30 games there are only 4 that did that and ran slightly lower than 1100mv. I would prefer to have those cards behave normally and actually be able hit that 630W power limit, but for some reason they cannot do that and all BIOSes I've tried have this behavior, and I've tried like 5 different ones.


So the card cannot keep 1.10V all the time. Use a tool such as K-Boost in EVGA Precision X1 to keep it at 1.10V.


----------



## KingEngineRevUp

rahkmae said:


> View attachment 2586766
> 
> Remove and reinstall all again, i think there was nothing wrong with the contact...maybe a little too much thermal paste. I put less now and nothing changed, same temps. I don't now what is the problem. 3 radiators 2 x 360 and 1 x 280, EKW D5 pump, all max and full oc temp 61-64...
> View attachment 2586767
> 
> Valley test +195 gpu and +1300 memory
> View attachment 2586768
> 
> And PR pump 4000 rmp and all vent 1000 rmp, now the temperatures are even worse than before...


You're probably not tightening the screws enough.


----------



## J7SC

Krzych04650 said:


> The 630W BIOS I am using works properly, it allowed for proportionally higher draw compared to 600W BIOSes and it has the NVIDIA UEFI fix already applied, I guess this is why I wasn't losing signal during boot compared to Gigabyte or MSI 600W BIOS. There is nothing wrong with this BIOS as far as I can tell.
> (...)
> If there are some games that do not run at 1100mv then it is because of whatever other phantom limiters those cards have, since you cannot actually reach that 600 and 630W in reality and the card reports power limit at varying wattages depending on the load. Out of 30 games there are only 4 that did that and ran slightly lower than 1100mv. I would prefer to have those cards behave normally and actually be able hit that 630W power limit, but for some reason they cannot do that and all BIOSes I've tried have this behavior, and I've tried like 5 different ones.


I noticed that going from air to water-cooling on my Giga-G-OC dropped typical max peak power in GPU-Z and HWInfo by about 30W to 40W (w/o impacting performance). I assume that it is the allotment for the stock fans and RGB that makes a difference, or perhaps the lower temps also contribute via some algorithm. Then again, running OCCT GPU tests brings it right back up to about 590W...games though are a different matter and your tables are very much appreciated.



yzonker said:


> I did get FS2020 installed and running yesterday. It is really impressive with DLSS 3 turned on. But I did notice that even with my 13900k, I was still seeing main thread limits flying around NY/Manhattan. I was still getting 100fps thanks to DLSS 3 though.
> 
> Nevertheless, I see a 7950x3D in your future.... 😎 That'll definitely be the CPU to have for FS2020 anyway. I'd have to hook up my 5800x3D box again to test, but I think it may still be a little faster than my 13900k in that game.


...yup, I've been holding on to the 5950X Asus DarkH for now as I'm hoping to see AMD offer a 7950X3D (CES '23 ?). If that fails, 13900 KS might come to join up. You are right about the 5800X3D...it is currently the best CPU for FS2020, though with DLSS3/FI and my 5950X showing actual clocks of > 5050 MHz in FS 2020, I can't really complain - DLSS3/FI made a huge difference in FS2020 for me. 32GB of CL 14-13-13 CR1 3800 DDR4 also helps the 5950X 'stay youthful'.

As to RTX 4090 VRAM in benches, I can literally get an extra bin or two just by turning up central heat by 2 C...with 1320 / 63 triple-core rads, push-pull fans and multi-D5s, my typical GPU temp doesn't exceed the 40 C range, with VRAM about the same.


----------



## rahkmae

rahkmae said:


> View attachment 2586766
> 
> Remove and reinstall all again, i think there was nothing wrong with the contact...maybe a little too much thermal paste. I put less now and nothing changed, same temps. I don't now what is the problem. 3 radiators 2 x 360 and 1 x 280, EKW D5 pump, all max and full oc temp 61-64...
> View attachment 2586767
> 
> Valley test +195 gpu and +1300 memory
> View attachment 2586768
> 
> And PR pump 4000 rmp and all vent 1000 rmp, now the temperatures are even worse than before...


And now I scored 26 402 in Port Royal after changing the thermal pads to EKWB's default pads...


----------



## GRABibus

New build 7950X + ASUS Strix X670E-E Gaming Wifi + 32GB DDR5 @ 6200MHz CL30.

GIGABYTE GAMING OC on stock air cooler.


















I scored 29 134 in Port Royal
AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
www.3dmark.com





Average temperature = 53°C.

No significant gain versus my last build 5950X and 32GB DDR4 @ 3800MHz CL14.

Happy to be > 29100


----------



## Krzych04650

J7SC said:


> I noticed that going from air to water-cooling on my Giga-G-OC dropped typical max peak power in GPU-Z and HWInfo by about 30W to 40W (w/o impacting performance). I assume that it is the allotment for the stock fans and RGB that makes a difference, or perhaps the lower temps also contribute via some algorithm. Then again, running OCCT GPU tests brings it right back up to about 590W...games though are a different matter and your tables are very much appreciated.


I am still waiting for waterblock, but it is certainly possible. Lower temperatures should just by themselves reduce power draw at any given VF point. Especially here, Neptune BIOS has a drawback of having 2400 RPM limit since it is using normal case fans on its AIO, which is a lot less than Inno3D does by default at 100% speed, and the difference in temps is pretty big, around 10C, and if you leave it at that 550W+ for like 30 minutes with just 2400 RPM it will eventually hit 80C core and 99C hotspot. My tests were quick so it didn't have the time to get that hot, but still, waterblock should reduce temps by a good 25-30C compared to that. On top of removing the fans and LEDs it should increase power headroom by quite a bit. 

I wouldn't really expect the 630W scores to get much better since 26 out of 30 games can already maintain the full 3000 MHz at 1100mv, but it may hold effective clock a little better. It will be interesting though how it affects the 350W and 450W limits, because lower power draw due to much lower temperatures should allow them to hold higher clocks and close the gap a bit. Needing 40W less than now for the same clock would be huge when power limited.


----------



## chispy

I feel you bro, same thing happened to me with EK at the EK shop, but they finally agreed to refund me. In the email I was sent, they stated that they do not have a real and firm release date for the MSI Gaming Trio / Gaming X Trio / Suprim pre-order water block yet; the estimated time for shipping is January 2023, and that date will probably change again and again. It has all been a lie; all those estimated pre-order shipping dates are made up by EK, nothing more, nothing less. They just want your money and hope you'll wait months for it. I'm not falling for the pre-order lie again.

Read my previous post quoted below; I also waited more than a month for my pre-order without any communication whatsoever from EK, until yesterday when I received my full refund.




chispy said:


> Anyone else it's having problems with the pre-order water blocks from the shop at EK ?
> I pre-order one water block for my msi gaming 4090 over a month ago at the ek shop and when i did ordered the block it stated " Estimated Delivery date early November " and i said ok fine i can wait a little , then that estimated date changed to shipping late November and i said hmmm... this is going to take a while ! , again they changed the date to estimated early December " and it changed again to late december , i checked today and it now it says early January , i said to myself , nop , heck no i'm not waiting months for this overpriced water block from EK , and now on my already paid for pre-order under my account it says estimated delivery date January 12 , 2023  , what gives a .... ???
> 
> I already opened a ticket about a week ago on customer service asking when my block will be shipped , and yep you guess correctly , i have not received an answer back from EK customer support , nada , zero , 0 , nilch . I canceled that pre-order asap as i think the most important thing i expect from a company is customer support and communication , EK failed . This company will go to my black list from now on. EK again as has gone downhill as time goes by. Zero customer support and their jacked up prices has done it for me. Almost $400US Dollars shipped for their water block for the 4090 Gaming trio. ** do not trust EK pre-order dates **
> 
> Lucky me i found on ebay a Bykski block for the 4090 msi gaming , really cheap for $139US Dollars and with a coupon code -$13.90US Dollars ( 10% Off from ebay ) + i paid $25 for expedited shipping + Tax; Total shipped = $167.36 from China and shall be on my hands by December 16 firm date and has already shipped. I have had great experience with Bykski water blocks , i had one for my nvidia rtx 3070 and amd RX 6900 and they did perform great on par or even better temps and quality than EK blocks. From now on i will stick to Bykski  due to price/performance ratio and quality , plus they do have great customer support directly from US as i have used their customer support before.
> 
> 
> ** If anyone needs a water block for MSI RTX 4090 Gaming Trio , Gaming X Trio , Suprim they all share the same pcb and this Byski block will fit or the one from Barrow and it's in stock and readily available for shipping at ebay. "" Links below:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bykski GPU Water Block for MSI GeForce RTX4090 Suprim / GAMING X TRIO 24G | eBay
> 
> 
> Find many great new & used options and get the best deals for Bykski GPU Water Block for MSI GeForce RTX4090 Suprim / GAMING X TRIO 24G at the best online prices at eBay! Free shipping for many products!
> 
> 
> 
> www.ebay.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Barrow GPU Water Cooling Block for MSI RTX 4090 Suprim X 24G/GAMING X TRIO 24G | eBay
> 
> 
> Find many great new & used options and get the best deals for Barrow GPU Water Cooling Block for MSI RTX 4090 Suprim X 24G/GAMING X TRIO 24G at the best online prices at eBay! Free shipping for many products!
> 
> 
> 
> www.ebay.com
> 
> 
> 
> 
> 
> View attachment 2586564
> 
> View attachment 2586565





dr/owned said:


> Need to rant: seriously I expect better from a german company than aquatuning / alphacool (presumably same people) are performing. Ordered their Trio block right out of the gate middle of november with a 12 day shipment timeframe. Date is now reset to another 12 days with no communication about it. And of course they take payment when you make the order instead of when it ships.
> 
> Meanwhile I can go spend $120 for a Bykski or Barrow block and they'll get it to me in 2 weeks without any problems.
> 
> Do better Germany...do better. (this is also a play-by-play repeat of what happened on the 3090 launch too. Like they just have zero clue how to keep to a production schedule)


----------



## chispy

dr/owned said:


> Need to rant: seriously I expect better from a german company than aquatuning / alphacool (presumably same people) are performing. Ordered their Trio block right out of the gate middle of november with a 12 day shipment timeframe. Date is now reset to another 12 days with no communication about it. And of course they take payment when you make the order instead of when it ships.
> 
> Meanwhile I can go spend $120 for a Bykski or Barrow block and they'll get it to me in 2 weeks without any problems.
> 
> Do better Germany...do better. (this is also a play-by-play repeat of what happened on the 3090 launch too. Like they just have zero clue how to keep to a production schedule)



^ read up here ^


----------



## KingEngineRevUp

yzonker said:


> That depends on the quality of the VRAM. I have to go over +1700 to get a black screen/artifacts on the desktop. But my card will definitely exhibit the same behavior if I push the mem too high. Although mine will black screen, not green.


Yeah, my screen might be green due to HDR?

Annoying behavior regardless... Who would have thought water cooling would hurt performance. Went from +1700, and now I probably have to try +1400 unless the card is warmed up while gaming.


----------



## yzonker

KingEngineRevUp said:


> Yeah, my screen might be green due to HDR?
> 
> Annoying behavior regardless... Who would have thought water cooling would hurt performance. Went from +1700 and now probably got to try +1400 unless if the card is warmed up while gaming.


Could try either setting it to prefer max performance or just lock it to max voltage in AB. Raises mem temp a little bit. Obviously won't idle but not a big power difference.


----------



## coelacanth

4090 Zotac Trinity OC, 5950X, both stock, 32GB 3,600 MT/s CL16 RAM.

Port Royal 25,446. Max GPU power draw during the run was 438W. GPU max temp 63C, hot spot 71C. GPU fans maxed out at 1,124 RPM (31%).

No coil whine and card is super quiet.


----------



## SilenMar

yzonker said:


> @Krzych04650 ran all of those at 1100mv.


Pretty sure he didn't lock the voltage, or the CPU and RAM aren't doing their job properly.
I don't play games much. But when I do, there's an easy 580W on my sleeper/work PC with just the 600W vBIOS.


----------



## yzonker

SilenMar said:


> Pretty sure he didn't lock the voltage. Or the CPU and ram are not doing a proper work.
> I don't play games. But when I do there is easy 580W on my sleeper/work PC with just 600W vBIOS.
> 
> 
> 
> 
> 
> 
> View attachment 2586831


My card is the same as @Krzych04650 shows. It will power limit in some games quite a bit below the 600w limit. I see nothing wrong with the results. And most games don't draw more than 500w at the 1100mv limit. 

Even CP2077 max settings, 4k, DLSS off, RT psycho only hits about 500w. And that matches very closely to the result shown in the table.


----------



## J7SC

...the only way I get close to the 600 W limit on the stock Giga-G-OC under water-cooling is running things like OCCT (left in pic below), which I'd rather not. Normally, my w-cooled temps stay in the 40 C range for the GPU and VRAM, but not with OCCT.

I have been focusing on graphics scores (right side of the pic below), trying to delineate which one is more sensitive to VRAM and which one to GPU core (ditto for CPU score). To get to something like +1526 on the VRAM, I have to run it at a temp which starts to downclock the GPU core by at least one if not two bins... not the end of the world, but still. I've got a few 3DM scores in reserve I haven't subbed yet (apart from the screwy ones like PR at 29.8k or even 30.3k), and I am wondering whether it is worth flashing the Asus Strix V2 vBIOS onto my card. I don't care so much about pushing the core speed further, but I would like slightly more juice for the VRAM (re. heat, without affecting the core) and also tighter timings. Thoughts on switching vBIOS?


----------



## Nico67

rahkmae said:


> View attachment 2586766
> 
> Removed and reinstalled it all again; I think there was nothing wrong with the contact... maybe a little too much thermal paste. I put less on now and nothing changed, same temps. I don't know what the problem is. 3 radiators (2x 360 and 1x 280), EKWB D5 pump, all at max, and full OC temps are 61-64...
> View attachment 2586767
> 
> Valley test +195 GPU and +1300 memory
> View attachment 2586768
> 
> And in PR with the pump at 4,000 RPM and all fans at 1,000 RPM, the temperatures are even worse than before...


The pic of the core looks like it's making good contact on the top-left corner and less so on the rest. Could be the standoffs aren't seated well. Try gently loosening and retightening them, or don't tighten the top-left screw as hard as the others.
The choke pads do look very compressed and the memory ones look good. I think cutting pads to the exact width of the components, maybe even slightly under, would let them compress better without losing any cooling. Cutting the memory pads narrower might even help keep those temps up a bit, say leaving 1mm uncovered on either side.


----------



## J7SC

@chispy ...I know you like Bykski blocks ...did some more light modding for the fun of it; two Bykski blocks living peacefully in one case


----------



## KingEngineRevUp

This is funny.

+1500 Memory at 25-28C, can only watch YouTube for 10-15 minutes before green screen. But can game, stress test, etc for 4-8 hours straight at 45-50C.

Had to dial the memory back to +1350 to watch YouTube for 30 minutes with no green screens. What a generation, water cooling hurting performance.


----------



## yzonker

KingEngineRevUp said:


> This is funny.
> 
> +1500 Memory at 25-28C, can only watch YouTube for 10-15 minutes before green screen. But can game, stress test, etc for 4-8 hours straight at 45-50C.
> 
> Had to dial memory back to +1350 to watch YouTube for 30 minutes, no green screens. What a generation, water-cooling hurting performances.


The 4090 is an amazing piece of tech, but for someone like me that likes to OC, tinker, run benchmarks, the card is kinda boring. Killer for gaming so it performs its primary task well though. 

I've actually toyed with picking up a RDNA3 card at some point as they are still using standard GDDR6 chips which will likely be much more tolerant of lower temps.


----------



## J7SC

yzonker said:


> The 4090 is an amazing piece of tech, but for someone like me that likes to OC, tinker, run benchmarks, the card is kinda boring. Killer for gaming so it performs its primary task well though.
> 
> I've actually toyed with picking up a RDNA3 card at some point as they are still using standard GDDR6 chips which will likely be much more tolerant of lower temps.


...made just for you then...extra cores, 48 GB of regular (non-X) GDDR6...lots of $dough, though


----------



## motivman

Got my Bykski 4090 waterblock for my Gigabyte Gaming OC. My original plan was to use this waterblock for a few weeks while I wait for the EKWB or Phanteks block to become available. Well, I am so impressed that those plans are cancelled. Quality is top notch and the install is straightforward. I ended up using Arctic 1.5mm TP-3 thermal pads on the core side of the block, and really nothing on the backplate. Here are my results running Control at max settings with 550W power draw.

Ambient Temp -- 27C
Coolant Temp -- 32C
Core Temp --- 55C
Mem Junction Temp --- 48C
Hotspot Temp --- 67C

So 23C delta at 550W, not bad...

at stock with 450W, my delta is about 18C
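For anyone comparing blocks, the delta math in posts like this is just component temperature minus coolant temperature. A trivial sketch, using the numbers reported above:

```python
# Rough water-block efficiency check: delta = component temp - coolant temp.
# The numbers below are the ones reported in this post (550W run, Bykski block).

def block_delta(component_c: float, coolant_c: float) -> float:
    """Temperature rise of a component over the loop's coolant, in C."""
    return component_c - coolant_c

coolant = 32.0  # C
deltas = {
    "core": block_delta(55.0, coolant),     # 23.0 C
    "memory": block_delta(48.0, coolant),   # 16.0 C
    "hotspot": block_delta(67.0, coolant),  # 35.0 C
}
print(deltas)
```

Comparing core-to-coolant delta (rather than core-to-ambient) is what makes numbers from loops with different radiator capacity comparable.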


----------



## KingEngineRevUp

motivman said:


> got my bykski 4090 waterblock for my gigabyte gaming OC. My original plan was to use this waterblock for a few weeks, while I wait for EKWB or phanteks block to become available. Well I am so impressed that those plans are cancelled. Quality and install is top notch and straight forward. I ended up using artic 1.5mm TP-3 thermal pads on the core side of the block, and really nothing on the backplate. Here are my results running control at max settings and 550W power draw.
> 
> Ambient Temp -- 27C
> Coolant Temp -- 32C
> Core Temp --- 55C
> Mem Junction Temp --- 48C
> Hotspot Temp --- 67C
> 
> So 23C delta at 550W, not bad...
> 
> at stock with 450W, my delta is about 18C
> View attachment 2586883
> 
> View attachment 2586882


Wow I always wondered what a build with just pure fittings would look like haha, awesome.

Those temps are in line with the EKWB so there's no point in waiting for it.


----------



## KingEngineRevUp

yzonker said:


> The 4090 is an amazing piece of tech, but for someone like me that likes to OC, tinker, run benchmarks, the card is kinda boring. Killer for gaming so it performs its primary task well though.
> 
> I've actually toyed with picking up a RDNA3 card at some point as they are still using standard GDDR6 chips which will likely be much more tolerant of lower temps.


Yeah, I have a post somewhere from the 10 or 20 series era where I told another user that OC wouldn't be as fun anymore as boost evolves. All the OC is done for us already.

Imagine if the card ran at base clocks and we had to OC the numbers ourselves, lol. We would all be so happy.


----------



## SilenMar

Just wondering, has anyone else actually hit 580W+ in their games? It seems I'm the only one.

Maybe one of my cards has superheated components.


----------



## chispy

J7SC said:


> @chispy ...I know you like Bykski blocks ...did some more light modding for the fun of it; two Bykski blocks living peacefully in one case
> View attachment 2586851



J, that's a beautiful build bro, looking good. What case is that one? I see 2x PCs in one. Inception = (a PC within a PC). Awesome build there bro 👌. Thanks for sharing. I cannot wait to get my Bykski block.


----------



## Laithan

SilenMar said:


> Just wondering, has anyone else actually hit 580W+ in their games? It seems I'm the only one.
> 
> Maybe one of my cards has superheated components.


Each individual game draws different power and on top there are different resolutions and game settings that affect power draw. I think what people are saying/showing is that most games will not max out @ 600W (or near) but _some_ will. If you are seeing close to 600W ("580W+") on EVERY game then there is something worthy of a deep dive on your system as that isn't typical.


----------



## J7SC

chispy said:


> J , that's a beautiful built bro , looking good. What case it's that one ? i see 2x PCs in one .Inception = ( a PC within a PC ) . Awesome built there bro 👌. Thanks for sharing. I cannot wait to get my Bykski block.


Thanks !...work + play 2x ATX X570 mobo build...case is a TT Core P8 which 'ran into a Dremel tool'


----------



## Dreams-Visions

Just hooked up my FE and did some testing. Looking like my best stable config for max benchmarks is:
+133
+185 core
+2000 mem (maxed)

On air, that's about 2910MHz steady in testing, ~60C. No coil whine.









I scored 27 725 in Port Royal
I scored 30 015 in Time Spy
I scored 18 032 in Time Spy Extreme
I scored 23 450 in Fire Strike Ultra
I scored 37 673 in Fire Strike Extreme

AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
www.3dmark.com





Is this a keeper? Looking for a second opinion before I fully commit and put it under water. I have a TUF OC in shrink wrap, trying to figure out if it's even worth taking out of the box.


----------



## EarlZ

Dreams-Visions said:


> Just hooked up my FE and did some testing. Looking like my best, stable config is for max benchmarks is:
> +133
> +185 core
> +2000 mem (maxed)
> 
> On air, that's about 2910MHz steady in testing, ~60C. No coil whine.
> 
> I scored 27 725 in Port Royal
> 
> Is this a keeper? Looking for a second opinion before I fully commit and put it under water. I have a TUF OC in shrink wrap, trying to figure out if it's even worth taking out of the box.


What process do you go through or use to validate the stability for the +2000 on mem?


----------



## Dreams-Visions

EarlZ said:


> What process do you go through or use to validate the stability for the +2000 on mem?


I dial until they fail in various 3DMark tests, then move over to the games I play. Been doing this for about 15 years, so I typically have a good idea of when I'm around the limit. I'll cross-check with this to complete the process:









GitHub - GpuZelenograd/memtest_vulkan: Vulkan compute tool for testing video memory stability
github.com














Looks good so far.
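The "dial until it fails" loop described above can be sketched as a simple search. This is just an illustration: `passes_stress_test` is a hypothetical stand-in for a real 3DMark or memtest_vulkan run at a given offset, here faked as a card that fails past +2000.

```python
# Sketch of the dial-until-fail memory OC search described above.
# passes_stress_test() is a hypothetical stand-in for running benchmark loops
# or memtest_vulkan at a given offset; here it fakes a card stable up to +2000.

def passes_stress_test(mem_offset: int) -> bool:
    return mem_offset <= 2000  # pretend silicon limit, for illustration only

def find_max_stable_offset(step: int = 100, ceiling: int = 3000) -> int:
    """Walk the memory offset up in fixed steps until a run fails,
    then report the last offset that passed."""
    offset = 0
    while offset + step <= ceiling and passes_stress_test(offset + step):
        offset += step
    return offset

print(find_max_stable_offset())  # 2000 with the fake test above
```

In practice you'd also re-verify the final offset with a long run, since the cold-memory behaviour discussed elsewhere in this thread means an offset that passes warm can still fail at idle temps.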


----------



## EarlZ

Dreams-Visions said:


> I dial until they fail in various 3D mark tests, then over to the games I play. Been doing this for about 15 years so I typically have a good idea of when I'm around its limit. I'll cross-check with this to complete the process
> 
> GitHub - GpuZelenograd/memtest_vulkan: Vulkan compute tool for testing video memory stability
> github.com
> 
> Looks good so far.


Thanks for the share! I was unfamiliar with the process and wanted to gain more insight.


----------



## J7SC

Dreams-Visions said:


> I dial until they fail in various 3D mark tests, then over to the games I play. Been doing this for about 15 years so I typically have a good idea of when I'm around its limit. I'll cross-check with this to complete the process
> 
> GitHub - GpuZelenograd/memtest_vulkan: Vulkan compute tool for testing video memory stability
> github.com
> 
> Looks good so far.


I am not sure I understand your GB/sec numbers vs '+2000' - see the test below (at +1500), keeping in mind that I am using an older version of memtest_vulkan.

In any event, it is a bit of a tough call between the NVIDIA 4090 FE and the TUF OC. The former will probably keep a bit better resale value, but unlike the FE, the TUF OC lets you flash other custom vBIOSes (i.e. Strix). I guess you don't want to open the still-sealed TUF OC for a head-to-head comparison. Also, is the 2910 MHz core for the FE you referenced above at 1.05V or 1.10V?


----------



## Dreams-Visions

EarlZ said:


> Thanks for the share! I was unfamiliar with the process and wanted to gain more insight.


cheers!



J7SC said:


> I am not sure I understand your GB/sec numbers vs ' +2000' - see the test below (at +1500) but keeping in mind that I am using an older version of memtest_vulkan.
> 
> In any event, it is a bit of a tough call between the NVidia 4090 FE and the TUF OC. The former will probably keep a bit better resale value, but unlike the FE, you can flash other custom vbios onto the TUF OC (ie. Strix). I guess you don't want to open the still-sealed TUF OC to do head-to-head comparison. Also, is the 2910 MHz core for the FE you referenced above with 1.05V or 1.10V ?
> 
> View attachment 2586909


I'm not sure. I don't know that I can change the V on the FE? It's greyed out in Afterburner.


----------



## Betroz

motivman said:


> got my bykski 4090 waterblock for my gigabyte gaming OC.


Awesome build! But from a pure practicality standpoint I don't understand WHY you would want your PC set up like that, for when/if you need to swap out your motherboard. ALL that hassle just for that. But then again I'm a simple man, so what do I know 😌


----------



## J7SC

Dreams-Visions said:


> cheers!
> 
> 
> I'm not sure. I don't know that I can change the V on the FE? It's greyed out in Afterburner.


...a test like Port Royal on max clocks and power limit w/ HWInfo open in the background should show the max GPU voltage


----------



## Zarbain

Did anyone update the VBIOS on their Asus TUF to v2.1 and see higher temps? My hotspot delta pushes 20C in some games now, and I don't remember it doing this before.


----------



## Dreams-Visions

J7SC said:


> ...a test like Port Royal on max clocks and power limit w/ HWInfo open in the background should show the max GPU voltage


It was .900 or .910V. I had to uninstall Afterburner and install the beta for it to show something other than "0mV". After updating and setting to 1.1V:









I scored 29 882 in Time Spy
I scored 18 369 in Time Spy Extreme
I scored 38 752 in Fire Strike Extreme
I scored 24 284 in Fire Strike Ultra
I scored 28 324 in Port Royal

www.3dmark.com





Definitely increased, and it looks like it will be at least 3030MHz under water with the temps controlled given it's doing that on air.

Was surprised to see it almost hit 600w along the way.

Should I hold onto this one or check that TUF?


----------



## KingEngineRevUp

Dreams-Visions said:


> It was .900 or .910V. I had to uninstall Afterburner and install the beta for it to show something other than "0mV". After updating and setting to 1.1V:
> 
> I scored 29 882 in Time Spy
> I scored 18 369 in Time Spy Extreme
> I scored 38 752 in Fire Strike Extreme
> I scored 24 284 in Fire Strike Ultra
> I scored 28 324 in Port Royal
> 
> Definitely increased, and it looks like it will be at least 3030MHz under water with the temps controlled given it's doing that on air.
> 
> Was surprised to see it almost hit 600w along the way.
> 
> Should I hold onto this one or check that TUF?


The FE is a more sought-after card. It has higher resale value and is rarer than the TUF. Keep it.

OCing the 4090 sucks this generation anyway; you're not going to gain notable performance from one card to another unless you have a dud. If your memory does +2000 perfectly fine, that's golden memory silicon.


----------



## motivman

Betroz said:


> Awesome build! But I don't understand WHY you would want to have your PC like that out of pure practicality - like when/if you need to swap out your motherboard. ALL that hassle to do only that. But then again I'm a simple man, so what do I know 😌


Look again… do you notice the quick disconnects? Swapping my motherboard takes me less than 20 minutes….


----------



## motivman

KingEngineRevUp said:


> The FE is a more sought after card. It has a higher resale value and is more rate than the TUF. Keep it.
> 
> OC 4090 this generation sucks anyways, not going to gain notable performance OC one card to another unless you have a dud. If you're doing+2000 on the memory perfectly fine, that is a golden memory silicon.


I went through 5 4090’s looking for a golden sample… they all performed similarly tbh. If I could do it all over again, would have kept my 2nd FE. The FE just looks premium AF compared to the gigabyte gaming oc and msi trio, plus it used way less power and had a lower variation between core and effective clock. I just wanted to see 3ghz in my osd while gaming, but that doesn’t matter when your effective clock is 100 mhz lower than core clock with these aibs, lol.


----------



## Dreams-Visions

KingEngineRevUp said:


> The FE is a more sought after card. It has a higher resale value and is more rate than the TUF. Keep it.
> 
> OC 4090 this generation sucks anyways, not going to gain notable performance OC one card to another unless you have a dud. If you're doing+2000 on the memory perfectly fine, that is a golden memory silicon.


Understood. TYVM!


----------



## KedarWolf

Zarbain said:


> Did anyone update their Vbios on asus tuf to the v2.1 and see higher temps? My delta pushes 20c for hotspot in some games now and I don't remember it doing this before.
> View attachment 2586916


I think the default fan curve is broken.

I set a custom fan curve, and even though it's saying the RPM is about the same as the default one, my temps are 20C lower.


----------



## Betroz

Is it possible to flash a Strix OC BIOS to a TUF OC card?


----------



## Nizzen

Betroz said:


> Is it possible to flash a Strix OC BIOS to a TUF OC card?


I flashed strix oc 4090 bios to tuf oc 4090 

You got the 4090 tuf?


----------



## J7SC

Running the water-cooled Giga-G-OC at warmer temps to help the VRAM and improved on my previous Superposition 4K/8K (stock settings) by ~ 200 points...Benching via the central heat wall thermostat


----------



## tomerturbo

I have an Asus Strix 4090 OC. During Time Spy Extreme my hot spot temp is 81C. IS IT SAFE? GPU temp 66, memory temp 64.


----------



## Betroz

Nizzen said:


> You got the 4090 tuf?


Not yet. Waiting for Proshop to let me buy it. They have it in stock, but on "notify me" as of now. Waiting to see what happens after 14:00. TUF OC version. The standard non-OC comes next week, maybe. Waiting would save me some money... but I feel I have waited long enough.

Edit: TUF OC order placed. If only my patience would allow me to wait for the non-OC version and save some money... but noooooo.


----------



## alasdairvfr

KingEngineRevUp said:


> Card is on water. I can game for hours, run 3dmark stress test, etc. But at idle, or light load like watching YouTube, sometimes my screen will green screen. Happens mostly when watching YouTube.
> 
> I have +1500 on my memory. I'm assuming the green screening is due to memory OC and memory being too chilled? I know sometimes the screen artifacts and then will green screen.
> 
> Memory temps are in the high 30s or mid 40s when watching YouTube videos.
> 
> I'm assuming I'm going to have to dial the memory back because of this? Seems like memory temps are too low?


This with displayport or hdmi? This can be a result of HDMI handshake error


----------



## motivman

tomerturbo said:


> i have asus strix 4090 oc during time spy extreme my hot spot temp is 81c IS IT SAFE? gpu temp 66 memory temp 64


VERY SAFE, LOL


----------



## motivman

alasdairvfr said:


> This with displayport or hdmi? This can be a result of HDMI handshake error


Nope, this is the cold memory bug on the 4090. His memory can't run +1500 cold. He should try maybe +1400 or even +1200 when not gaming and the memory is cold. On my 4090, I can bench and play at +1850 all day. But if I keep that overclock active and the memory gets below 30C (while working or browsing the internet), my screen goes black. At +1500, it doesn't matter how cold my memory gets; it remains stable with no black screen.


----------



## KedarWolf

tomerturbo said:


> i have asus strix 4090 oc during time spy extreme my hot spot temp is 81c IS IT SAFE? gpu temp 66 memory temp 64


New 2.1 bios Auto fan curve is bugged. High temps. Use Afterburner and set a custom fan curve. Even though it said my RPM was about the same, my temps all dropped about 20C.
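For anyone setting this up, a custom fan curve is just a set of (temperature, duty) points with linear interpolation between them. A generic sketch of how such a curve is evaluated; the points below are illustrative, not Asus or Afterburner defaults:

```python
# Generic fan-curve evaluation: linear interpolation between (temp C, fan %)
# points, clamped at the ends. Curve points here are made-up examples.

CURVE = [(30, 30), (50, 45), (65, 70), (80, 100)]

def fan_percent(temp_c: float, curve=CURVE) -> float:
    """Interpolate the fan duty cycle for a given GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, p0), (t1, p1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(40))  # 37.5
print(fan_percent(90))  # 100
```

A "broken" default curve as described above would just be one whose duty values are too low for the card's actual heat at a given temperature, which is why a custom curve with similar reported RPM can still behave very differently.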


----------



## Panchovix

Betroz said:


> Is it possible to flash a Strix OC BIOS to a TUF OC card?


It is possible, and it would probably work fine.
I have the TUF non-OC, and the Strix vBIOS doesn't work that well for me: I get lower effective clocks and more power consumption than with my default TUF vBIOS or the TUF OC vBIOS (which I flashed).




Zarbain said:


> Did anyone update their Vbios on asus tuf to the v2.1 and see higher temps? My delta pushes 20c for hotspot in some games now and I don't remember it doing this before.
> View attachment 2586916


Not sure if it started after the VBIOS update, but the TUF itself runs hot af; it probably had "lower" temps on the stock VBIOS. My TUF is still on the stock air cooler, and I had to mount it vertically, open the case, and add some fans pointed directly at the card to get the hotspot down to 70-75°C max and the core to 60-65°C lol, with 25-30°C ambient (77°F-86°F). Before adding the extra fans I had around 70°C core and 90°C hotspot.


----------



## ttnuagmada

You guys who are watercooled: What kind of memory temps are you seeing?


----------



## motivman

ttnuagmada said:


> You guys who are watercooled: What kind of memory temps are you seeing?


----------



## alasdairvfr

motivman said:


> Nope, this is the cold memory bug on the 4090. His memory cannot run +1500 cold. He should try maybe +1400 or even +1200 when not gaming and memory is cold. On my 4090, I can bench and play at +1850 all day. But If I keep that overclock active, and the memory gets to below 30C (while working or browsing internet), my screen goes black. At +1500, doesn't matter how cold my memory gets, it remains stable an no blackscreen.


I suspect you are right. I'm used to green screens from my Nvidia Shield on wake; it never happens randomly.

Running a low mem OC profile for desktop is probably the quick solution; longer term would be to remount the block with a less effective pad on the memory. Probably another 5-10C higher average temp would resolve this.


----------



## motivman

alasdairvfr said:


> I suspect you are right. I am used to green screen from my Nvidia Shield on wake. Never happens randomly.
> 
> Probably running low mem oc profile for desktop is the quick solution; longer term would be to remount the block and use a less effective pad for the memory, probably another 5-10C average temp higher would resolve this.


What I do is disable my overclock when not gaming; granted, +1500 works for me no matter how cold my VRAM gets. I like to play at +1800 VRAM, so I load up the game, let the VRAM heat up, ALT+Tab to Afterburner and enable my +1800 overclock profile.. haha.
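That warm-up dance boils down to a temperature gate. A minimal sketch of the logic, using the numbers from this thread (+1500 stable at any temp, +1800 only once the VRAM is warm; the 30C cut-over is taken from the black-screen behaviour described earlier):

```python
# Temperature-gated VRAM offset: apply the big OC only once the memory is warm.
# Offsets and the 30C threshold mirror the behaviour described in this thread;
# they are one card's values, not universal limits.

SAFE_OFFSET = 1500       # stable no matter how cold the VRAM gets
WARM_OFFSET = 1800       # gaming OC, only stable once the VRAM has heated up
WARM_THRESHOLD_C = 30.0  # below this, the big offset black-screens

def pick_mem_offset(vram_temp_c: float) -> int:
    """Return the VRAM offset to apply for the current memory temperature."""
    return WARM_OFFSET if vram_temp_c >= WARM_THRESHOLD_C else SAFE_OFFSET

print(pick_mem_offset(25.0))  # 1500: desktop / cold-soaked
print(pick_mem_offset(45.0))  # 1800: warmed up in-game
```

Applying the chosen offset would still go through Afterburner profiles (e.g. the hotkey switching mentioned later in the thread); this only sketches the decision.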


----------



## WayWayUp

Dreams-Visions said:


> Just hooked up my FE and did some testing. Looking like my best, stable config is for max benchmarks is:
> +133
> +185 core
> +2000 mem (maxed)
> 
> On air, that's about 2910MHz steady in testing, ~60C. No coil whine.
> 
> I scored 27 725 in Port Royal
> I scored 30 015 in Time Spy
> I scored 18 032 in Time Spy Extreme
> I scored 23 450 in Fire Strike Ultra
> I scored 37 673 in Fire Strike Extreme
> 
> Is this a keeper? Looking for a second opinion before I fully commit and put it under water. I have a TUF OC in shrink wrap, trying to figure out if it's even worth taking out of the box.


Are you dedicated to that platform? It's hard to tell how you are doing in benchmarks, mainly because your CPU/RAM combo seems to be bottlenecking you in all the 3DMark applications. Can you at least tweak your RAM timings if you're staying at 3600MHz with four DIMMs?

Did you do any kind of tuning on your RAM? Please post in the AMD DDR4 thread and we can give some conservative suggestions that will surely be much better than current.

Your memory OC is golden bin and your core clocks seem fine.
Definitely a keeper.


----------



## yzonker

alasdairvfr said:


> I suspect you are right. I am used to green screen from my Nvidia Shield on wake. Never happens randomly.
> 
> Probably running low mem oc profile for desktop is the quick solution; longer term would be to remount the block and use a less effective pad for the memory, probably another 5-10C average temp higher would resolve this.


I don't think lower-quality pads will solve black screening at idle. The mem hardly makes any heat at idle, so it will still cold-soak to about the same temperature (very near water temp); load temps will just be higher. It would probably help for benchmarking, but not for daily use.


----------



## Panchovix

https://store.nvidia.com/de-de/geforce/store/?page=1&limit=9&locale=de-de



Official price drop of the 4090FE in Europe it seems (almost 100 euros less), maybe it will drop more if AMD releases something like a 7950XTX?


----------



## Sheyster

KingEngineRevUp said:


> I guess imagine if the card ran at base and we had to OC the numbers ourselves lol, we would all be so happy.


If you use High Performance power mode in NVCP, it's still kind of the same. Of course there will be some bins dropped once temps go up.


----------



## supersym

Hi!
I can't push the power limit past 110% on my Aorus Waterforce Xtreme... do you know if I can override that 110%?


----------



## motivman

yzonker said:


> I don't think low quality pads will solve black screening at idle. The mem doesn't make hardly any heat at idle, so it will still cold soak to about the same temp (very near water temp). Load temps will just be higher. It would help for benchmarking probably, but not for daily use.


The simple solution is to turn off the overclock when not gaming and enable the high VRAM overclock after warming up the memory in-game... that has been working for me, even with my waterblock. The thing is that with a waterblock, I have to enable my profile quickly and get back in the game, because VRAM temps drop very fast compared to when I was air-cooled.


----------



## SilenMar

Laithan said:


> Each individual game draws different power and on top there are different resolutions and game settings that affect power draw. I think what people are saying/showing is that most games will not max out @ 600W (or near) but _some_ will. If you are seeing close to 600W ("580W+") on EVERY game then there is something worthy of a deep dive on your system as that isn't typical.


Small games like Ghostbusters easily run at 70-80fps using 16GB of VRAM.
Old games like Resident Evil at 8K hit 580W all the time.
New games are going to push hardware limits, so saying games run at 450W won't stand the test of time. It's typical for now, since the CPU and RAM are the bottleneck, but not for long.


----------



## Krzych04650

motivman said:


> What I do is I disable my overclock when not gaming, granted +1500 works for me no matter how cold my vram gets. I like to play at +1800 vram, so what I do is I load up the game, let vram heat up, ALT+tab to afterburner and enable my +1800 overclock profile.. haha.


You can have profiles on hotkeys, I am using CTRL + arrows. Useful for quickly switching to different profiles to see the impact. Also some games really don't like alt-tabbing.


----------



## yzonker

I personally haven't even been bothering to OC my card while gaming. Nothing I'm playing really needs it at all. Certainly no reason to jump through hoops to run +1800 mem vs +1600 mem. Sacrilege in an OCN thread. lol


----------



## Panchovix

yzonker said:


> I personally haven't even been bothering to OC my card while gaming. Nothing I'm playing really needs it at all. Certainly no reason to jump through hoops to run +1800 mem vs +1600 mem. Sacrilege in an OCN thread. lol


It is true though; probably most people leave the card as it is, or undervolt (maybe undervolt + overclock the VRAM). At the end of the day, the power/performance above 450W is not that worth it lol (for games at least)


----------



## Zarbain

KedarWolf said:


> New 2.1 bios Auto fan curve is bugged. High temps. Use Afterburner and set a custom fan curve. Even though it said my RPM was about the same, my temps all dropped about 20C.


Thanks for letting me know! I had Afterburner installed, but I reinstalled it just to be certain. With Afterburner, do I need to set auto and then a custom fan curve for it to work, or is there a better method? I'm new to it; having run a 6900 XT the last few years and a 1070 Ti before that, I didn't really mess around with Afterburner. I'm wondering if it's really this new Samsung Neo 4K monitor I'm running everything on now and I just didn't notice until recently.


----------



## Laithan

Krzych04650 said:


> You can have profiles on hotkeys, I am using CTRL + arrows. Useful for quickly switching to different profiles to see the impact. Also some games really don't like alt-tabbing.


Simple yet a good idea for sure. I have always had keyboards with custom keys and never really used them.. 

Looks like you can use MSI AB profiles tab to automate this (I didn't test it but should work)


----------



## KingEngineRevUp

alasdairvfr said:


> This can be a result of HDMI handshake error


That's a good observation. I have had issues with the HDMI cables I am using. Even though they're advertised as rated for 8K and HDMI 2.1, on my 3080 Ti they caused similar green screens. But disconnecting and reconnecting the HDMI cord would always fix it.

For the 4090, when I disconnect and reconnect it's still a green screen, but I can still hear video playback in the background.

What do you think it is? There is still the matter of some rare occurrences where it artifacts before green screening. It hasn't happened since I put my memory down to +1350 though.


----------



## SilenMar

Panchovix said:


> It is true though; probably most people leave the card as it is, or undervolt (and maybe undervolt + overclock VRAM); at the end, the power/performance above 450W is not really worth it lol (for games at least)


Not true at all. The value of a card, or any product, is all about these details: how hard it can be pushed and how well it is built.

There is money in it. The harder it can be pushed and the better the build quality, the higher the price it commands.

That's why you can make $1000 selling the Strix right now if you bought it at launch. You can still make a profit selling it later.

But if you bought a 450W card simply for games, there is nothing more you can make out of it, and you can even lose money in comparison.


----------



## Krzych04650

SilenMar said:


> Pretty sure he didn't lock the voltage. Or the CPU and RAM are not doing their job properly.
> I don't play games. But when I do, there is an easy 580W on my sleeper/work PC with just the 600W vBIOS.
> 
> 
> 
> 
> 
> 
> View attachment 2586831


You are talking a lot but have nothing to show for it. My post is very transparent with list of games and screenshots provided for each game, with 26 out of 30 games showing clearly that they are at 1100mv, save for 4 that still hit power limit even with 630W BIOS, and every single game is at 99% GPU usage with very high resolutions used. If you think this is incorrect, then run the same games and show us how you run all of them at 580W+.



yzonker said:


> I personally haven't even been bothering to OC my card while gaming. Nothing I'm playing really needs it at all. Certainly no reason to jump through hoops to run +1800 mem vs +1600 mem. Sacrilege in an OCN thread. lol





Panchovix said:


> It is true though; probably most people leave the card as it is, or undervolt (and maybe undervolt + overclock VRAM); at the end, the power/performance above 450W is not really worth it lol (for games at least)


A simple +120/+1200 overclock that any card should be able to achieve brings a 7-10% performance increase and doesn't require going above 450W. It is free performance that doesn't require extra cooling or power; you are just tightening away the headroom that factory settings left for stability, so there is no reason not to take it.


----------



## Dreams-Visions

WayWayUp said:


> are you dedicated to that platform? it's hard to tell how you are doing in benchmarks, mainly because your cpu/ram combo seems to be bottlenecking you in all 3dmark applications
> can you tweak your ram timings at least, if you're devoted to 3600MHz x4 DIMMs?
> 
> did you do any kind of tuning to your ram? please post in the amd ddr4 thread and we can give some conservative suggestions that will surely be much better than current
> 
> your memory OC is golden bin and your core clocks seem fine.
> definitely a keeper


ty for the response.

Yep, ram is tuned, CPU isn't beyond some basics. I expect to move up to a 7950X3D whenever they come out in Q1


----------



## motivman

Krzych04650 said:


> You can have profiles on hotkeys, I am using CTRL + arrows. Useful for quickly switching to different profiles to see the impact. Also some games really don't like alt-tabbing.


NICE... didn't even know that. This is a game changer for me, lol.


----------



## SilenMar

Krzych04650 said:


> You are talking a lot but have nothing to show for it. My post is very transparent with list of games and screenshots provided for each game, with 26 out of 30 games showing clearly that they are at 1100mv, save for 4 that still hit power limit even with 630W BIOS, and every single game is at 99% GPU usage with very high resolutions used. If you think this is incorrect, then run the same games and show us how you run all of them at 580W+.


That is because the CPU and RAM are the bottleneck. That won't stay true for long.


----------



## WayWayUp

I kinda got screwed over with the X3D. Rumors suggested we would only see 6- and 8-core variants, and this was coming from reliable leakers. This rumor lasted a while.

Now they say there WILL be a 7950X3D... after I finally pulled the trigger and bought a 13900K.
Unless it's 10% or more faster I will stick with my current CPU. But I can't compare the review benchmarks, as the 13900Ks aren't tuned properly for memory, which yields large gains, plus they are not overclocked, and the X3D chips don't really OC.

So I need a Framechasers comparison, sadly.


----------



## Panchovix

SilenMar said:


> Not true at all. The value of a card, or any product, is all about these details: how hard it can be pushed and how well it is built.
> 
> There is money in it. The harder it can be pushed and the better the build quality, the higher the price it commands.
> 
> That's why you can make $1000 selling the Strix right now if you bought it at launch. You can still make a profit selling it later.
> 
> But if you bought a 450W card simply for games, there is nothing more you can make out of it, and you can even lose money in comparison.





Krzych04650 said:


> You are talking a lot but have nothing to show for it. My post is very transparent with list of games and screenshots provided for each game, with 26 out of 30 games showing clearly that they are at 1100mv, save for 4 that still hit power limit even with 630W BIOS, and every single game is at 99% GPU usage with very high resolutions used. If you think this is incorrect, then run the same games and show us how you run all of them at 580W+.
> 
> 
> 
> 
> A simple +120/+1200 overclock that any card should be able to achieve brings a 7-10% performance increase and doesn't require going above 450W. It is free performance that doesn't require extra cooling or power; you are just tightening away the headroom that factory settings left for stability, so there is no reason not to take it.


I mean yes, we do it because we like to overclock (and as Krzych04650 says, I do overclock, but at 450W even though I can use 600W), but the average buyer/user (maybe not 4090 users, since it is on the enthusiast end) won't bother to overclock.

For example, I'm inclined to think that if you buy a Strix, you will overclock. Not necessarily true, but people who like to overclock often get Strix cards.

Anyway, at the end it depends on every user, and we here are inclined to overclock, because it's fun.


----------



## dr/owned

yzonker said:


> I personally haven't even been bothering to OC my card while gaming. Nothing I'm playing really needs it at all. Certainly no reason to jump through hoops to run +1800 mem vs +1600 mem. Sacrilege in an OCN thread. lol


For me, the latest update of Fortnite to UE5.1 basically is back to killing FPS if you turn on the new crap like Nanite texture mapping. I think I maybe get like 100-120fps at 4K. Unless you turn on blur-mode aka render lower and then hope upscale magic fixes it...which seems to defeat the point of doing high res stuff in the first place. Ray tracing is still a disaster for fps which is kinda lame on a 3rd gen RTX product.


----------



## yzonker

dr/owned said:


> For me, the latest update of Fortnite to UE5.1 basically is back to killing FPS if you turn on the new crap like Nanite texture mapping. I think I maybe get like 100-120fps at 4K. Unless you turn on blur-mode aka render lower and then hope upscale magic fixes it...which seems to defeat the point of doing high res stuff in the first place. Ray tracing is still a disaster for fps which is kinda lame on a 3rd gen RTX product.


But can you tell the difference if you drop the mem OC by 100 or 200? That is the only point I was really making, that pushing the mem OC to the point that you have to jump through hoops to make it work is not worth the effort (for me anyway). There's no meaningful performance difference.


----------



## Krzych04650

WayWayUp said:


> I kinda got screwed over with the X3D. Rumors suggested we would only see 6- and 8-core variants, and this was coming from reliable leakers. This rumor lasted a while.
> 
> Now they say there WILL be a 7950X3D... after I finally pulled the trigger and bought a 13900K.
> Unless it's 10% or more faster I will stick with my current CPU. But I can't compare the review benchmarks, as the 13900Ks aren't tuned properly for memory, which yields large gains, plus they are not overclocked, and the X3D chips don't really OC.
> 
> So I need a Framechasers comparison, sadly.


Yea, it is increasingly difficult to get any value from review numbers. A lot of reviews are going to show Zen4 X3D as faster than the 13900K, but this will be a comparison between an X3D already at its max vs a gimped stock 13900K with random XMP. These numbers are only relevant for people who leave things stock.
Though with the massive game to game inconsistency that Zen4 has I don't think X3D is going to be that good. Some games will fly, but some will remain much slower than 13900K. It doesn't matter really, the chapter is closed already, next big thing is 2H 2024 with Arrow Lake and whatever AMD will have. Anything in between will not have big enough margins vs 13900K.


----------



## J7SC

ttnuagmada said:


> You guys who are watercooled: What kind of memory temps are you seeing?


High 30s to low 50s C, depending on ambient temps and load. 90% of the time, it lingers in the low 40s C.



yzonker said:


> But can you tell the difference if you drop the mem OC by 100 or 200? That is the only point I was really making, that pushing the mem OC to the point that you have to jump through hoops to make it work is not worth the effort (for me anyway). There's no meaningful performance difference.


...for all but hard benching, I leave the card either on stock or give it a mild oc. Even Cyberpunk 2077 and FS2020 don't get more than my max gaming oc which is 1.05V, 115% PL, +220 core, +1400 on the VRAM.



Krzych04650 said:


> Yea, it is increasingly difficult the get any value from review numbers. A lot of reviews are going to show Zen4X3D as faster than 13900K, but this will be the comparison between X3D already at its max vs gimped stock 13900K with random XMP. These numbers are only relevant for people who leave things stock.
> Though with the massive game to game inconsistency that Zen4 has I don't think X3D is going to be that good. Some games will fly, but some will remain much slower than 13900K. It doesn't matter really, the chapter is closed already, next big thing is 2H 2024 with Arrow Lake and whatever AMD will have. Anything in between will not have big enough margins vs 13900K.


...there are just a few reviewers I trust (including GN), and it really does come down to what RAM is used and how it is set up for the CPU, never mind CPU options such as PBO/2, CO etc. (on AMD) and equivalent performance steps on, for example, the 13900K.

In a way, the 4090 eliminated the need for an _immediate_ mobo/CPU/RAM upgrade from my 5950X combo, especially with DLSS3 & Co, for my fav apps. Anyway, I am still undecided re. AM5 vs Meteor Lake / Arrow Lake. AM5 boards (7950X3D?) should have a much longer useful life incl. future Zen chips than Intel boards as Intel likes to throw new chipsets out there even when not needed. For a variety of reasons, I need a board that can run at least 64GB of RAM (128 GB preferred, so 4 RAM slots), along with dual network chips so that gets me into a different clas$ of mobos.


----------



## yzonker

J7SC said:


> High 30s to low 50s C, depending on ambient temps and load. 90% of the time, it lingers in the low 40s C.
> 
> 
> 
> ...for all but hard benching, I leave the card either on stock or give it a mild oc. Even Cyberpunk 2077 and FS2020 don't get more than my max gaming oc which is 1.05V, 115% PL, +220 core, +1400 on the VRAM.
> 
> 
> 
> ...there are just a few reviewers I trust (including GN) and it really does come down what RAM is used and how it is set up for the CPU, never mind CPU options such as PBO/2, CO etc (on AMD) and equivalent performance steps on for example the 13900K.
> 
> In a way, the 4090 eliminated the need for an _immediate_ mobo/CPU/RAM upgrade from my 5950X combo, especially with DLSS3 & Co, for my fav apps. Anyway, I am still undecided re. AM5 vs Meteor Lake / Arrow Lake. AM5 boards (7950X3D?) should have a much longer useful life incl. future Zen chips than Intel boards as Intel likes to throw new chipsets out there even when not needed. For a variety of reasons, I need a board that can run at least 64GB of RAM (128 GB preferred, so 4 RAM slots), along with dual network chips so that gets me into a different clas$ of mobos.


You'll want to look carefully at motherboard selection since DDR5 still struggles with 4 dimms. Hopefully that will improve with ML/AL. Right now I think you still need a 2 dimm board to get close to 8000 on Z790. Probably good to wait as long as possible.


----------



## J7SC

yzonker said:


> You'll want to look carefully at motherboard selection since DDR5 still struggles with 4 dimms. Hopefully that will improve with ML/AL. Right now I think you still need a 2 dimm board to get close to 8000 on Z790. Probably good to wait as long as possible.


Yup, I just watched Level1Techs on that a few days ago. While primarily looking at Zen4, there are also some comps at the end to Intel's Raptor Lake. FYI, the same channel looked at the new Epyc 12-channel DDR5 (w/ECC), and the Zen 4 cores in Epyc Genoa seem to have a better IMC and/or microcode than the current Zen 4 in Ryzen 7000.

For now, the 5950X / DarkH is still making me happy (bottom screenies). Granted, it can't compete on some benches, but it is running 4 sticks of DDR4 / 3800 CR1 at 14-13-13. Depending on how I set it, it can still boogie - but a nice shiny AM5 7950X3D would be mighty tempting.


----------



## alasdairvfr

yzonker said:


> I personally haven't even been bothering to OC my card while gaming. Nothing I'm playing really needs it at all. Certainly no reason to jump through hoops to run +1800 mem vs +1600 mem. Sacrilege in an OCN thread. lol


I have my "light" OC of +150 core and +1200 memory that I want to say has never crashed a game, but then again 40K Darktide crashed for me like 50x (the game was/is unstable), so maybe the OC didn't help. I run this at a minimum; I can usually get away with my "heavy" OC of +225 on core and +1500 on memory, but some games don't like it.

I always run the "light" OC though; my panel is 4K/144 Hz, so it can make enough of a difference to be worth it.


----------



## SilenMar

Panchovix said:


> I mean yes, we do it because we like to overclock (and as Krzych04650 says, I do overclock, but at 450W even though I can use 600W), but the average buyer/user (maybe not 4090 users, since it is on the enthusiast end) won't bother to overclock.
> 
> For example, I'm inclined to think that if you buy a Strix, you will overclock. Not necessarily true, but people who like to overclock often get Strix cards.
> 
> Anyway, at the end it depends on every user, and we here are inclined to overclock, because it's fun.


An average buyer just pays a higher price for a card without knowing it is actually built down to 450W on the cheap.

Also, better to benchmark games that run at 600W instead of 450W. Bottlenecked and poorly optimized games sit around 450W.


----------



## neteng101

WayWayUp said:


> now they say there WILL be a 7950x3d... after i finally pulled the trigger and bought a 13900k.


It won't matter IMO - 7950X3D will be the 7900XTX to a 13900K being the 4090. Without stronger IPC which is pretty much where AMD went wrong by not adopting big-little CPU micro-architecture, they're just repeating the bulldozer approach ie. more cores = higher end SKU, but most games will never scale beyond 8 physical cores currently. 7600X3D and 7800X3D would actually make sense, the 7950X3D is just pumping money down the drain that will not really mean much for gaming.


----------



## yzonker

alasdairvfr said:


> I have my "light" OC of +150 core and +1200 memory that I want to say has never crashed a game but then again 40K Darktide crashed for me like 50x (the game was/is unstable) so maybe the OC didn't help. I run this at a minimum, I can usually get away with my "heavy" OC +225 on core and +1500 on memory but some games don't like this.
> 
> I always run the "light" OC though, my panel is 4k/144Hz so it can make a difference to be worth it.


Metro Exodus EE is my go to game to test OC's. It will usually crash when the rest of my games appear stable including CP2077, RDR2, GTAV, SotTR, etc...


----------



## alasdairvfr

yzonker said:


> Metro Exodus EE is my go to game to test OC's. It will usually crash when the rest of my games appear stable including CP2077, RDR2, GTAV, SotTR, etc...


Is there a specific graphic setting or just max it all out? I probably played 10 min of this game since getting the card so not much of a test tbh


----------



## yzonker

alasdairvfr said:


> Is there a specific graphic setting or just max it all out? I probably played 10 min of this game since getting the card so not much of a test tbh


Yea, max settings in 4k with RT.


----------



## ttnuagmada

I can't speak for Warzone 2, but Warzone 1 would find stability issues that wouldn't show up anywhere else on my 3080 and 3090.


----------



## 8472

Just tested out my second Tuf non-OC, second one has loud coil whine too. Smh.


----------



## yzonker

J7SC said:


> Yup, I just watched Level1Techs on that a few days ago. While primarily looking at Zen4, there are also some comps at the end to Intel's Raptor Lake. FYI, the same channel looked at the new Epyc 12-channel DDR5 (w/ECC), and the Zen 4 cores in Epyc Genoa seem to have a better IMC and/or microcode than the current Zen 4 in Ryzen 7000.
> 
> For now, the 5950X / DarkH is still making me happy (bottom screenies). Granted, it can't compete on some benches, but it is running 4 sticks of DDR4 / 3800 CR1 at 14-13-13. Depending on how I set it, it can still boogie - but a nice shiny AM5 7950X3D would be mighty tempting.
> 
> 
> 
> 
> 
> 
> View attachment 2587026



Well 128gb is obviously not an option. Maybe 64 though.


----------



## Panchovix

ttnuagmada said:


> I can't speak for Warzone 2, but Warzone 1 would find stability issues that wouldn't show up anywhere else on my 3080 and 3090.


I get pretty weird artifacts at 3045 MHz+ on Warzone 2; all other games seem fine at those clocks or 3030 MHz, so it's a pretty good game to test OC stability.


----------



## GraphicsWhore

rahkmae said:


> View attachment 2586766
> 
> Removed and reinstalled it all again; I think there was nothing wrong with the contact... maybe a little too much thermal paste. I put less on now and nothing changed, same temps. I don't know what the problem is. 3 radiators (2x 360 and 1x 280), EKWB D5 pump, all at max, and full OC temps of 61-64...
> View attachment 2586767
> 
> Valley test, +195 GPU and +1300 memory
> View attachment 2586768
> 
> And the pump at 4000 RPM and all fans at 1000 RPM; now the temperatures are even worse than before...


Something physically wrong with the block? Is your CPU also water-blocked, and if so, are those temps normal?


----------



## MrB123

I will get my Bykski water block for my Giga OC on Wednesday.
Any suggestions on how to keep the temps up on the memory? My loop water is about 14C.

Maybe cut the pads slimmer to only cover 50-70% of the mem, or will that damage the card if used permanently?


----------



## KingEngineRevUp

yzonker said:


> I personally haven't even been bothering to OC my card while gaming. Nothing I'm playing really needs it at all. Certainly no reason to jump through hoops to run +1800 mem vs +1600 mem. Sacrilege in an OCN thread. lol


Agreed, besides it being a hassle switching profiles, I have OC+UV'd my card and it is smashing most games to my monitor refresh rate of 4K @ 120 fps, and GPU usage is lower than 95% utilization, a lot of times at 80%. My max OC would yield like +4%. That would mean I would need it if I was at 115 fps and needed to hit 120 fps.

But running 1.1v vs. 0.950v is quite a bit more heat, enough to be unpleasant in my desert climate.


----------



## yzonker

KingEngineRevUp said:


> Agreed, besides it being a hassle switching profiles, I have OC+UV'd my card and it is smashing most games to my monitor refresh rate of 4K @ 120 fps, and GPU usage is lower than 95% utilization, a lot of times at 80%. My max OC would yield like +4%. That would mean I would need it if I was at 115 fps and needed to hit 120 fps.
> 
> But running 1.1v vs. 0.950v is quite a bit more heat, enough to be unpleasant in my desert climate.


Yea maybe I need to consider UV again. I just tested 950mv with +150/+1600. Switching from that UV to 1100mv was only 3-4 fps in CP2077 (71 to 74-75). Power went from ~325w to ~450w. 

Although I always ran either 950mv or 1000mv on my 3090 as well, but it would still pull nearly 500w in some games. Really shows the efficiency of the 4090.
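Quick perf-per-watt math on those readings (a rough sketch in Python; the fps and wattage numbers are my approximate in-game averages, nothing precise):

```python
# Perf-per-watt comparison from the CP2077 readings above (approximate averages).
uv_fps, uv_watts = 71, 325     # 950 mV undervolt, +150/+1600
oc_fps, oc_watts = 74.5, 450   # 1100 mV, same offsets

uv_eff = uv_fps / uv_watts     # fps per watt undervolted
oc_eff = oc_fps / oc_watts     # fps per watt at full voltage

print(f"UV: {uv_eff:.3f} fps/W, full voltage: {oc_eff:.3f} fps/W")
print(f"UV gives ~{uv_eff / oc_eff - 1:.0%} better fps/W for ~{1 - uv_fps / oc_fps:.0%} less fps")
```

So roughly a third better efficiency for about a 5% fps hit, going by those averages.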


----------



## Sheyster

yzonker said:


> I personally haven't even been bothering to OC my card while gaming.


Same here.. I'm running the ASUS Strix BIOS with the default (small) OC over stock and default 500w power limit for gaming. I guess if you have the 4K240 Samsung G8 monitor you might need to, but I'm at 4K120.


----------



## J7SC

Sheyster said:


> Same here.. I'm running the ASUS Strix BIOS with the default (small) OC over stock and default 500w power limit for gaming. I guess if you have the 4K240 Samsung G8 monitor you might need to, but I'm at 4K120.


I wonder if you could do me/us a favour if you have the time. Can you run memtest_vulkan (per below) at 1500 MHz VRAM speed per GPU-Z, with everything else being bone-stock (i.e. voltage, PL, core), please? I am wondering about the Strix vBIOS timings etc. at a given speed when flashed onto the Gigabyte Gaming OC vs. the stock vBIOS (below)... bandwidth numbers could be a bit of a proxy. That way, I could compare...


----------



## Sheyster

J7SC said:


> I wonder if you could do me/us a favour if you have the time. Can you run memtest_vulkan (per below) at 1500 MHz VRAM speed per GPU-Z, with everything else being bone-stock (i.e. voltage, PL, core), please? I am wondering about the Strix vBIOS timings etc. at a given speed when flashed onto the Gigabyte Gaming OC vs. the stock vBIOS (below)... bandwidth numbers could be a bit of a proxy. That way, I could compare...
> View attachment 2587067


I could, BUT keep in mind I was not able to get past +1400 on memory on the GB-OC BIOS. This card isn't a memory lottery winner.


----------



## Dreams-Visions

neteng101 said:


> It won't matter IMO - 7950X3D will be the 7900XTX to a 13900K being the 4090. Without stronger IPC which is pretty much where AMD went wrong by not adopting big-little CPU micro-architecture, they're just repeating the bulldozer approach ie. more cores = higher end SKU, but most games will never scale beyond 8 physical cores currently. 7600X3D and 7800X3D would actually make sense, the 7950X3D is just pumping money down the drain that will not really mean much for gaming.


This feels like an odd oversimplification of the situation.

The 7950X was already very competitive in 95% of games and will likely get a clear and significant boost from that v-cache (as reflected in the rumored pricing). Further, it provides a strong entry point for people who would like to build a new rig sometime between now and summer, as they can buy with the confidence of knowing the new AMD platform will be able to accept new CPU upgrades for at least the next 3 years (generations) while they enjoy what is likely to be the best range of consumer-grade CPUs on the market until the fall.

While upgrading from a 12xxxK or similar to a 13xxxK makes plenty of sense since one presumably already owns an appropriate mobo and memory, building Intel from scratch right now means buying into an EOL platform which makes little sense. And depending on how big you like to swing for your mobo (these things can go up to and beyond $1,000 USD now), that investment overall feels very, very poorly considered. For builders looking at DDR-5 based platforms over the next 6 months, the arrival of X3D means it's time to jump in. Regardless of your strange GPU comparisons. Implying that they will perform *poorly* with v-cache is just a bad look for you if your intention was to appear objective.


----------



## J7SC

Sheyster said:


> I could, BUT keep in mind I was not able to get past +1400 on memory on the GB-OC BIOS. This card isn't a memory lottery winner.


ok, thanks ! 1400 works, too.


----------



## neteng101

Dreams-Visions said:


> Regardless of your strange GPU comparisons. Implying that they will perform *poorly* with v-cache is just a bad look for you if your intention was to appear objective.


It's not the v-cache, just the CPU architecture. Those CPUs come nearly tapped out from AMD; if you're not overclocking, you're getting most of the performance out of the box. The ridiculousness of PBO overclocking only yielding a bit more clock means they will never be competitive with overclocked, tuned Intel CPUs in gaming, save for a few niche titles that favor that oversized cache. AMD's boards don't really age well either - older platforms like the B450 were pretty ****ty for overclocking the next gen of CPUs, and it took forever for AMD to even support them correctly in AGESA. And aside from the extreme OC tweakers, you don't need to go for top-end Intel boards - 99% of the performance is there from a reasonably priced board.

But I digress - go throw your money away on CPUs with way too many cores for games to take advantage of, without understanding what it really takes to get the most from games and your money at the same time. If you want one, I'd suggest sticking with the 6/8-core ones for gaming; a 7950X3D is just flushing extra money down the toilet. You complain about Intel board costs and quote a CPU that's a bad value for gaming at the same time - you really need to sort your priorities out.


----------



## SilenMar

yzonker said:


> I personally haven't even been bothering to OC my card while gaming. Nothing I'm playing really needs it at all. Certainly no reason to jump through hoops to run +1800 mem vs +1600 mem. Sacrilege in an OCN thread. lol


It doesn't even feel like an OC thread. It is more like a UV thread lol.


----------



## KingEngineRevUp

yzonker said:


> Yea maybe I need to consider UV again. I just tested 950mv with +150/+1600. Switching from that UV to 1100mv was only 3-4 fps in CP2077 (71 to 74-75). Power went from ~325w to ~450w.
> 
> Although I always ran either 950mv or 1000mv on my 3090 as well, but it would still pull nearly 500w in some games. Really shows the efficiency of the 4090.


Yeah, my own testing shows it's definitely worth UV+OC.

I compared my stable clocks to other scenarios. Note, this is just booting my PC up with RGB software, virus scanner running, etc. - how I use my PC on a normal basis, since that was the idea: how is my performance running all my bloat?

Result: www.3dmark.com

Performance is -8% from my best OC but power is about 20% less.

I guess I'll throw on the max OC if I'm playing a game where I'm just shy of reaching 120 FPS.
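For anyone checking the perf-per-watt on that (a rough sketch; -8% performance and -20% power are my rounded 3DMark deltas):

```python
# Relative perf-per-watt change: -8% performance at -20% power draw.
perf_delta, power_delta = -0.08, -0.20
eff_gain = (1 + perf_delta) / (1 + power_delta) - 1
print(f"~{eff_gain:.0%} better performance per watt")  # ~15%
```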



SilenMar said:


> It doesn't even feel like an OC thread. It is more like an UV thread lol.


UV can be a form of OC, or, uh... underclocking (UC). In most cases I have seen it as a form of OC.

I mean... I've met users who neutered their 3090 to run like a 3070... But those are the people now running 4090s with 4790Ks thinking it's still okay.


----------



## KingEngineRevUp

neteng101 said:


> Those CPUs come nearly tapped out from AMD [...] AMD's boards don't really age well either - older platforms like the B450 were pretty ****ty for overclocking the next gen of CPUs, and it took forever for AMD to even support them correctly in AGESA.


If the CPU comes boosting and self-OCing to nearly tapped out already... is the motherboard really the problem then? Dropping a 5800X3D into a B450 is a hassle-free, kick-ass way for a B450 owner to end their AM4 run.

Manual OC is dying. Boost does it all for us.


----------



## Betroz

KingEngineRevUp said:


> I have OC+UV'd my card and it is smashing most games to my monitor refresh rate of 4K @ 120 fps, and GPU usage is lower than 95% utilization, a lot of times at 80%.


CPU or RAM bottleneck most likely.


----------



## Betroz

KingEngineRevUp said:


> Manual OC is dying. Boost does it all for us.


Yes. If you OC a 13900K from stock 5500 all-core to 5800 all-core (about the max), that is a 5.45% increase in clockspeed, and *IF* gaming performance scaled the same, it is still a puny 5.45%...
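The arithmetic behind that, as a one-liner sketch:

```python
# All-core overclock gain: 5500 MHz stock -> 5800 MHz.
stock_mhz, oc_mhz = 5500, 5800
gain_pct = (oc_mhz / stock_mhz - 1) * 100
print(f"{gain_pct:.2f}% clockspeed increase")  # 5.45%
```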


----------



## mirkendargen

MrB123 said:


> I will get my Bykski water block to my Giga OC on wednesday.
> any suggestion how to keep the temp up on the memory? my loop water is about 14c.
> 
> maybe cut the pads slimmer only cover 50%-70% of the mem, or will that damage the card if used permanently?


If you want to experiment for science, try leaving the plastic backing on the block side of the pads on the memory and let us know what the temps are. I've been considering doing it but don't want to go through the hassle of taking my block off.


----------



## WayWayUp

Memory OC is meta anyways for 13th gen
Also, boost is good for casual gamers, but that’s it. The maximum boost on 13th gen is 5.8Ghz

My slowest p core is 5.8…. And I boost to 6.2
But also the ring clock. I can’t imagine just leaving that stock
And then the e cores tuning…

I would say manual OC is alive and kicking
I have seen some of the largest disparity between stock vs tweaking this gen!

BUT, I agree for most of the population, it’s better to just plug in and play. Not worry about anything and get good performance out the gate

there is a fine line being a pc gamer though….
to be hands on with the hardware is part of the pc community identity!

There is super mainstream gaming, just turn on and play... that's console gaming.

PC master race requires the complexities that come along with it, or else it's just an extension of consoles


----------



## MrB123

mirkendargen said:


> If you want to experiment for science, try leaving the plastic backing on the block side of the pads on the memory and let us know what the temps are. I've been considering doing it but don't want to go through the hassle of taking my block off.


Won't the plastic melt?


----------



## SilenMar

KingEngineRevUp said:


> UV can be a form of OC or uh.. under clocking (UC). In most instances and cases, I have always seen it as a form of OC.


OC is still OC. There is XOC BIOS. 

These GPUs are not OCed enough. 1.10V is kind of pathetic. 1.035V can push 500W; 1.10V can push 600W.

1.20V at 40C can push beyond 600W easily if the GPU is built for high load.


----------



## tomerturbo

I have a question: I have an Asus Strix RTX 4090 OC, and when I overclock the card beyond 3 GHz I get something like artifacts on the screen in 3DMark; when I lower the GPU clock to 2985 MHz everything is OK.
Is that normal? Can something other than a memory overclock cause these artifacts?


----------



## supersym

supersym said:


> Hi !
> I can't run over 110 % the power with my aorus waterforce Xrem... do you know if I can overide those 110 %


Hi !
No ideas ?
I use a Corsair AX1500i with a Z690 Hero and a 13900KF


----------



## J7SC

MrB123 said:


> wont the plastic melt?


...another option re. your original question is to order the lowest W/mK thermal pads from Amazon...something around 4-5 W/mK should do nicely. My Gigabyte Gaming OC w/Bykski block is the opposite (thermal putty on the VRAM and low temps). Still, even when VRAM temps are cold (~40 C), it can be adjusted by about +1472 MHz; but when VRAM temp is higher by about 10 C, it will be +1536 or so (so far, still testing). The only problem is that when running higher ambient temps, the max core clock loses an extra speed bin.


----------



## StreaMRoLLeR

SilenMar said:


> It doesn't even feel like an OC thread. It is more like an UV thread lol.


Because even an Elmor-modded Strix pulling 750W shows no marginal gain compared to stock 1.1V 500W cards. This gen killed overclocking and the excitement of doing it.


----------



## SilenMar

It is still very nice to see a GPU XOC'd to 1,000W. That is different from the chip lottery. 

A better GPU is always the one that pulls more power at a comfortable temperature; it has better build quality that way. It also has more value compared to others at the same price.


----------



## KingEngineRevUp

Betroz said:


> CPU or RAM bottleneck most likely.


My monitor is 120 Hz and I cap it so it doesn't go past that.


----------



## Sheyster

J7SC said:


> I wonder if you could do me/us a favour if you have the time. Can you run memtest_vulkan (per below) at 1500 MHz VRAM speed per GPU-Z, with everything else being bone-stock (ie. voltage, PL, core), please ? I am wondering about the Strix vbios timings etc at a given speed when flashed onto the Gigabyte Gaming OC vs. stock vbios (below)...bandwidth numbers could be a bit of a proxy. That way, I could compare...
> View attachment 2587067


As expected, the memtest errors out quickly at +1500 for me. *NOTE: I am using a newer version than you are.*

Here are results for +1400:











EDIT: I found out I can actually pass at +1450 but only after the memory has warmed up a bit. +1500 is still a no-go even with warm memory. For a gaming OC I probably won't exceed +1300.


----------



## wirx

What BIOS do you guys recommend for the Gigabyte Xtreme 4090 Waterforce?
At the moment there is 450W, but I'm looking for 600W. Also, the minimum fan speed is 30%; I like to use Fan Stop.
Talking about the card itself, I was trying to buy an MSI Liquid X but it was impossible to find within 2 months, so I took a Gigabyte instead. Really quiet card compared to the 3090, and fast too.
Out of the box Port Royal was 26474 (fans ~1000rpm, all default).
After some tweaking Port Royal is 28256.
Power 111%, gpu +179, mem +1595, voltage +88, win11 pbo, fans max, room temp 24c


----------



## X909

Panchovix said:


> I get pretty weird artifacts at 3045Mhz+ on Warzone 2, on all other games it seems fine at that clocks or 3030Mhz, so pretty good game to test OC stability


Do you want to test Unigine Heaven 4.0 to compare (highest preset, 4x MSAA and windowed mode, so you can easily adjust clocks on the fly)? That's the second-hardest app I've identified so far for finding artifacts (the 1st one is DCS World, but that's a big download).


----------



## alasdairvfr

SilenMar said:


> It is still very nice to see a GPU XOC to 1,000W. This is different from the chip lottery.
> 
> A better GPU is always the one that pulls more power at comfortable temperature. It has better build quality this way. It also has more value compared to others at the same price.


Not necessarily true. There have been posts on this very thread in the past discussing # watts per frame and in some cases the cards pulling ~600w are producing about the same frames as other cards ~500w, meaning they just use more electricity to generate the same results. Using more electricity != better GPU.
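For what it's worth, the watts-per-frame idea is just division; here is a toy sketch where all the FPS/wattage figures are made up for illustration, not taken from any review:

```python
# Hypothetical illustration of the watts-per-frame point: two cards producing
# the same frame rate while drawing different amounts of power.
def watts_per_frame(avg_fps: float, avg_watts: float) -> float:
    """Energy cost of each frame; lower means a more efficient card."""
    return avg_watts / avg_fps

card_a = watts_per_frame(avg_fps=120, avg_watts=600)  # 5.0 W per frame
card_b = watts_per_frame(avg_fps=120, avg_watts=500)  # ~4.17 W per frame

# Same frames, less electricity: card B is the more efficient GPU even
# though it "only" pulls 500 W.
print(card_a, card_b)
```

By this metric the 600 W card that ties a 500 W card is the worse buy, which is the whole point being argued above.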


----------



## Netherwind

I must try again - anyone with a Gigabyte Gaming OC, have you noticed erratic fan RPM below a certain threshold? Like the fans spinning up and down every second?
IRL example : www.youtube.com/watch?v=JUQ2n2U7ZgI
Gigabyte Control Center readout : www.youtube.com/watch?v=FVALKJJqkks


----------



## jediblr

Sheyster said:


> Here are results for +1400:


this is mine +1400


----------



## Jay-G30

tomerturbo said:


> I have a question: I have an Asus Strix RTX 4090 OC, and when I overclock the card beyond 3 GHz I get something like artifacts on the screen in 3DMark; when I lower the GPU clock to 2985 MHz everything is OK.
> Is that normal? Can something other than a memory overclock cause these artifacts?


Yes, the core can produce artifacts if overclocked too much. Each card will be different, but it happens with mine when the core clock gets to 3180 MHz in Heaven: it is fine at 3165 MHz but starts to artifact one bin higher at 3180 MHz. 3165 MHz is the max stable for my core.
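Context for the 3165-vs-3180 behavior: NVIDIA boost clocks move in discrete ~15 MHz steps ("bins"), so stability flips one whole bin at a time with nothing in between. A tiny sketch of the binning (the helper name is mine, not any real tool's API):

```python
# Ada/Ampere boost clocks step in ~15 MHz bins, which is why a card is either
# stable at 3165 MHz or artifacting at 3180 MHz, with no values in between.
BIN_MHZ = 15

def snap_to_bin(clock_mhz: float) -> int:
    """Round a requested clock down to the nearest 15 MHz boost bin."""
    return int(clock_mhz // BIN_MHZ) * BIN_MHZ

print(snap_to_bin(3170))  # 3165: a request between bins lands on the lower one
```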


----------



## coelacanth

KingEngineRevUp said:


> If the CPU come boosting and self OC to be nearly tapped out already... Is the motherboard really the problem then? Dropping a 5800X3D into a B450 is a hassle free kick ass way for the B450 owner to end their use of AM4.
> 
> Manual OC is dying. Boost does it all for us.


It seems a lot of the time nowadays it may not be pure overclocking but more tuning to fit one's needs. There is still plenty of tinkering to be done tuning RAM, CPUs, and video cards. The returns depend on what one is trying to do. If gaming at 4K, there are definitely diminishing returns to overclocking CPU and RAM.

As far as plug and play, this is as good as it's been. Just get a 5800X3D, set XMP/DOCP, with an AMD GPU turn SAM on, and the performance is great.

I'm enjoying the 4090. I was playing Assassin's Creed Origins and it's locked at 120Hz at ~250W with the 4090 power limited at 80%. With the 3080 Ti it was around 85 FPS average at 300W.


----------



## J7SC

Sheyster said:


> As expected, the memtest errors out quickly at +1500 for me. *NOTE: I am using a newer version than you are.*
> 
> Here are results for +1400:
> 
> View attachment 2587131
> 
> 
> 
> EDIT: I found out I can actually pass at +1450 but only after the memory has warmed up a bit. +1500 is still a no-go even with warm memory. For a gaming OC I probably won't exceed +1300.


Thanks much @Sheyster (and also @jediblr) ... the purpose for asking was to establish whether the Asus Strix V2 bios running on a Gigabyte G-OC was using different VRAM settings as reflected by the vulkan throughput tests. I added my results below (now via memtest_vulkan v0.5.0) for comparison with the stock Giga-G-OC bios.

*Quick follow-up question for @Sheyster*: With all else equal (i.e. core, PL, voltage settings), did the Asus Strix V2 bios allow for a bump in stable memory speed, ie. from 1400 to 1450 on your Giga-G-OC ?


----------



## Sheyster

coelacanth said:


> It seems a lot of the time nowadays it may not be pure overclocking but more tuning to fit one's needs. There is still plenty of tinkering to be done tuning RAM, CPUs, and video cards. The returns depend on what one is trying to do. If gaming at 4K, there are definitely diminishing returns to overclocking CPU and RAM.


I agree with what you're saying. I'm undervolting my 13700KF and it's using less than 200W at full load (CB R23), much less during gaming. As I stated before, for gaming I just installed the ASUS Strix BIOS and use the default core speed and power limit (100%@500w). I don't even bother to run MSI AB to set up anything. It's power on and go. If I run into a very demanding game I have some headroom. The card can do 3105 stable, memory is only stable at +1400 (or +1450 if it's warm). I'm actually enjoying not having to tinker too much.


----------



## KingEngineRevUp

coelacanth said:


> but more tuning to fit one's needs. [...] The returns depend on what one is trying to do.


I know we're on OC.net, but let's take a step back and be realistic here. We're tuning for a few measly percent in performance, especially when it comes to a CPU.

And even when we tune a CPU for a large percentage gain, it's in very specific scenarios, like at 500 FPS where, if someone held a gun to your head, you couldn't tell the difference between 200 and 1000 FPS.

The tuning for "my needs" is just a desire to tinker, learn, and see how we can manipulate hardware. It hardly has a true impact on my life in most cases. Zipping a file a few ms faster? Gaining 1-3 fps? Maybe there's a very important case where every second matters to someone performing workloads where time saved = money. I know I just tune for fun.



Sheyster said:


> I'm undervolting my 13700KF and it's using less than 200W at full load (CB R23), much less during gaming.


The original post we're discussing is that UV seems to be "lame" and we're not real OC anymore. _shrugs_


----------



## Sheyster

J7SC said:


> *Quick follow-up question for @Sheyster*: With all else equal (i.e. core, PL, voltage settings), did the Asus Strix V2 bios allow for a bump in stable memory speed, ie. from 1400 to 1450 on your Giga-G-OC ?


I never tried the warm-memory trick on the GB-OC BIOS, but I'm inclined to say it didn't help much, if at all. By the way, @jediblr had a +core offset on his test; mine was stock settings on the ASUS V2.1 BIOS. Not sure how much a core offset impacts the Vulkan memory test, but I'm inclined to say it does, since the memory controller runs off the GPU core.

FWIW, the only reason I'm using the ASUS V2.1 BIOS is that I like the stock settings for gaming, and the fans go slightly above 3000 RPM so no issue there. I was considering the 630W Colorful BIOS until I learned that the fans top out at ~2400 RPM. Not surprising since it is an AIO cooled card.


----------



## mirkendargen

MrB123 said:


> wont the plastic melt?


Melt? You have serious issues if your waterblock is hot enough to melt plastic... You'd be melting the acrylic of your block too, not to mention boiling your coolant lol.

Plastic is a poor conductor of heat, but it's not some magical perfect insulator. It conducts heat ~10x better than air and ~10x worse than a thermal pad. I don't think you're going to get a 50C+ delta over 0.1 mm of hard plastic at the level of heat output the memory has.
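The claim is easy to sanity-check with a one-dimensional conduction estimate, delta_T = P * t / (k * A). All the input numbers below are rough assumptions (per-chip wattage, package footprint, plastic conductivity), not measurements:

```python
# Rough 1-D conduction estimate for the plastic-backing question.
# delta_T = P * t / (k * A); every input here is an assumption.
P = 2.5            # W, heat from one GDDR6X package (assumed)
t = 0.1e-3         # m, plastic backing thickness (~0.1 mm)
k = 0.2            # W/(m*K), typical conductivity of a thin plastic film
A = 14e-3 * 12e-3  # m^2, approximate footprint of one memory package

delta_T = P * t / (k * A)
print(f"extra temperature rise across the plastic: {delta_T:.1f} C")
# Comes out to single digits under these assumptions, nowhere near a 50C+ delta.
```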


----------



## Sheyster

KingEngineRevUp said:


> The original post we're discussing is that UV seems to be "lame" and we're not real OC anymore. _shrugs_


I thought we were discussing "needs" now. I pushed the CPU with limits off and increased multi's for P and E cores, as well as ring, during the first week I had it.

My needs are for 4K gaming. Also, in So-Cal our kWh rates are absolutely ridiculous, 3 times as high as some parts of the country. So why push everything hard for 1-5 FPS? Plus, I have a 4K120 monitor (LG CX) frame capped at 117.

I hope this clears up my "needs". This is just a friendly discussion after all, with many points of view. To each his own.


----------



## SilenMar

alasdairvfr said:


> Not necessarily true. There have been posts on this very thread in the past discussing # watts per frame and in some cases the cards pulling ~600w are producing about the same frames as other cards ~500w, meaning they just use more electricity to generate the same results. Using more electricity != better GPU.


You can get an easy 5 fps going from 500W to 600W in Resident Evil at 4K with 170% scale. They didn't test the games very well; it didn't even show the full capability of a card. Some games are also massively CPU/RAM bottlenecked. 

A good card is always the one that pulls more power easily. That is the AIB's job. 

When some AIBs think 450W is fine, like you do, they will use the cheapest components possible to pull just 450W and no more. But the price can be the same as, or more than, a better-built GPU that is designed to pull 600W. 

There is a difference between an option to do more and no option at all. There is an even bigger difference when selling a 450W card and a 600W card at the same price.


----------



## J7SC

Sheyster said:


> I never tried the warm memory trick on the GB-OC BIOS, but I'm inclined to say it didn't help too much if at all. By the way, @jediblr had a +core on his test, mine was stock settings on the ASUS V2.1 BIOS. Not sure if this impacts the Vulcan memory test much, but I'm inclined to say it does if the GPU has a memory controller.
> 
> FWIW, the only reason I'm using the ASUS V2.1 BIOS is that I like the stock settings for gaming, and the fans go slightly above 3000 RPM so no issue there. I was considering the 630W Colorful BIOS until I learned that the fans top out at ~2400 RPM. Not surprising since it is an AIO cooled card.


...it is probably not so much a 'warm memory trick' as avoiding low temps. At 25 C ambient, memtest_vulkan gets the 4090 VRAM to 50 C (everything else bone stock), and 50 C to 60 C seems to be the best possible range for the 2GB GDDR6X chips. All that said, mid +1400s is still possible even at a low 18 C ambient (as mentioned before, these days I bench by adjusting the central heat in our place 🥴). The main issue with 4090s is that what is good for the goose (VRAM) is _not_ good for the ganders (GPU and CPU core) once higher ambient temps hit a big loop, so there's a need to find the right combo per bench, if so inclined.

Speaking of which, the latest gens of CPU and GPU are generally more maxed out from the factory, whereas before, stock values were oriented toward the lowest common denominator (dud chip). Still, with boost algorithms ruling the roost these days, using temps as a major input, one of the first things you can do is add some serious cooling to the CPU and GPU (see above for the special case of 2 GB GDDR6X chips). Beyond that, a myriad of other 'non-traditional OC' adjustments are possible...my 5950X went from about 28k with 'base' PBO in Cinebench R23 multi-core to > 32.4k just by exploring the deeper layers of the system bios. On CPU-Z single-core, my 5950X went from the 650 range to just under 710. If anything, OC-ing to improve on one's own earlier results has become much more complicated (leaving out sub-ambient and sub-zero for now) than before, with _different_ parameters to maximize / minimize. While an entirely different class, the latest Epyc Genoa monsters have two optimization paths affecting various settings: one for max performance, the other for max power saving.

@bmagnien & anyone with a Giga-G-OC and custom vbios...per an earlier post you did (can't locate it right now), did the Asus Strix V2 bios (or other custom vbios) change your max VRAM speeds and perhaps raise VRAM temps when loaded onto your Giga-G-OC? How about max core clocks? ...I'm doing a survey of sorts


----------



## Krzych04650

SilenMar said:


> You can get easy 5fps from 500W to 600W in Resident Evil at 4K with 170% scale. They didn't test games so well. It didn't even show the full capability of a card. Some games are also massively CPU/RAM bottlenecked.
> 
> A good card is always the one pulls more power easily. This is the job of AIB.
> 
> When some AIB think it is fine at 450W like you do, they will use every component as cheap as possible to just pull 450W no more. But the price can be the same or more than the better built GPU that is designed to pull 600W.
> 
> There is a difference between an option to do more and no option at all. There is a even bigger difference when selling a 450W and a 600W at the same price.


Again, you keep talking fantasies and theorizing with nothing to show for it. Where is your testing?


----------



## KingEngineRevUp

Sheyster said:


> I thought we were discussing "needs" now. I pushed the CPU with limits off and increased multi's for P and E cores, as well as ring, during the first week I had it.
> 
> My needs are for 4K gaming. Also, in So-Cal our KW/H rates are absolutely ridiculous, 3 times as high as some parts of the country. So why push everything hard for 1-5 FPS? Plus, I have a 4K120 monitor (LG CX) frame capped at 117.
> 
> I hope this clears up my "needs". This is just a friendly discussion after all, with many points of view. To each his own.


Yeah I am also in So. Cal. I am undervolting my system as well. It's nice seeing the 4090 only drawing 300-350W while performing just as good as stock.

We got solar panels on my new home, so we just buy electricity from the solar company. $0.16/kWh.

I'm running a 42" C2 capping at 116-117 also.


----------



## neteng101

KingEngineRevUp said:


> Dropping a 5800X3D into a B450 is a hassle free kick ass way for the B450 owner to end their use of AM4.


Dropping an expensive processor into a dinosaur board is just being penny wise, pound foolish. Even a B550 board is terribly lacking by today's standards; sheesh, it can't even support 2 PCIe 4.0 M.2 drives. Stop pouring gasoline on your money and lighting it on fire with a dumpster-fire AM4 platform. It's long past its useful life!



> Manual OC is dying. Boost does it all for us.


It's not dying; only when something is fully tapped out do you no longer need to OC. Boost is harder to control predictably; at least Nvidia's boost can be managed by power limit and curve tuning. PBO doesn't let you directly set clock targets, which is why it's totally lame. With OC done right, your Nvidia GPU should be running at a flat target clock and ignoring the rest of the curve, and you don't have to lock it for that either.


----------



## KingEngineRevUp

neteng101 said:


> Dropping an expensive processor into a dinosaur board is just being penny wise, pound foolish.


What kind of losses does one see dropping a 5800X3D into a B450 vs a x570? Legitimate question.


----------



## Sheyster

KingEngineRevUp said:


> Yeah I am also in So. Cal. I am undervolting my system as well. It's nice seeing the 4090 only drawing 300-350W while performing just as good as stock.
> 
> We got solar panels in my new home, so we just buy electricity from the solar company. $0.16/ KHW.
> 
> I'm running a 42" C2 capping at 116-117 also.


How are you liking the C2 42? I almost pulled the trigger during the Black Friday sales. I've decided to wait, I really want at least 144 Hz 4K. I've also been tempted by the ASUS PG42UQ (42" LG panel based OLED, 138 Hz). No one seems to have it in stock at MSRP though, for quite some time. At this point, it's probably best to wait for new 2023 models.

Regarding solar, I've almost pulled that trigger several times. One of these days...


----------



## WayWayUp

Is it possible for a potential 4090 Ti to support DP 2.1, or is that locked out for the 40 series?


----------



## KingEngineRevUp

Sheyster said:


> How are you liking the C2 42? I almost pulled the trigger during the Black Friday sales. I've decided to wait, I really want at least 144 Hz 4K. I've also been tempted by the ASUS PG42UQ (42" LG panel based OLED, 138 Hz). No one seems to have it in stock at MSRP though, for quite some time. At this point, it's probably best to wait for new 2023 models.
> 
> Regarding solar, I've almost pulled that trigger several times. One of these days...


The C2 is great. I got it mostly because I WFH a lot and it's nice being able to have 4 windows open at the same time. But the general consensus is that it's not that much greater than your CX. 

The ASUS monitor is really tempting to me because the heatsink it advertises supposedly all but eliminates screen burn-in risk. 

I would just rock the CX till it's dead. It's a great display in itself. 

As for solar, I think the math has to make sense. I know SCE raises rates at a higher percentage than the solar company does. You can easily project what your cost per kWh will be 5, 10 or 20 years from now. It sucks; it seems like we'll be paying $0.75/kWh in a decade or so.
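That projection is just compound growth; a quick sketch where the starting rate and the annual increase are assumptions for illustration, not SCE's actual tariff:

```python
# Back-of-the-envelope utility-rate projection; inputs are assumptions.
def projected_rate(rate_now: float, annual_growth: float, years: int) -> float:
    """Compound the current $/kWh rate forward by a fixed annual increase."""
    return rate_now * (1 + annual_growth) ** years

# e.g. a ~$0.30/kWh rate growing ~9-10%/yr lands in the $0.70-0.80/kWh
# range after a decade, which is the ballpark being quoted above.
print(round(projected_rate(0.30, 0.09, 10), 2))
```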


----------



## Mad Pistol

neteng101 said:


> Dropping an expensive processor into a dinosaur board is just being penny wise, pound foolish. Even a B550 board is terribly lacking by today's standards, sheesh it can't even support 2 PCIe 4.0 M2 drives. Stop pouring gasoline on your money and lighting it on fire with a dumpster fire AM4 platform - its long past its useful life!


Yeouch. That hurts me too.

I had no idea my setup was so horrible.


----------



## KedarWolf

Mad Pistol said:


> Yeouch. That hurts me too.
> 
> I had no idea my setup was so horrible.


Well, it's not true about not supporting two M.2s.

Like the MSI B550 Unify-X could run two M.2s from the CPU lanes and two more from the chipset lanes.

Two M.2s on the CPU lanes will limit the GPU to x8, but you lose at most 2-3% performance compared to x16.

And the VRM setup was incredible and the DDR4 memory overclocking really good too.

No shame in running a B550 setup at all, still a really nice rig overall.

I had my 5950x until a few months ago when I upgraded to a 7950x, and I was really happy with my 5950x.


----------



## Sheyster

KingEngineRevUp said:


> The C2 is great, I got it mostly because I WFH a lot and it's nice being able to have 4 windows open at the same time. But the general consensus are that it's not that much greater than your CX.
> 
> The ASUS monitor is really tempting to me because that heat sink advertises it almost eliminates screen burn in risk.
> 
> I would just rock the CX till it's dead. It's a great display in itself.


I was more interested in the 42" size, more PPI, better text. The 48" CX is rather large. 

At this point I'll wait to see what 2023 brings. The good thing about the CX is it will have a nice new home in my guest room, replacing an old Sony 40" LED TV. No need to bother selling it.


----------



## neteng101

KedarWolf said:


> Like the MSI B550 Unify-X could run two M.2s from the CPU lanes and two more from the chipset lanes.


Yeah - obviously showing how lousy the B550 chipset is like I said because you're stealing GPU lanes from the CPU to compensate for the lame B550 chipset limitations.



KingEngineRevUp said:


> What kind of losses does one see dropping a 5800X3D into a B450 vs a x570? Legitimate question.


Not comparing it to an X570; the bottom line is the whole AM4 ecosystem is old junk. Compare it to a Z690/Z790 and they all look terrible. That's the price people pay for staying on lousy old boards and upgrading piecemeal. As for the 5800X3D, the poor VRMs on most B450 boards will be stretched... AMD sure delayed Zen 3 support on B450 until too many complaints came in.


----------



## yzonker

neteng101 said:


> Yeah - obviously showing how lousy the B550 chipset is like I said because you're stealing GPU lanes from the CPU to compensate for the lame B550 chipset limitations.
> 
> 
> 
> Not comparing it to an X570 - bottom line is the whole AM4 ecosystem is old junk. Compare it to a Z690/Z790 and they all look terrible in comparison. That's the price people pay for staying on lousy old boards and keep upgrading. As for the 5800X3D, the poor VRMs on most B450 boards will be stretched... AMD sure delayed Zen 3 support on B450 until too many complains came in.


What? The 5800X3D hardly pulls any power; it's extremely efficient. I can't imagine even a cheap board's VRM struggling with it.


----------



## neteng101

yzonker said:


> What? The 5800x3D doesn't pull hardly any power. It's extremely efficient. I can't imagine even a cheap board's VRM struggling with it.


Got me there. I forgot that thing was cut off at the knees by AMD so you can't really OC it.

I've looked at picking up AM4 for fun just to see what the hype is all about, and each time I do the research I keep realizing it would be such a bad buy. I don't get why being able to upgrade on a limited board is such an attraction unless you have very modest needs. I need all the M.2 slots I can get working at full speed, etc.


----------



## SilenMar

Krzych04650 said:


> Again, you keep talking fantasies and theorizing with nothing to show for it. Where is your testing?


I already showed 580W.





How hard could it be to show 500W?





I just played one game. Again, do you guys really play games?


----------



## WayWayUp

Sheyster said:


> I was more interested in the 42" size, more PPI, better text. The 48" CX is rather large.
> 
> At this point I'll wait to see what 2023 brings. The good thing about the CX is it will have a nice new home in my guest room, replacing an old Sony 40" LED TV. No need to bother selling it.


I want an 8K ultrawide.
It's not as taxing as true 8K; it's basically 2x 4K, and with the way the 4090 is designed the frame dip isn't that bad. Going by 5K benchmarks (5,120 × 2,880), which is ~1.78x the resolution of 4K, the performance dip is fairly minimal over 4K. It's there, but it isn't huge; RT is basically more taxing on the GPU than the bump in resolution is.

With an 8K ultrawide you get the same ultra-sharp PPI as 27" 4K, but with the added field of view and without the silly vertical height, which is the main factor that makes people think a monitor is "too big".

If I can get this with an OLED panel then 🤩

And here's the thing: LG is making a 97" 8K OLED that they are about to debut in January, so it looks like they will get 8K panels rolling. This will trickle down to monitors, and LG has already announced they are doing an 8K ultrawide.

This would truly be end game for me.

But there is one big problem... this would be the only scenario where DP 2.0/2.1 is needed.

HDMI 2.1 can already support high-refresh 4K with no compromises. 4K 240Hz is overkill: if you need 240Hz for an online shooter you probably already play at 1440p, and it's not needed for 1440p since you can easily get super high refresh over HDMI. And 8K is too demanding for people to expect high refresh rates, so 8K60 is already doable now.

But an 8K ultrawide? That's where you actually need DP 2.x, and that's where Nvidia dropped the ball.
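The DP 2.x point checks out on a napkin: uncompressed video bandwidth is pixels × refresh × bits per pixel, plus blanking overhead. A rough sketch where the ~20% blanking figure and 10-bit color are assumptions:

```python
# Rough uncompressed-bandwidth check for the "8K ultrawide" case
# (7680x2160, i.e. two 4K panels side by side).
def video_gbps(w: int, h: int, hz: int, bits_per_pixel: int = 30,
               blanking: float = 1.2) -> float:
    """Approximate uncompressed video bandwidth in Gbps (blanking assumed ~20%)."""
    return w * h * hz * bits_per_pixel * blanking / 1e9

need = video_gbps(7680, 2160, 120)  # ~72 Gbps uncompressed
hdmi21_data = 42.6                  # Gbps, HDMI 2.1 FRL payload (48G raw)
dp21_uhbr20 = 77.4                  # Gbps, DP 2.1 UHBR20 payload (80G raw)

print(f"needed: {need:.1f} Gbps vs HDMI 2.1 {hdmi21_data} / DP 2.1 {dp21_uhbr20}")
# Uncompressed 7680x2160@120 overruns HDMI 2.1 but fits DP 2.1 UHBR20
# (DSC would also squeeze it into HDMI 2.1).
```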


----------



## KingEngineRevUp

neteng101 said:


> Yeah - obviously showing how lousy the B550 chipset is like I said because you're stealing GPU lanes from the CPU to compensate for the lame B550 chipset limitations.
> 
> 
> 
> Not comparing it to an X570 - bottom line is the whole AM4 ecosystem is old junk. Compare it to a Z690/Z790 and they all look terrible in comparison. That's the price people pay for staying on lousy old boards and keep upgrading. As for the 5800X3D, the poor VRMs on most B450 boards will be stretched... AMD sure delayed Zen 3 support on B450 until too many complains came in.





yzonker said:


> What? The 5800x3D doesn't pull hardly any power. It's extremely efficient. I can't imagine even a cheap board's VRM struggling with it.


So in other words, there is no evidence. Just a user that hates AMD AM4 for not fulfilling their specific use cases. 

Okay, fair enough, it doesn't OC like crazy. But give it credit where it's due. It's a great drop in upgrade for many AM4 users.


----------



## alasdairvfr

Every time I start pondering 8K, I remind myself of my journey to 4K. I run 3 monitors, and for a time it was a choice between high frames and high resolution. Having different-resolution monitors (all 32") was jarring as I moved windows from screen to screen. Things would constantly catch and get stuck on the virtual lip from the 4K being taller in pixels, and a tiny window on one screen would become large on the next.

I finally got my hands on a 144Hz 4K 32" (MSI Optix); high-refresh 4K monitors were damn near impossible to find at 32" for a long time. 27-28", no problem; 43" you had a few, although most of them had QC issues. It wasn't until early this year I was able to get my 3x 4K setup: one higher-end for gaming and the side panels for productivity. Even 4K@144 over DP can barely handle it and needs DSC (works well enough for me), so I can only imagine what 8K would bring. I'm comfortable at 11,520x2160 for day-to-day and 3840x2160 for gaming. Going to 7680×4320 or similar on a single panel is something I'd like to see or try, but I'm not sure I'd be willing to rework my entire setup, nor to sacrifice high-refresh 4K for 60fps 8K. Going from 1440p to 4K was a big jump in clarity for me, but I'm less convinced 4K to 8K will wow me as much as going from 144 back to 60 will burst the balloon.

High refresh at 4K was barely a thing only a few years ago, so who knows; in 5 years, 120/144Hz 8K on a 2000W 6090 may be a thing. And knowing me, I'll be running it when the time comes.


----------



## neteng101

KingEngineRevUp said:


> But give it credit where it's due. It's a great drop in upgrade for many AM4 users.


The value is still questionable IMO at current prices. Yes, you get a nice uplift, but it's not a balanced upgrade and other parts of the system are still way behind. If it were way cheaper then sure, but at its current price it's still hard IMO to recommend vs., say, a new build with a 13600K + cheaper Z690 DDR4 board... that costs a bit more than the drop-in, but it's no longer just a CPU upgrade. Plus you get to OC the 13600K in a way the 5800X3D is not capable of. Or heck, just buy the $349 12700K + Z690 combo Microcenter had recently: for $20 more you get off antiquated AM4.

But I can give it credit for those that want a fuss-free experience... not sure why those folks would be hanging out here, though. The things AMD does make me question how viable they are for OC enthusiasts overall: the 5800X3D actually introduced AGESA regressions on other AMD CPUs because AMD started worrying about what the old, higher voltage limits might do to the 5800X3D. Seems like a **** show really, even for a casual OC enthusiast.


----------



## alasdairvfr

SilenMar said:


> You can get easy 5fps from 500W to 600W in Resident Evil at 4K with 170% scale. They didn't test games so well. It didn't even show the full capability of a card. Some games are also massively CPU/RAM bottlenecked.
> 
> A good card is always the one pulls more power easily. This is the job of AIB.
> 
> When some AIB think it is fine at 450W like you do, they will use every component as cheap as possible to just pull 450W no more. But the price can be the same or more than the better built GPU that is designed to pull 600W.
> 
> There is a difference between an option to do more and no option at all. There is a even bigger difference when selling a 450W and a 600W at the same price.


Your point is barely a point; you missed what I was saying. Obviously a person that wants to run higher than 450W should either find a GPU with a higher-limit BIOS or flash one. There's not really a debate here. The point I was making is that a card that pulls more power isn't automatically better. Good silicon can do more with less power, achieving a given clock with less voltage and power than less-good silicon.




SilenMar said:


> I already showed 580W.
> 
> 
> 
> 
> 
> How hard it could be to show 500W.
> 
> 
> 
> 
> 
> I just played one game. You guys really play games?


I'm not sure what you are trying to prove; is this the same card twice? Most GPUs will perform better with increased voltage and power limits, with diminishing returns toward the top. You need to compare multiple GPUs in the same environment to know which one is better. You can do a balls-to-the-wall test, which is pretty much 3DMark HoF, and more power is typically needed for a better score, sure; but to say that the GPU that hits a higher power consumption (watts) in a bench is better isn't really true. What about the 500W GPUs that can outperform a less fortunately binned Strix?


----------



## SilenMar

alasdairvfr said:


> Your point is barely a point, you missed what I was saying. Obviously a person that wants to run higher than 450w should either find a GPU with a higher limit BIOS or flash one. There's not really a debate here. The point I was making is a card that pulls more power isn't automatically better. Good silicon can do more with less power, achieve a good clock with less voltage and power than less-good silicon.
> 
> 
> 
> 
> I'm not sure what you are trying to prove, is this the same card twice? Most GPUs will perform better with increased voltage and power limits, with a dropoff in diminishing returns toward the top. You need to compare multiple GPUs to know which one is better in the same environment. You can do a balls to the wall test which is pretty much 3DMark HoF. More power is typically needed for a better score, sure but to say that one GPU that hits a higher power consumption (watts) in a bench is better isn't really true. What about the 500w GPUs that can outperform a less fortunately binned Strix?


If only you could've found the magic to chip lottery. 
The value of a GPU is more on the build quality so it has more potentials compared to no potentials at all. 
You've said this gen is boring. I tell you further gen will be like Apple product.


----------



## KingEngineRevUp

neteng101 said:


> but its not a balanced upgrade and other parts of the system are still way behind.


People have different needs. I had a 5900X because the multicore performance helped me in CAE and FEA applications. Got a new job, got a $5K workstation, no longer needed the 5900X. All I do is game anyways. Sold it and got a 5800X3D; it cost me $30 and I'm getting a nice performance uplift.

Upgrade paths are different for everyone. I understand where you're coming from. If my PC magically imploded and disappeared into the abyss, I wouldn't go buy a new AM4 motherboard and a 5800X3D. I would go AM5 or Intel's newest.

But again, there are many people in my situation. We just game, and dropping in a 5800X3D is a great option. It's affordable, you don't have to completely rebuild your system, and for pure gaming it's going to hold me over for years to come. I'll skip AM5 altogether at this rate.


----------



## yzonker

KingEngineRevUp said:


> So in other words, there is no evidence. Just a user that hates AMD AM4 for not fulfilling their specific use cases.
> 
> Okay, fair enough, it doesn't OC like crazy. But give it credit where it's due. It's a great drop in upgrade for many AM4 users.


I have both a 5800x3D system and 13900k. Hilariously I've only managed to match the 5800x3D with the 13900k in the SotTR benchmark (1080p lowest). My 12900k couldn't get there. Lol.


----------



## Pepillo

Testing the new computer, with the 13900K boosting to 6,000 MHz and the Gainward Phantom, which I flashed with the BIOS from the GS model. The memory overclocks well; very happy, as it was the cheapest custom card I found:









I scored 19 776 in Time Spy Extreme


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## neteng101

coelacanth said:


> It seems a lot of the time nowadays it may not be pure overclocking but more tuning to fit one's needs. There is still plenty of tinkering to be done with tuning RAM, CPUs, video cards. The returns depend on what one if trying to do. If gaming at 4K there are definitely diminishing returns to overclocking CPU and RAM.


It's not just outright FPS - 1% lows and frame-time consistency can be a big factor, and with 4K 240Hz monitors being a reality today, it's no longer diminishing returns for absolute FPS either. Frame-time consistency is one reason why I would never choose to power-limit a GPU - undervolt to a target clock if you want to, but don't let the GPU hit its power limit, because the clock fluctuations are greater that way. Power efficiency is nice, but if you can, why not get every bit of performance available? The 4090 with maxed-out power limits is not going to be constantly north of 450W in games, and going down to 300W seems like a waste to me - pay for a high-end card and then kneecap it? A frame-rate limiter already acts as your power limiter; there's no need to stop the card from drawing a bit more power when it really needs it, slowing it down in the process.
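The frame-cap vs. hard-power-limit argument can be sketched with a toy model (all numbers are made up for illustration, not measurements): a hard power limit lowers the sustained clock on every frame, while an fps cap only holds back frames that were already fast.

```python
# Toy model, made-up numbers: frame time = work / clock (arbitrary units).
def frame_time_power_limited(work, throttled_clock):
    # A hard power limit forces a lower clock on EVERY frame,
    # so heavy frames take longer (frame-time spikes).
    return work / throttled_clock

def frame_time_fps_capped(work, full_clock, cap_ms):
    # With only an fps cap, the GPU can still boost for heavy frames;
    # the limiter just holds light frames at the cap time.
    return max(work / full_clock, cap_ms)

work_per_frame = [4.0, 9.0, 4.5, 10.0]  # heavy scenes mixed with light ones

power_limited = [frame_time_power_limited(w, 2.4) for w in work_per_frame]
fps_capped = [frame_time_fps_capped(w, 3.0, 2.0) for w in work_per_frame]

# Worst-case frame time is lower with the fps cap (fewer spikes),
# even though light frames draw less power under both schemes.
print(round(max(power_limited), 2), round(max(fps_capped), 2))
```

A crude sketch, but it captures the mechanism: the cap trims power on easy frames without taking the boost headroom away from the hard ones.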



KingEngineRevUp said:


> It's affordable, you don't have to completely rebuild your system and for pure gaming it's going to hold me off for years to come.


There's definitely the convenience factor there. Tinkerer that I am, I would have gone for a new CPU + board to try something else out, but that does consume time. I probably spend 25% of my time tinkering to 75% gaming, which is unlike most people. And there are days I ask myself why I broke the PC again.


----------



## Sheyster

yzonker said:


> I have both a 5800x3D system and 13900k. Hilariously I've only managed to match the 5800x3D with the 13900k in the SotTR benchmark (1080p lowest). My 12900k couldn't get there. Lol.


I got by for the past 4 years on a 9900K/Z390. If I had a 5800x3D I probably would have stayed with it at least another year or two.


----------



## coelacanth

neteng101 said:


> Got me there - forgot that thing was cut at the knees by AMD so you can't really OC it.
> 
> I've looked at picking up AM4 for fun just to see what the hype is all about - and each time I do research I keep realizing that would be such a bad buy. I don't get why being able to upgrade on a limited board is such an attraction unless you have very modest needs. I need all the M2 slots I can get working at full speed, etc.


So because AM4 doesn't meet your needs it's "old junk." Got it.


----------



## neteng101

coelacanth said:


> So because AM4 doesn't meet your needs it's "old junk." Got it.


Guess I ruffled some feathers there. AM4 is just an old platform - with older chipsets. The 5800X3D might be its swan song, but it's stuck with older chipsets that are not nearly as capable today. Everyone claims supporting a socket for so long is great, but there are drawbacks to this approach that people aren't considering. PCIe 4.0 drives are a must-have nowadays, and with GPUs like the 4090 - heck no to stealing GPU lanes for drives. So yes, sadly, you're left dropping a nice new engine into the rusty old frame that is AM4.


----------



## KingEngineRevUp

neteng101 said:


> The tinkerer that I am, I would have gone for a new CPU+board to try something else out, but it does consume time.


Trust me, I would too lol. But I chose to father a child, and daycare costs $1K a month. 😭

I could have built a computer once every month or two.


----------



## coelacanth

neteng101 said:


> Guess I ruffled some feathers there. AM4 is just an old platform - with older chipsets. The 5800X3D might be its last swan song, but its stuck with older chipsets that are not nearly as capable today. Everyone claims supporting a socket for so long is great, but there are drawbacks to this approach that people aren't considering. PCIe 4.0 drives are indeed a must have and with GPUs like the 4090 - heck no to stealing GPU lanes for drives. So yes, sadly, you're left with a nice new engine and plugging it into a rusty old frame that is AM4.


No feathers ruffled, I was just pointing out that your take on AM4 is absurd and nonsensical. "It doesn't work for me so it's junk for everyone."


----------



## neteng101

coelacanth said:


> No feathers ruffled, I was just pointing out that your take on AM4 is absurd and nonsensical. "It doesn't work for me so it's junk for everyone."


I did say the value doesn't seem to make much sense when you factor in the other parts that don't get any uplift. I'm really just asking everyone to challenge me on sounding absurd, but so far it's still hard to see how, aside from the ease of a drop-in upgrade, it's a great option for everyone. I did at least learn some things here by sounding that way - the 5800X3D could be a great option for a power-efficient, fast gaming build... as long as you can manage the chipset limitations on storage. Aside from the 12400, Intel doesn't have a power-efficient option because of the E-cores included in the RPL SKUs, and the 12400 is locked too; BCLK shenanigans aside, the SA voltage lock hurts more.


----------



## GRABibus

How is this score possible regarding average clocks and Memory OC ?









I scored 29 528 in Port Royal


AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## KingEngineRevUp

GRABibus said:


> How is this score possible regarding average clocks and Memory OC ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 528 in Port Royal
> 
> 
> AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


Port Royal is broken. Scores inflate during certain memory-instability moments, with artifacts.


----------



## WayWayUp

GRABibus said:


> How is this score possible regarding average clocks and Memory OC ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 528 in Port Royal
> 
> 
> AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


It’s possible with a bugged-out run.
I don’t respect Port Royal this generation.
I put my score on the HoF the first day with my 4090 and haven’t run it a single time since.

Those scores are scams and I have no respect for them. Almost the entire HoF top 100 is bugged runs. An extremely tiny minority of cards can get 29,000 without a bugged run.

I remember watching Bearded Hardware with Gamers Nexus: they were stuck at 29k at best, and only got 30k with a bugged-out, memory-error-laden run... and we are talking a hand-picked card with an unlocked power BIOS and LN2, just to max out at 29k.

Yet you see 29k+ scores being handed out like candy for standard cards on air.

Maybe focus on Time Spy instead.


----------



## WayWayUp

I singled out tecblab since I know he used cold/LN2.

3270 MHz and he’s down at #37 with a score of 29,304.
PR scores should be ignored this generation.


----------



## GRABibus

WayWayUp said:


> It’s possible with a bugged out run
> I don’t respect port royal this generation.
> I put my score on the HOF the first day with my 4090 and haven’t run it a single time since
> 
> those scores are scams and I have no respect for them. Almost the entire HoF top 100 is bugged runs. An extreme tiny minority of cards can get 29,000 without having a bugged run
> 
> I remember watching Bearded hardware with Gamers nexus and they were stuck at 29k at best, and only got 30k with a bugged out memory error laden run…. And we are talking a hand picked card with unlocked power bios and LN2 just to be maxed out at 29k
> 
> yet you see 29k++ scores being handed out like candy for standard cards on air
> 
> Maybe focus on Time Spy instead


Hopefully not all scores are bugged.
I can make consistent runs at 29,000+ on my Gigabyte Gaming OC without any artifacts. Memory scaling is OK and my average clock is 3,100 MHz, even at 55 degrees.
My best run is 29,134 (link in signature).


But yes, I will try Time Spy to see what best scores I can get.


----------



## yzonker

GRABibus said:


> Not all scores are bugged hopefully.
> I can make constant runs at 29000+ on my gigabyte gaming OC without any artefact . memory scaling is ok and my average clock is at 3100MHz even at 55 degrees.
> My best run is 29134 (link in signature).
> 
> 
> But yes, I will try timespy to see what I can get as best scores .


There are artifacted runs in everything, not just PR. I've done it with Speed Way and TS. I've just been deleting them.
And yes, low 29k is doable without artifacts. I've managed it with some careful control of water (chiller) and memory (backplate heater) temps. A few others have slightly better cards and can get there without that.

The thing that I keep pointing out, and most probably don't understand/believe, is that the card takes an additional cold-efficiency hit around 10-12C water temp. My PR score will drop 200-300 pts without touching any settings. I don't understand what causes it, but it is real.

And the LN2 guys probably take that hit as well, or more. So when you combine that with the fact that the card is partially memory bottlenecked, resulting in reduced gains from core clock, you can see how LN2 isn't nearly as effective as it was in past generations - thus bringing the scores closer together.


----------



## yzonker

Wrong thread


----------



## neteng101

WayWayUp said:


> PR scores should be Ignored this generation


I wouldn't pin it all on bugged runs - there's no scientific proof beyond theory and hearsay at this point. Memory tuning and timings actually make a significant difference in 3DMark runs with the 4090 - all 3 of my scores went down when I changed just 2 secondary memory timings recently: tWR 12 to 16, tRP 10 to 8. Just those 2 timings! Speed Way, Port Royal, and Time Spy Extreme scores were down noticeably enough that I wondered what went wrong. I proceeded to tune other secondaries toward extreme values, but tWR made the biggest change in my 3DM scores. Nothing changed on the GPU side.

The V-cache seems to be affecting 5800X3D scores with the 4090 in a positive manner, big time. Not surprised, since my scores also moved with that change in memory timings. Perhaps others will start testing and playing with other knobs and notice differences here - system/cache/memory seems to be a big factor for the 4090, given it has very high limits on the GPU side.


----------



## Panchovix

X909 said:


> Do you want to test unigine heaven 4.0 to compare (highest preset, 4xmsaa and window mode, so you can easily adjust clocks on the fly)? Thats the second hardest app (1st one is DCS world but thats a big download ) to find artefacts that I identifed so far.


Nice recommendation, man. Tested it and could see some flickering/artifacting; it didn't crash, but it's clearly not stable.


Spoiler: pics


----------



## J7SC

yzonker said:


> There are artifacted runs in everything. Not just PR. I've done it with Speedway and TS. I've just been deleting them.
> And yes low 29k is do-able without artifacts. I've managed it with some careful control of water (chiller) and memory (backplate heater) temps. A few others have slightly better cards and can get there without that.
> 
> The thing that I keep pointing out and most probably don't understand/believe, is that the card takes an additional cold efficiency hit around 10-12C water temp. My PR score will drop 200-300 pts without touching any settings. I don't understand what causes it, but it is real.
> 
> And the LN2 guys probably take that hit as well or more. So when you combine that with the fact that the card is partially memory bottlenecked resulting in reduced gains from core clock, you can see how LN2 isn't nearly as effective as it was in past generations. Thus bringing the scores closer together.


Yeah, I find it laughable that someone would just make generalizations about almost every HoF PR run by folks from all over the world. I have done multiple runs over 29k, some of which I subbed, others not yet - and some I will never sub because they were bugged (like that 30,337).

That cold bug you describe around 10-12C sounds interesting... I have never gotten my card that cold, so I haven't seen it, but does the score always drop by 200+ pts when you reach those temps, or only sometimes? These cards, with their ~76.3 billion transistors on N4, must have all kinds of funky things going on inside at various temps, with boost algorithms that might get confused when they see a value out of bounds...

Finally, have you tried a custom vBIOS on your card (TUF OC?), like the Strix V2, Neptune 630W, or MSI Suprim 600W? I've got all three of those downloaded, though not installed yet - really, I am just looking for the one that doesn't curtail my Giga G-OC's core (3180+ when select conditions are just right on the stock vBIOS) but helps a bit with VRAM temps via timings and/or more aggressive voltage.


----------



## Mad Pistol

yzonker said:


> I have both a 5800x3D system and 13900k. Hilariously I've only managed to match the 5800x3D with the 13900k in the SotTR benchmark (1080p lowest). My 12900k couldn't get there. Lol.


I don't think some people realize how capable a gaming CPU the 5800X3D actually is. Regardless of the underlying chipset, it is a powerhouse, and it manages that without DDR5.

This isn't the "Bulldozer" AMD anymore... this is AMD at their best, taking an old architecture and turning the dial up to 11.

Admittedly, if you're looking at CPU compute benchmarks or the CPU portion of 3DMark, the 5800X3D looks underpowered even compared to the 5800X. But that's part of its charm; its per-clock gaming performance is unparalleled. Gotta love 3D V-Cache.


----------



## yzonker

J7SC said:


> Yeah, I find it laughable that someone would just make generalization about almost every HoF PR run by folks from all over the world. I have done multiple runs over 29k, some of which I subbed, others not yet - and some I will never sub because they were bugged (like that 30,337 ).
> 
> That cold bug you describe around 10 C - 12 C sounds interesting...I have never gotten my card that cold so I haven't seen it, but does the score always drop by 200pts plus when you reached down to those temps, or only sometimes ? These cards with their ~ 76.3 billion transistors on N4 must have all kinds of funky things going on inside at various temps and with boots algorithms that might get confused when they see a value out of bounds...
> 
> Finally, have you tried a custom vBios on your card (TUF/OC ?), like the Strix V2, Neptune 630W or MSI Suprim 600W ? I've got all three of those downloaded though not installed yet - really, I am just looking for the one that doesn't curtail my Giga-G-OC's core (3180+ when select conditions are just right on the stock vbios) but helps a bit with VRAM temps via timings and/or more aggressive voltage.


The cold bug is 100% consistent. I've tried several times as I increased my memory temps with various methods. But it's stayed pretty much the same unfortunately. 

I've been running the Strix BIOS ever since we got a compatible version of NVFlash. I've also run the MSI and Giga OC BIOSes. I don't think there is any significant difference in mem OC between the TUF, Strix, and MSI BIOSes. Dunno on the Giga BIOS, as I haven't really tested it yet; it's in the 2nd position on the BIOS switch. I'm probably going to test several here at some point - just haven't had time.


----------



## raad11

I've been having this weird stuttering problem in Overwatch 2 as of 3 days ago. Here's my post about it:



> The stuttering is random, it’s in-game only, and it’s noticeable above 300fps (I have it set to 600fps limit, and it’s usually pushing the cap the entire time).
> 
> I believe it’s something to do with the GPU. I have two pictures of it happening in hardware monitoring software:
> 
> 
> 
> http://imgur.com/ZJYO58O
> 
> 
> 
> 
> http://imgur.com/KERoxQf
> 
> 
> I have Windows 10 22H2 (Windows Home 64-bit)
> Motherboard: Asus ROG Strix Z690-A Gaming WiFi D4
> CPU: i9-13900K
> GPU: MSI RTX 4090 Gaming Trio 24G
> RAM: 32GB
> 
> I got the GPU on 10/12/2022 on my old CPU (12900K). Then I got the new CPU on 10/20/22, release date. I’ve had *zero* issues in Overwatch 2 with this hardware until 2 days ago.
> 
> I played on 11/28/22 with no problems. Then I got sick and came down with the flu. I signed on again Saturday or Sunday night and the stuttering had started.
> 
> There was no Windows update or Nvidia driver update since then.
> 
> It’s not the drivers, I rolled back to 522.25 (first 4090 driver) and still experience the issue.
> 
> MSI Afterburner can sometimes make the issue appear or disappear in between launches of the game, but I can’t figure out any pattern to it. I uninstalled RivaTuner.
> 
> I disabled Resizable BAR. I updated the VBIOS.
> 
> GPU usage and scores are normal in 3DMark TimeSpy.
> 
> Other games: I’ve played CoD: Warzone 2.0 and Fortnite and they work fine and I don’t experience stutter, though their framerate is significantly lower than Overwatch 2’s. If I lock my FPS in OW2 to like 300, I can avoid the stutter. But it’s driving me nuts why it’s happening in the first place. I liked playing at 600fps.
> 
> It’s set to Maximum Performance in Nvidia Control Panel. I’ve used High and Ultimate Performance Power Plans in Windows as well. I’ve disabled and re-enabled G-Sync. I’ve turned Game Mode off and then back on. Xbox Game Bar is already disabled. I disabled virtualization options in the BIOS (though Core Isolation was already disabled in Windows). I ran my CPU at all stock settings (my overclock was already tested for stability in programs like Cinebench and had been running fine in Overwatch for weeks, I knew this wouldn’t be it). Nothing fixed it.
> 
> I made sure no unnecessary background tasks/programs are running. I already run OW2 in high priority (I tried going back to Normal, no change). I uninstalled a lot of stuff.
> 
> Last thing I’m trying right now is enabling Hardware Accelerated GPU Scheduling (HAGS). I have to try it for around a day since the problem appears and disappears at random.
> 
> At this point, either my GPU is failing in a very mysterious way or something happened to the game.
> 
> I did Scan and Repair with no problems (also did SFC /SCANNOW in Windows, and the DISM image repair thing, no problems found through either method). I will try uninstalling and reinstalling the game next as well.
> 
> Beyond that, I may try a temporary new install of Windows on a separate removable drive to completely isolate the problem away from software other than the game (or my hardware).


Does anyone have any other ideas?


----------



## WayWayUp

neteng101 said:


> I wouldn't single it out to bugged runs - there's no scientific proof beyond theory and heresay at this point. Memory tuning and timings actually makes a significant difference in 3DMark runs with the 4090 - all 3 of my scores went down when I changed just 2 secondary memory timings recently. Twr 12 to 16, Trp 10 to 8. Just those 2 timings! Speed Way, Port Royal and Timespy Extreme scores were down noticeable enough that I wondered what went wrong. I proceeded to tun other secondaries towards extreme values but Twr was the biggest change in my 3DM scores. Nothing changed on the GPU side.
> 
> The V-cache seems to be affecting 5800X3D scores with the 4090 in a positive manner big time. Not surprised since the scores also moved with that movement in memory timings for me. Perhaps others will start testing and playing with other knobs and notice differences here - seems like system/cache/memory is a big factor for the 4090 given it has very high limits on the GPU side.


I don’t see how that is related to my post. Many results are legit,
but the highest ones are not.


----------



## Sheyster

raad11 said:


> I've been having this weird stuttering problem in Overwatch 2 as of 3 days ago. Here's my post about it:
> 
> Does anyone have any other ideas?


Have you installed this MS update yet? Link:






November 29, 2022—KB5020044 (OS Build 22621.900) Preview - Microsoft Support







support.microsoft.com





Long list of improvements (AKA bug fixes) including:



> It addresses an issue that affects the performance of some games and applications.


----------



## WayWayUp

J7SC said:


> Yeah, I find it laughable that someone would just make generalization about almost every HoF PR run by folks from all over the world. I have done multiple runs over 29k, some of which I subbed, others not yet - and some I will never sub because they were bugged (like that 30,337 ).


And I find it laughable how you misrepresented everything I typed.

I’ve literally said it twice, but I guess I’ll say it a 3rd time: 29k+ is definitely possible, and I have posted such a score in this very thread on the first day I got the card.

And guess what? The HoF scores are still trash and I will continue to disregard them. Yes, it’s a huge problem - they are having meltdowns over at HWBot, and 3DMark even wants to change how you can validate your 4090 scores.

And yes, ALL top-25 scores are either pro benchers or bugged, so there is no point or motivation, as you can’t beat a bugged-out score without one of your own.
And just a big FYI you guys can’t seem to understand: memory can, and often does, error out without visible artifacts. Just because you don’t see flickering everywhere doesn’t mean your score is legitimate, sorry to break it to you.

Finally, the memory scaling makes Port Royal a terrible benchmark this generation, irrespective of the other issues.
If you can’t post over +1500 memory, you will have a hard time with scores. Memory speed is too important, and this doesn’t track games, most workloads, or other benchmarks. You can run PR at 3,350 MHz and still have just a so-so score because your memory speed doesn’t hit a certain threshold. That is not representative of gaming.

I also realize you can bug out FS and TS, but it’s MUCH MUCH harder to do.
PR isn’t respected by many in the community anymore.


----------



## Betroz

So.... overclocking vram on 4090 is META and core not so important for gaming?


----------



## WayWayUp

It’s meta in some of the terrible benchmarks like Port Royal.
In actual gaming, both are very important; I would still say core clock is more important, but I’m sure it’s game dependent as well.


----------



## KingEngineRevUp

Betroz said:


> So.... overclocking vram on 4090 is META and core not so important for gaming?


Well here's a study of my daily drivers for gaming









Result







www.3dmark.com





You can see memory gives a pretty nice performance boost, at least in this application.

I ended up undervolting and overclocking though - UV+OC vs. stock:









Result







www.3dmark.com


----------



## Betroz

WayWayUp said:


> It’s meta in some of the terrible benchmarks like port royal
> In actual gaming both are very important and I would still say core clock is more important but I’m sure it’s game dependent as well


I only play BF2042 at the moment - at 3840x1600 ultrawide.


----------



## J7SC

WayWayUp said:


> And I find it laughable how you misrepresented everything I typed
> 
> I’ve literally said twice but I guess I’ll say it a 3rd time, 29k+ is definitely possible and I have posted such a score in this very thread on the first day I got the card.
> 
> And guess what? The HoF scores are still trash and I will still continue to disregard them. Yes it’s a huge problem, they are having meltdowns over at hwbot and 3dmark even wants to change how you can even validate your 4090 scores.
> 
> and yes ALL top25 scores are either pro benchers or bugged so there is no point or motivation as you can’t beat a bugged out score without one of your own
> And just a big fyi you guys can’t seem to understand that memory can… and often does.. error out without visible artifacts. Just because you don’t see flickering everywhere doesn’t mean your score is legitimate sorry to break it to you
> 
> finally the memory scaling makes port royal a terrible benchmark this generation irrespective of the other issues
> If you cant post over +1500 memory you will have a hard time with scores. Memory speed is too important and this doesn’t track games, most workloads, or other benchmarks. You can run PR at 3350Mhz and still have just a “so/so” score because your memory speed doesn’t hit a certain threshold. That is not representative of gaming
> 
> I also realize you can bug out FS and TS, but it’s MUCH MUCH harder to do
> PR isn’t respected by many in the community anymore


It's really quite simple...
You wrote, and I quote, "Almost the entire HoF top 100 is bugged runs" and I have not misrepresented that in any way in my post. In fact, I didn't respond when you wrote something like it last week in the post about your Fire Strike Ultra here...


Spoiler















...though noting your closing comment in that post, I did do my first Fire Strike Ultra runs with my 4090 shortly thereafter...


Spoiler














 
Finally, while I haven't subbed at HWBot for years, I used to do quite a lot of benching in the 'Elite league' (= sub-zero), with manufacturer support. I really don't think I need your 'advice' about VRAM and such.


Spoiler















Perhaps it is best for you to stay away from faulty generalizations, that's all.


----------



## GRABibus

Mad Pistol said:


> I don't think some people realize how capable of a gaming CPU the 5800x3D actually is. Regardless of the underlying chipset and DDR4, it is a power house, and it does so without the use of DDR5.
> 
> This isn't the "Bulldozer" AMD anymore... this is AMD at their best, taking an old architecture and turning the dial up to 11.
> 
> Admittedly, if you're looking at CPU compute benchmarks or the CPU portion of 3DMark, the 5800x3D looks underpowered even compared to the 5800x. But, that's part of its charm; its IPC in games is unparalleled. Gotta love 3D v-cache.


This is why I wait for 7900X3D or 7950X3D with a lot of impatience 😉


----------



## GRABibus

Gigabyte Gaming OC on the stock air cooler.
*ReBar is not forced* in TimeSpy.exe with Inspector.


















I scored 36 048 in Time Spy


AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





Of course, all the first places are with the 13900K, as it beats the 7950X by a lot in CPU score.

I am at least 15th in the HoF for graphics score with a single card:








3DMark Time Spy Graphics Score Hall of Fame


The 3DMark.com Overclocking Hall of Fame is the official home of 3DMark world record scores.




www.3dmark.com





Disabling SMT on the 7950X helps a lot with CPU score: +2,700 points versus SMT enabled.


----------



## Panchovix

GRABibus said:


> Gigabyte Gaming OC on the stock air cooler.
> *ReBar is not forced* in TimeSpy.exe with Inspector.
> 
> View attachment 2587330
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 36 048 in Time Spy
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Of course, All first places are with 13900K as it beats a lot 7950X at CPU score.
> 
> I am at least 15th in HOF with one single card at graphics score :
> 
> 
> 
> 
> 
> 
> 
> 
> 3DMark Time Spy Graphics Score Hall of Fame
> 
> 
> The 3DMark.com Overclocking Hall of Fame is the official home of 3DMark world record scores.
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Disabling SMT on 7950X helps a lot for CPU score : + 2700points versus SMT enabled.


41,300 on the stock air cooler with no ReBar forced is really impressive; really good card there, mate!


----------



## MrTOOSHORT

Better than my average Giga 4090 OC.


----------



## WayWayUp

Panchovix said:


> 41300 at stock air cooler and no rebar forced is really impressive, really good card there mate!


How much of a bump are people getting from forced ReBar anyway? I heard it improves PR by 200-300 points, but I'm not sure about Time Spy? I don't believe it helps in all the benchmarks, only select ones.

Excellent run by the way, Grabibus - good score 🤜🤛


----------



## neteng101

WayWayUp said:


> I don’t see how that is related to my post. Many results are legit
> But the highest ones are not


You saying that doesn't make them not legit. Without knowing all the factors behind a run, it's just speculation on your part. The person actually running it might know if their own results are not legit - but you can't prove the assumption that all those scores are illegitimate.

It's related to your post because you're making ASSumptions, basically, and sweeping generalizations.


----------



## alasdairvfr

WayWayUp said:


> how much bump up are people getting from forced rebar anyways? I heard it improves PR by 200-300 points but im not sure about timespy? i dont believe it helps in all the benchmarks only select ones
> 
> excellent run by the way Grabibus good score🤜🤛


My Timespy goes down with rebar on


----------



## WayWayUp

J7SC said:


> It's really quite simple...
> You wrote, and I quote, "Almost the entire HoF top 100 is bugged runs" and I have not misrepresented that in any way in my post. In fact, I didn't respond when you wrote something like it last week in the post about your Fire Strike Ultra here..


You don't need a bugged run to hit the leaderboards, but a huge percentage of the scores ARE indeed bugged - probably the majority. I completely stand by my statement

open your eyes













ouch der8auer, what a shame. Despite your volt mod from elmor, despite the heater you place on the backside of your card, despite the super cold temps and big clock frequencies, despite the unlocked power BIOS... you can barely crack 29k and don't even have a top 50 score..... you've got guys on air cards under 3GHz breathing down your neck, what a shame












nice score liam bro! I'm happy you made it to the Hall of Fame. That 2,889MHz average clock is so high it was a sure-fire guarantee to get you there. amazing clocks!


the leaderboards are littered with this. you dont need to be a rocket scientist to figure out that PR HoF has a huge problem right now


but it's more than just this that makes me disregard PR. We know this benchmark has always been a GPU test, but it's getting out of hand in terms of hardware. You have guys with sub-par CPUs and slow DRAM









i mean that's fine and all, but a top 100 score in the world with this kind of configuration? At least Time Spy requires you to have a balanced system. In the past it was nice since you didn't need to tweak and optimize your CPU to get a good run....but it at least required a really good CPU, not an i3

but then there is the VRAM, which just tips the scale and makes the benchmark dumb. Scores scale with VRAM, but to optimize the memory you actually don't want it cold.... which is in complete contrast to tuning your core clock, as that responds directly to cold. Just too many unique scenarios making PR a pretty bad benchmark this generation

there is just too much negative going on


----------



## Sheyster

WayWayUp said:


> And guess what? The HoF scores are still trash and I will still continue to disregard them. Yes it’s a huge problem, they are having meltdowns over at hwbot and 3dmark even wants to change how you can even validate your 4090 scores.


On a less serious note, I find the whole thing laughable. I'm tempted to get a group of people together to get the best possible bugged run and post it up on the leaderboard, then go public about it being the best bugged run. Maybe when it gets to be that much of a joke, they'll actually do something about it.


----------



## Sheyster

GRABibus said:


> Disabling SMT on 7950X helps a lot for CPU score : + 2700points versus SMT enabled.


Just curious, for gaming is it generally better to disable SMT on the 7950x or not?


----------



## WayWayUp

I'm not sure anyone has tested a full suite of games to give you a definitive answer. Right now it's a game-by-game basis.

Same concept as disabling HT on Intel - there are scenarios where it's beneficial, but nothing conclusive


----------



## GRABibus

Sheyster said:


> Just curious, for gaming is it generally better to disable SMT on the 7950x or not?


in a game like Spider-Man Remastered, where a CPU bottleneck is present even at 4K, even with a 7950X, it helps.
My GPU usage increased and so did fps.


----------



## raad11

Sheyster said:


> Have you installed this MS update yet? Link:
> 
> 
> 
> 
> 
> 
> November 29, 2022—KB5020044 (OS Build 22621.900) Preview - Microsoft Support
> 
> 
> 
> 
> 
> 
> 
> support.microsoft.com
> 
> 
> 
> 
> 
> Long list of improvements (AKA bug fixes) including:


I'm on Win 10 22H2. There's an optional quality update I may check out. Will also try Win 11 on a removable drive.

Quick Q for you all, this can't be a symptom of GPU failure, can it? As far as I know, GPUs that have hardware problems will crash, black screen, BSOD/reboot/etc but shouldn't randomly have irregular power usage/stuttering intermittently, right?

Edit: Also, enabling HAGS didn't fix problem. It got better for about a day, but problem came back. Sometimes leaving afterburner open in background (not even minimized) helps.

Also, the stutter is severe in that it drops to 200-ish FPS. When FPS cap is 600, it drops so quickly the display doesn't show it go below 400, but when I turn on G-Sync and have monitor's OSD on, it shows it is indeed dropping to at least 200-300 range.

I'm going to experiment more with other games today to make sure it's only Overwatch with the problem.


----------



## J7SC

neteng101 said:


> You saying that doesn't make them not legit. Without knowing all the factors behind the run, its just speculation on your part. The person actually running might know if their own results are not legit - but assuming all those scores are not legit, you just can't proof that.
> 
> Its related to your post because you're making ASSumptions basically and sweeping generalizations.


...yeah, take DerBauer's 4090 Strix results for example...per his own vids, his card developed an issue early on whereby it dropped from PCIe x16 to x8 - I ran into that with some older EVGA GTX cards, and even then it impacted scores, though EVBot and a special mobo (w/PEX) helped to reduce though not eliminate the impact....with the much more powerful RTX 4090 series, DerBauer actually figured he is losing 200 or more points. Of course he cannot put that information into the HoF score table....

BTW, he tried to fix it by 'heat treatment' and now it doesn't boot at all anymore, but there is still some hope as it at least showed up in Device Manager per iGPU boot-up



Sheyster said:


> On a less serious note, I find the whole thing laughable. I'm tempted to get a group of people together to get the best possible bugged run and post it up on the leaderboard, then go public about it being the best bugged run. Maybe when it gets to be that much of a joke, they'll actually do something about it.


...soooo let me get this straight, you want me to contribute to your collection of runs with issues (i.e. 29.9 to 30.3k in PR) I did not sub so that you can sub them to say it has issues ?  

In general, it is best to compete against your own last score to see if you can improve. There are way too many people who become obsessed about someone else's scores w/o knowing the setup and run conditions...


----------



## GRABibus

WayWayUp said:


> how much bump up are people getting from forced rebar anyways? I heard it improves PR by 200-300 points but im not sure about timespy? i dont believe it helps in all the benchmarks only select ones


Problem with Rebar in TS is that it kills the CPU score by 2800 points.
It increases the graphics score by 600 points.
So the overall score with forced Rebar is much lower than without Rebar.

*With forced Rebar :*








I scored 34 914 in Time Spy


AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com





*With no forced Rebar *:








I scored 36 048 in Time Spy


AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com
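For anyone wondering why +600 graphics can still mean a lower total: 3DMark's technical guide describes the Time Spy overall score as a weighted harmonic mean of the graphics and CPU sub-scores, so a big CPU drop drags the total down more than a small graphics gain lifts it. A minimal sketch - the 0.85/0.15 weights are from the published guide and may change between versions, and the sub-scores below are invented for illustration, not my actual runs:

```python
# Why +600 graphics / -2800 CPU lowers the Time Spy overall score:
# the overall is a weighted harmonic mean, which punishes the weaker sub-score.

def timespy_overall(gfx: float, cpu: float,
                    w_gfx: float = 0.85, w_cpu: float = 0.15) -> float:
    """Weighted harmonic mean of the graphics and CPU sub-scores."""
    return (w_gfx + w_cpu) / (w_gfx / gfx + w_cpu / cpu)

# Illustrative sub-scores (made up for the example):
no_rebar = timespy_overall(gfx=41300, cpu=22000)
rebar    = timespy_overall(gfx=41300 + 600, cpu=22000 - 2800)

print(round(no_rebar), round(rebar))  # the forced-Rebar run ends up lower overall
```

Because the harmonic mean is dominated by the smaller term, even a 15% CPU weight is enough for a ~2800-point CPU loss to outweigh a ~600-point graphics gain.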


----------



## neteng101

J7SC said:


> Of course he cannot put that information into the HoF score table....


This is why context is key and generalizations throwing out all high scores doesn't make any sense. We need to know the details to be able to claim that a score is bugged/not legit/etc. Only the person that runs the test knows that or can tell us about it.



> In general, it is best to compete against your own last score to see if you can improve. There are way too many people who become obsessed about someone else's scores w/o knowing the setup and run conditions...


This is sage advice... unlike WayWayUp who sounds like someone obsessed with other people's scores and wants to call all high scores a bug without full context. There is some value in comparing scores to others if we can understand their runs better like cooling, memory, CPU OC, other settings... it provides clues on how we might be able to improve our own scores but beyond that is just silliness.


----------



## PhuCCo

I should receive my Alphacool FE block in a couple of days. Will post results


----------



## WayWayUp

neteng101 said:


> This is sage advice... unlike WayWayUp who sounds like someone obsessed with other people's scores and wants to call all high scores a bug without full context. There is some value in comparing scores to others if we can understand their runs better like cooling, memory, CPU OC, other settings... it provides clues on how we might be able to improve our own scores but beyond that is just silliness.


we already know about his tiny performance loss via x8 instead of x16, he's made many videos on the topic. I've tested with x8 when I ran my m.2 in the 5.0 slot - extremely minimal score reduction. This doesn't take away from my point at all. He would still be in the 29k range with a custom volt-modded card with unlocked power BIOS and LN2.... meanwhile people using their air cards are getting as good or better scores than him. All it does is emphasize my point. He still has a top 10 score in Time Spy Extreme.. it's obviously not holding him back, and he hardly even ran other benchmarks - he spent the entire time trying to boost his Port Royal scores

and your second failed point is what makes PR HoF so stupid this generation

guy has lower core clock
lower memory clock
worse cpu
worse temperatures
3600mhz ram and low tier mobo.......

............scores higher than you still

what can you conclude from this? what insight are you looking for exactly?

yes it's a great tool to compare your own scores, but what does that have to do at all with the hall of fame?


----------



## neteng101

WayWayUp said:


> guy has lower core clock
> lower memory clock
> worse cpu
> worse temperatures
> 3600mhz ram and low tier mobo.......
> 
> ............scores higher than you still
> 
> what can you conclude from this? what insight are you looking for exactly?


I can only conclude that you continue making generalizations and assumptions - you can't tell without data logging the full run - effective clocks, etc. You're making even more ASSumptions without the full data and citing a single source - so his runs are in question, that doesn't mean everyone else's are bad.

Knowing someone's basic setup is far from enough - you don't know their curve, power limits, memory timings, etc.

Like I told you and you can't seem to comprehend - just changing 2 memory settings which will never show up in those results you're looking at as data points for the run - changes scores in all 3DMark benchmarks - that you will never know if I didn't tell you what I changed between my runs. That's real insight vs. all your hearsay theory.


----------



## mirkendargen

WayWayUp said:


> we already know about his tiny performance loss via x8 instead of x16 hes made many videos on the topic. Ive tested with x8 when i ran my m.2 in the 5.0 slot. extremely minimal score reduction. This doesnt take away from my point at all. he would still be in the 29k range with a custom volt modded card with unlocked power bios and Ln2.... meanwhile people using their air cards are getting as good or better scores than him. All it does is emphasize my point. He still has a top 10 score in Time Spy Extreme.. its obviously not holding him back and he hardly even ran other benchmarks he spent the entire time trying to boost is port royal scores
> 
> and your second failed point is what makes PR HoF so stupid this generation
> 
> guy has lower core clock
> lower memory clock
> worse cpu
> worse temperatures
> 3600mhz ram and low tier mobo.......
> 
> ............scores higher than you still
> 
> what can you conclude from this? what insight are you looking for exactly?
> 
> yes it's a great tool to compare your own scores, but what does that have to do at all with the hall of fame?


I could get 29k on an air-cooled card, that's plausible especially with a more recent driver. Somewhere between 29.5k-30k is where the line sits between plausibly real and definitely fake if not LN2. I have a sweet 31.5k run I haven't uploaded.


----------



## WayWayUp

neteng101 said:


> I can only conclude that you continue making generalizations and assumptions - you can't tell without data logging the full run - effective clocks, etc. You're making even more ASSumptions without the full data and citing a single source - so his runs are in question, that doesn't mean everyone else's are bad.
> 
> Knowing someone's basic setup is far from enough - you don't know their curve, power limits, memory timings, etc.


who cares about their power limits or curve. It tells you their average core clocks. They run 2900MHz average, you run 3200MHz average with the same effective memory speed. You both run DDR5 and your latency is 48ns. You know based on their MT/s that it's literally impossible for them to have lower latency than you

here's a quick tip: you will destroy them in any other benchmark, both on 3DMark and elsewhere, and it shows in their other results. Yet they still outscore you in Port Royal...
now you're just being stubborn and it's getting laughable.
im having a stupid conversation with you and it's not worth continuing
thanks and goodbye


----------



## WayWayUp

mirkendargen said:


> I could get 29k on an air-cooled card, that's plausible especially with more recent driver. Somewhere between 29.5k-30k is where the line between plausibly real and definitely fake if not ln2 is. I have a sweet 31.5k run I haven't uploaded.


Nice 👍🏻 good run
I think I posted something that was lost in translation
For the record I have an air card… and I already posted a legit run over 29k in this thread
I feel like you guys seem to think I said 29k was impossible on air or something..?
But I agree with you there is definitely a line for sure
And when you see people under 2.9Ghz with average memory speeds scoring around and over 29k it gets suspicious
I would also like to point out that the team at 3dmark have already removed tons of fake scores off the leaderboards already. It’s a big issue for them

anyways we buy these cards to play games not to run port royal 😀
Better we end this discussion but personally I will continue to disregard PR scores


----------



## Nizzen

WayWayUp said:


> Nice 👍🏻 good run
> I think I posted something that was lost in translation
> For the record I have an air card… and I already posted a legit run over 29k in this thread
> I feel like you guys seem to think I said 29k was impossible on air or something..?
> But I agree with you there is definitely a line for sure
> And when you see people under 2.9Ghz with average memory speeds scoring around and over 29k it gets suspicious
> I would also like to point out that the team at 3dmark have already removed tons of fake scores off the leaderboards already. It’s a big issue for them
> 
> anyways we buy these cards to play games not to run port royal 😀
> Better we end this discussion but personally I will continue to disregard PR scores


I posted scaling with PR a while ago. If the score scales perfectly for every 100 or 200MHz mem OC, you are good. When the scaling is suddenly 600 points better, then there is something "bogus". Pretty easy to verify - you just need to take the time to run PR 10 times with steps of memory OC.
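That check can be automated: collect (memory offset, score) pairs from those runs, fit a straight line, and anything that jumps well above the trend stands out. A rough sketch in pure Python - the scores are invented for illustration and the 300-point tolerance is an arbitrary choice, not anything Port Royal defines:

```python
# Flag Port Royal runs whose score deviates from the linear mem-OC trend.

def flag_outliers(runs, tolerance=300):
    """runs: list of (mem_offset_mhz, score). Fit score = a*offset + b by
    least squares and return the runs whose residual exceeds `tolerance`."""
    n = len(runs)
    sx = sum(x for x, _ in runs)
    sy = sum(y for _, y in runs)
    sxx = sum(x * x for x, _ in runs)
    sxy = sum(x * y for x, y in runs)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope
    b = (sy - a * sx) / n                           # intercept
    return [(x, y) for x, y in runs if abs(y - (a * x + b)) > tolerance]

# Made-up scores: smooth ~110 points per +200MHz, except one sudden jump.
runs = [(0, 28000), (200, 28110), (400, 28220), (600, 28330),
        (800, 29050),  # suspicious: ~600 points above the trend
        (1000, 28550)]
print(flag_outliers(runs))  # → [(800, 29050)]
```

The smooth runs sit within ~130 points of the fitted line, so only the sudden +600 jump gets flagged, which is exactly the "scaling is suddenly 600 points better" case.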


----------



## neteng101

WayWayUp said:


> im having a stupid conversation with you and it's not worth continuing
> thanks and goodbye


Yeah you still haven't provided any conclusive evidence to your wild theories.

Wrong wrong and still wrong.

You continue making assumptions based on limited data - instead of asking the real question of value - which is why someone can achieve those scores. Without knowing other details you are glossing over - you simply will never know.

Very well - go away then and live under that rock of yours cause you aren't interested in factual conversations.


----------



## Laithan

Y'all done yet?

🍿🍿🍿


----------



## yzonker




----------



## Panchovix

I was doing some tests on Cyberpunk with my TUF VBIOS vs Strix VBIOS and found something interesting.

With the same exact conditions, same core clocks and same mem clocks, with the strix VBIOS, it was 1.33% faster. I know it isn't much, but it makes me wonder why at same clocks (both reported and effective), the performance is different.



Spoiler: Pics comparison, 4K maxed with RT pshycho



TUF VBIOS 3015Mhz core - +1050 VRAM









Strix VBIOS 3015Mhz core - +1050 VRAM









-------
TUF VBIOS, 2805Mhz core at 0.995V, +1000 VRAM








Strix VBIOS, 2805Mhz core at 0.995V, +1000 VRAM








-------
TUF VBIOS, 2700Mhz core at 0.96V, +1000 VRAM.









Strix VBIOS, 2700Mhz core at 0.96V, +1000 RAM


----------



## kryptonfly

GRABibus said:


> New build 7950X + ASUS Strix X670E-E Gaming Wifi + 32GB DDR5 @ 6200MHz CL30.
> 
> GIGABYTE GAMING OC on stock air cooler.
> 
> View attachment 2586819
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 134 in Port Royal
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Average temperature = 53°C.
> 
> No significant gain versus my last build 5950X and 32GB DDR4 @ 3800MHz CL14.
> 
> Happy to be > 29100


Congrats  I just received the same Gigabyte 4090 Gaming OC 2 days ago, but mine heats up a lot compared to yours. I can't do more for now.
I scored 28 767 in Port Royal

That's with the side panel open, 100% fan and 18°C ambient. I'm waiting on a Bykski WB mainly for the space inside, but it should improve temps at the same time.


----------



## GRABibus

kryptonfly said:


> Congrats  I just received the same Gigabyte 4090 Gaming OC 2 days ago but mine heats up a lot compared to yours. I can't do more for now.
> I scored 28 767 in Port Royal
> 
> It's with side panel opened, 100% fan and 18°C ambient. I'm waiting a Bykski WB just for the place inside but it could improve temps in same time.


Is Rebar forced in Inspector ?
What is your delta between GPU temp and GPU hotspot ?


----------



## dk_mic

Panchovix said:


> I was doing some tests on Cyberpunk with my TUF VBIOS vs Strix VBIOS and found something interesting.
> 
> With the same exact conditions, same core clocks and same mem clocks, with the strix VBIOS, it was 1.33% faster. I know it isn't much, but it makes me wonder why at same clocks (both reported and effective), the performance is different.


Try running the benchmark repeatedly using the same settings.. I bet you will observe more than 1.33% run-to-run variance. Fluctuations in ambient temperature, CPU temps, background tasks, how the stars are aligned, etc. will probably have a bigger impact than TUF vs Strix BIOS
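To put a number on that: the coefficient of variation over a handful of identical runs tells you how big a delta has to be before it's signal rather than noise. A quick sketch with invented FPS numbers (not real Cyberpunk data):

```python
# How much does the same benchmark wander run to run? If the spread is on the
# order of ~1%, a single-run 1.33% "gain" between two BIOSes proves little.

from statistics import mean, stdev

def relative_spread(samples):
    """Coefficient of variation, in percent of the mean."""
    return 100.0 * stdev(samples) / mean(samples)

# Hypothetical average FPS from 8 identical runs of the built-in benchmark:
runs = [61.8, 62.4, 61.5, 62.9, 62.1, 61.2, 62.6, 62.3]
print(f"run-to-run spread ≈ {relative_spread(runs):.2f}% of the mean")
```

With these example numbers the spread comes out just under 1%, so a 1.33% single-run difference sits right at the noise floor - averaging several runs per BIOS is the only way to say anything.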


----------



## LtMatt

GRABibus said:


> Gigabyte Gaming CO on stock air cooler
> *Rebar is not forced* in Timepsy.exe with Inspector.
> 
> View attachment 2587330
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 36 048 in Time Spy
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Of course, All first places are with 13900K as it beats a lot 7950X at CPU score.
> 
> I am at least 15th in HOF with one single card at graphics score :
> 
> 
> 
> 
> 
> 
> 
> 
> 3DMark Time Spy Graphics Score Hall of Fame
> 
> 
> The 3DMark.com Overclocking Hall of Fame is the official home of 3DMark world record scores.
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Disabling SMT on 7950X helps a lot for CPU score : + 2700points versus SMT enabled.


Does ReBar hurt performance in Timespy? Nice score.


----------



## klotan

Anyone that have played with the GeForce RTX 4090 VENTUS 3X 24G OC card?


----------



## nathan.cachia1

Is running +1400 mem going to damage my card if I use for 24/7 use?


----------



## Nizzen

nathan.cachia1 said:


> Is running +1400 mem going to damage my card if I use for 24/7 use?


No, next question 🤓🤟


----------



## GRABibus

LtMatt said:


> Does ReBar hurt performance in Timespy? Nice score.


Thanks !
Yes, my gigabyte gaming OC is a very good sample.
Yes, Rebar on TS had a global negative effect (see below) :








[Official] NVIDIA RTX 4090 Owner's Club


It's really quite simple... You wrote, and I quote, "Almost the entire HoF top 100 is bugged runs" and I have not misrepresented that in any way in my post. In fact, I didn't respond when you wrote something like it last week in the post about your Fire Strike Ultra here.. You dont need a...




www.overclock.net


----------



## MrB123

My Bysky Giga OC block dont cover all components 100%
is it only my block that is like this ?
(Edit: temps is good it just dont look right)


----------



## HeadlessKnight

Today I tried to play some older games at 4K with 8x MSAA (AC Unity and Crysis 3), and I noticed the 4090 is barely any faster than my 3090 Ti - probably 15% faster at most. I know it is overkill to try 8x MSAA at 4K, but I thought the mighty RTX 4090 would have a much easier time in such old games. In Assassin's Creed Unity my fps tanks to 35-40 fps, something doesn't seem right. Could it be a memory bandwidth bottleneck? Both the 4090 and 3090 Ti have the same bandwidth, but the 4090 has 52% more cores and is barely 15% faster in such cases.


----------



## EEE-RAY

Those who own an ASUS card and have updated to the V2 BIOS via the asus update tool - can someone check for me if both bios got updated? I switched to the Q-BIOS (for the first time, I've never updated the Q bios) and found that it was the exact same (updated) bios version as the P mode!


----------



## Betroz

HeadlessKnight said:


> Today I tried to play some older games at 4k with 8x MSAA (AC Unity and Crysis 3) , and I noticed the 4090 is barely any faster than my 3090 Ti. Probably 15% faster at most. I know it is overkill to try 8x MSAA at 4k, but I thought the might RTX 4090 would have much easier time in such old games. In Assassin's Creed Unity my fps tanks to 35-40 fps, something doesn't seem right. Could it be a memory bandwidth bottleneck? since both the 4090 and 3090 Ti have same bandwidth, but the 4090 has 52% more cores, and it is barely 15% faster in such cases.


What CPU and RAM config you use? Crysis 3 is an old game now, and not very optimized for the newest hardware.


----------



## HeadlessKnight

Betroz said:


> What CPU and RAM config you use? Crysis 3 is an old game now, and not very optimized for the newest hardware.


5800X3D and DDR4-3800CL14 with tight timings. I know it is not a CPU bottleneck because when I turn MSAA off or reduce to 4xMSAA my fps jumps dramatically. 8xMSAA tanks the fps and is not much better than 3090 Ti.


----------



## yzonker

HeadlessKnight said:


> 5800X3D and DDR4-3800CL14 with tight timings. I know it is not a CPU bottleneck because when I turn MSAA off or reduce to 4xMSAA my fps jumps dramatically. 8xMSAA tanks the fps and is not much better than 3090 Ti.


Same for me in GTA5. 4x MSAA is pretty much locked at 120fps, 8x MSAA can drop as low as 60 fps which isn't much better than my 3090. I didn't benchmark them both, so I don't know exact numbers but it does seem like the 4090 falls down on that for some reason.


----------



## Sheyster

HeadlessKnight said:


> 5800X3D and DDR4-3800CL14 with tight timings. I know it is not a CPU bottleneck because when I turn MSAA off or reduce to 4xMSAA my fps jumps dramatically. 8xMSAA tanks the fps and is not much better than 3090 Ti.


Have you tried enabling MFAA in NVCP? That should help boost performance with old games that use MSAA.


----------



## shiokarai

Panchovix said:


> I was doing some tests on Cyberpunk with my TUF VBIOS vs Strix VBIOS and found something interesting.
> 
> With the same exact conditions, same core clocks and same mem clocks, with the strix VBIOS, it was 1.33% faster. I know it isn't much, but it makes me wonder why at same clocks (both reported and effective), the performance is different.
> 
> 
> 
> Spoiler: Pics comparison, 4K maxed with RT pshycho
> 
> 
> 
> TUF VBIOS 3015Mhz core - +1050 VRAM
> View attachment 2587415
> 
> 
> Strix VBIOS 3015Mhz core - +1050 VRAM
> 
> View attachment 2587416
> 
> -------
> TUF VBIOS, 2805Mhz core at 0.995V, +1000 VRAM
> View attachment 2587417
> 
> Strix VBIOS, 2805Mhz core at 0.995V, +1000 VRAM
> View attachment 2587418
> 
> -------
> TUF VBIOS, 2700Mhz core at 0.96V, +1000 VRAM.
> 
> View attachment 2587419
> 
> Strix VBIOS, 2700Mhz core at 0.96V, +1000 RAM
> View attachment 2587420


That's just run-to-run variance. ALSO: if you look closely, the benchmark varies slightly each time (there are different people standing by the bar, more/fewer of them, etc.)


----------



## tomerturbo

hello everyone, I have a question. I have an Asus ROG Strix 4090 OC.

my stock voltage is 1.050V. For more GPU overclock I need more voltage - when I push the slider in MSI Afterburner my voltage is 1.1V. Is it safe for 24/7? Can it damage the card? Can it cause GPU degradation?

thanks for the help


----------



## lawson67

Edit


----------



## lawson67

tomerturbo said:


> hello everyone I have a question i have asus rog strix 4090 oc
> 
> my stock voltage is 1.050 for more gpu overclock i need more voltage when i push the slider in msi afterburner my voltage is 1.1 does it safe for 24\7? can it damged the card? can it casue gpu degradation?
> 
> thnaks for the help


No, it should be fine - plus it won't be running at that 24/7, as the card will downclock; it only uses max voltage under load like gaming or benchmarks. Generally you only use the max voltage slider when overclocking to the max, and it will still downclock after your gaming session etc. It takes the voltage from 1050mV to 1100mV, which is safe


----------



## kryptonfly

GRABibus said:


> Is Rebar forced in Inspector ?
> What is your delta between GPU temp and GPU hotspot ?


Yes it is forced for Port Royal, I enabled the 3 options, is it right ?








I have almost +10°C over yours. I turned on the PC, launched Port Royal, and it reached 64°C within a minute with 100% fan speed and the side panel open, and the run still failed, *ambient 16°C.*
The delta is 10.6°C. Is it normal that the V/F curve maxes out at 3000MHz? I use SMI because of the temp, but it's not accurate. Do you enable HAGS "Hardware accelerated GPU scheduling"?







Timespy :
I scored 37 858 in Time Spy

To give an idea in real world, FFXIV Bench 4K maximum, ambient 17°C :







Seems there's something wrong with paste/pads. The PCB is a little bent near of the 12VHPWR. My Bykski WB is coming anyway, I hope it will fix temps.


----------



## LtMatt

Has anyone tried playing Dying Light 2, with 1.100v set and running at 2.9-3Ghz core clock? If so, have you monitored effective clock speed in HWINFO64 at the same time? As I notice that game seems to drop effective clock speed much lower than some other titles. Anywhere up to 100Mhz lower core clock than what MSI AB reports at times. This is with 600W strix BIOS and max power limit. Guess it's something to do with RT and the much higher power draw in this game as it gets close to 550W at times.


----------



## GRABibus

kryptonfly said:


> Yes it is forced for Port Royal, I enabled the 3 options, is it right ?
> View attachment 2587480
> 
> I have almost +10°C than you, I turned on the PC, launched Port Royal and it reached 64°C in the minute, 100% fan speed, side panel opened and failed, *ambient 16°C.*
> The delta is 10.6°C. Is it normal the V/F curve is 3000 mhz max ? I use SMI because of the temp but it's not accurate. Do you enable HAGS "Hardware accelerated GPU scheduling" ?
> View attachment 2587481
> 
> Timespy :
> I scored 37 858 in Time Spy
> 
> To give an idea in real world, FFXIV Bench 4K maximum, ambient 17°C :
> View attachment 2587482
> 
> Seems there's something wrong with paste/pads. The PCB is a little bent near of the 12VHPWR. My Bykski WB is coming anyway, I hope it will fix temps.


I have "Hardware accelerated GPU scheduling" .
Drivers in NVCP on high performances.
I use v2.3.0.13 of Nvidia Inspector with below screenshot settings:










Here are the links. There are more recent versions:








Releases · Orbmu2k/nvidiaProfileInspector


Contribute to Orbmu2k/nvidiaProfileInspector development by creating an account on GitHub.




github.com





I will try the last one.


My 29134 points score was done at 17°C ambient.
Also my very nice TS score (Graphics scored, because my 7950X won't beat your 13900K in CPU score) :








I scored 36 048 in Time Spy


AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com


----------



## zGunBLADEz

Panchovix said:


> I was doing some tests on Cyberpunk with my TUF VBIOS vs Strix VBIOS and found something interesting.
> 
> With the same exact conditions, same core clocks and same mem clocks, with the strix VBIOS, it was 1.33% faster. I know it isn't much, but it makes me wonder why at same clocks (both reported and effective), the performance is different.


This ain't new, it happened on Pascal too, for example with the XOC BIOS. Just because it has higher clocks doesn't mean anything. It happens on AMD as well. Power!!!! It's power starved - it needs power, so even though it says it has higher clocks... it's "performing SLOWER". Also possible: different VBIOS, different VRAM timings.

On another note, wwu

But the HoF is "bugged" lolz


----------



## kryptonfly

GRABibus said:


> I have "Hardware accelerated GPU scheduling" .
> Drivers in NVCP on high performances.
> I use v2.3.0.13 of Nvidia Inspector with below screenshot settings:
> 
> View attachment 2587492
> 
> 
> Here arte the links. There are more recent versions :
> 
> 
> 
> 
> 
> 
> 
> 
> Releases · Orbmu2k/nvidiaProfileInspector
> 
> 
> Contribute to Orbmu2k/nvidiaProfileInspector development by creating an account on GitHub.
> 
> 
> 
> 
> github.com
> 
> 
> 
> 
> 
> I will try the last one.
> 
> 
> My 29134 points score was done at 17°C ambient.
> Also my very nice TS score (Graphics scored, because my 7950X won't beat your 13900K in CPU score) :
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 36 048 in Time Spy
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


Thanks  I have the latest version of Profile Inspector. It seems we almost have the same card; mine heats up more, around +8/10°C. Above 60°C it's hard to stabilize, but the worst is GPU Boost, which drops 30MHz because of temp. I have power "normal" in NVCP. I've just tested memtest_vulkan for the VRAM - I can do +1680MHz, it doesn't pass +1700. Are temps and power normal for this kind of test?


----------



## WayWayUp

Can’t wait to get up and running again. It’s actually a p7 not a p5 so I have to install the 2 side panels

my gpu block came in 🙌🏻
it’s going to be an overkill build since I bought an external 1080 rad that will go behind this. Ideally I would give 2 rads to the cpu, but the way I’m building it, my gpu will get 2 pumps, 2 res and 2x420 plus a 1080 lol

I will change it eventually but I wanted the middle to be one color and the sides another


----------



## WayWayUp

neteng101 said:


> Yeah you still haven't provided any conclusive evidence to your wild theories.
> 
> Wrong wrong and still wrong.
> 
> You continue making assumptions based on limited data - instead of asking the real question of value - which is why someone can achieve those scores. Without knowing other details you are glossing over - you simply will never know.
> 
> Very well - go away then and live under that rock of yours cause you aren't interested in factual conversations.


neteng,

I'm not trying to be rude or start a heated argument. We can have an intelligent discussion if you acknowledge that many scores simply are not legit and that it's easy to bug out scores. I'll go ahead and post the most conclusive and indisputable evidence for my claim:










To the left is the #1 score holder and arguably the best overclocker in the world. To the right is a random user with a liquid-cooled card, who happens to hold the #4 score on the leaderboard because he got his card to error out in one of the tests.

How does he have 194.56 fps in graphics test #1 when OGS only has 167.39 fps?

Ask yourself why there is such a huge outlier.

Guess what, I don't need to see his "curve", I don't need an "analysis of the data" or anything else you suggested. Even a 4090 Ti couldn't get 194 fps.

Why is this a "crazy theory"?

This is NOT just speculation. You act like we need to be forensic scientists and request their computers for inspection or something.

Why is it so hard for you to understand that 3DMark is littered with fake scores because of the memory bug? This is not a crazy claim when the team at 3DMark recognizes and acknowledges it. And you're so stubborn because you're obviously proven wrong but too prideful to let it go. There isn't a single person on this forum except you who would deny this.


----------



## Betroz

@Nizzen 
Tested my new 4090 TUF OC in BF2042 at 3840x1600 Low, and as expected my 10900K with 4133C15 tweaked memory is a bottleneck. Full Ultra settings are more GPU-bound, but even then my CPU is not enough. The card boosts to 2700 MHz at stock and stays cool and quiet (playing BF2042).

So yeah, I need a new CPU, motherboard and RAM to fully utilize the 4090, that's for sure! I will probably wait for the Ryzen 7000 3D parts and see if they match a 13900K with fast DDR5 before I upgrade.



Spoiler: The PCI-E cable is touching the sidepanel but I can close it, and that is in this Phanteks P500A


----------



## neteng101

WayWayUp said:


> why is it so hard for you to understand that 3dmark is littered with fake scores because of memory bug? this is not a crazy claim when the team at 3dmark recognizes and acknowledges it. and your so stubborn because your obviously proven wrong but too prideful to let it go. There isnt one single person on this forum except for you that would deny this


Your claim and I quote


> Almost the entire HoF top 100 is bugged runs.


 - I challenged that. You have provided evidence of two scores here: one legit, one bugged, based on your analysis. The burden of proof is still on you to back up your claims. It's not people being stubborn with you; it's your sweeping generalizations that are the crux of the problem. If you want to, go ahead and provide us an analysis of all top-100 runs and which ones are bugged and why. Then we might believe you.

Just because you can spot some bugged runs does not make Port Royal bad and mean that the majority of the high scores need to be tossed out. You just don't have enough data to back up that claim. You're acting like Igor did - one bad cable assembly doesn't mean that's the entire problem - and in the end Igor got a lot of egg on his face for jumping to conclusions on what caused the melted connectors.

I'm not questioning that some results are not accurate. Just not willing to toss out the whole pool of results without conclusive evidence. I'm also interested to know more about what can trigger or cause inaccurate scores, which is why I keep saying to you not to gloss over the information you don't know and jump to conclusions. If this was easy - UL would just fix the benchmark to prevent further bad runs.

I'll leave you with this last bit where I kept telling you that sometimes you need more details, these runs are almost identical in every way but the score is noticeably different...









Result (www.3dmark.com)





The only thing that changed was 2 secondary memory timings. Yes, these benchmarks can be affected by factors not seen in the results summary.


----------



## yzonker

neteng101 said:


> Your claim and I quote - I challenged that. You have provided evidence of two scores here, one legit, one bugged based on your analysis. The burden of proof is still on you to back up your claims. Its not people being stubborn with you - its your sweeping generalizations that are the crux of the problem. If you want to, go ahead and please provide us the analysis of all top 100 runs and which ones are bugged and why. Then we might believe you.
> 
> Just because you can spot some bugged runs does not make Port Royal bad and mean that the majority of the high scores need to be tossed out. You just don't have enough data to back up that claim. You're acting like Igor did - one bad cable assembly doesn't mean that's the entire problem - and in the end Igor got a lot of egg on his face for jumping to conclusions on what caused the melted connectors.
> 
> I'm not questioning that some results are not accurate. Just not willing to toss out the whole pool of results without conclusive evidence. I'm also interested to know more about what can trigger or cause inaccurate scores, which is why I keep saying to you not to gloss over the information you don't know and jump to conclusions. If this was easy - UL would just fix the benchmark to prevent further bad runs.
> 
> I'll leave you with this last bit where I kept telling you that sometimes you need more details, these runs are almost identical in every way but the score is noticeably different...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Result (www.3dmark.com)
> 
> 
> 
> 
> 
> The only thing that changed was 2 secondary memory timings. Yes, these benchmarks can be affected by factors not seen in the results summary.


No way 2 mem timings made 300 pts in Speed Way. Something else had to have changed. It's nowhere near that sensitive to memory timings unless they were totally borked. I bet it won't change that much if you set your RAM to JEDEC.
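If anyone wants to check this kind of claim themselves, the cheapest sanity test is to repeat the run a few times per setting and compare the score delta against the run-to-run spread. A minimal sketch in Python (the scores below are made-up placeholders, not anyone's actual runs, and `delta_exceeds_noise` is just my name for the check):

```python
from statistics import mean, stdev

def delta_exceeds_noise(baseline, candidate, k=2.0):
    """True if the mean score difference between two settings is larger
    than k times the worst run-to-run standard deviation."""
    spread = max(stdev(baseline), stdev(candidate))
    return abs(mean(candidate) - mean(baseline)) > k * spread

# Hypothetical Speed Way scores: 5 runs each at tWR=24 and tWR=16
twr24 = [10884, 10890, 10879, 10888, 10881]
twr16 = [10898, 10895, 10902, 10891, 10899]

print(delta_exceeds_noise(twr24, twr16))  # ~13-point gap vs ~5-point noise
```

By this yardstick, a 300-point jump on a benchmark whose repeat runs normally land within a handful of points of each other is far outside noise, so either something real changed or the run bugged out.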


----------



## bmagnien

Portal with RTX is now available on Steam. Tons of settings in the developer menu if you press Alt+X. I can't, however, get my MSI Afterburner/RivaTuner OSD to show up in game. I've had this happen with some other games, and closing AB or just spamming the key I have assigned for the OSD toggle would eventually work, but I can't get it working here. Anyone have an idea on how to force it to work with a given game? I didn't even want to play the game, just wanted to watch my OSD stats.


----------



## WayWayUp

I backed off the claim of "majority" and switched to "a ton of them". My point in saying that was to emphasize how big a problem it is and how unmotivated I would personally be to climb the ladder, hence why I disregard it. I can't know whether the person ahead of me has a borked score or not, and I'd be wasting my time chasing scores when I don't know how much of a boost they got.
That's my intent for the post, not to suggest that they are all fake results. Even minor memory errors toward the end don't need to boost a score by 1000 points; even a 200-300 point jump from minor memory errors would be enough to flip the rankings on their head.

I don't think you realize how easy it is to get a borked score. I could show you with my own scores how easily I can make one (once I finish my current build) and nobody would know. It's something I can reliably recreate, and I know because I have multiple fake scores that I chose not to validate.

In contrast, it's still possible in Firestrike but easily discernible, because there are 4 different tests so it's easy to spot an outlier.

And it's incredibly difficult in Time Spy Extreme. That's why I focus on those 2 benches and "disregard" Port Royal. Do you see what I'm saying now?


----------



## WayWayUp

bmagnien said:


> New Portal with RTX is available in Steam. Tons of setting in the developer menu if you press Alt+X. I can't however get my MSI Afterburner/RivaTuner OSD to show up in game. I've had this happen with some other games, and closing AB or just spamming the key I have assigned for the OSD toggle would eventually work, but can't get it working here. Anyone have an idea on how to force it to work with a given game? I didn't even want to play the game, just wanted to watch my OSD stats


I'm excited to download it.

I will try it with DLSS 3.0.

I thought DLSS 3 would be terrible based on Hardware Unboxed's review of it, but when I tried it in A Plague Tale I found it to actually be really good. I couldn't discern any difference or artifacts, just that my fps exploded. The fps was far beyond my 4K monitor's refresh rate unfortunately, but in Portal RTX it could be super useful.

I think DLSS 3 is being downplayed by the media... it can really be a game changer.


----------



## yzonker

bmagnien said:


> New Portal with RTX is available in Steam. Tons of setting in the developer menu if you press Alt+X. I can't however get my MSI Afterburner/RivaTuner OSD to show up in game. I've had this happen with some other games, and closing AB or just spamming the key I have assigned for the OSD toggle would eventually work, but can't get it working here. Anyone have an idea on how to force it to work with a given game? I didn't even want to play the game, just wanted to watch my OSD stats


Did you try hitting the windows key while it's running? I've had that work in some stuff.


----------



## Mad Pistol

bmagnien said:


> New Portal with RTX is available in Steam. Tons of setting in the developer menu if you press Alt+X. I can't however get my MSI Afterburner/RivaTuner OSD to show up in game. I've had this happen with some other games, and closing AB or just spamming the key I have assigned for the OSD toggle would eventually work, but can't get it working here. Anyone have an idea on how to force it to work with a given game? I didn't even want to play the game, just wanted to watch my OSD stats


Same issue here. Can't get the overlay for MSI Afterburner to work.

As for the game itself, it absolutely wrecks GPUs, RTX 4090 included. I tried 4K with no DLSS or Frame Generation, and it feels like 15-20 FPS. Set DLSS to Auto and Frame Generation on and it runs great. This game/demo is brutal.

There are going to be many unhappy RTX 2000 and 3000 owners out there.


----------



## neteng101

yzonker said:


> No way 2 mem timings made 300pts in Speedway. Something else had to have changed.


I can swap the timings and get repeatable results across multiple runs. Not just SW; this difference happens in PR and TSE too. It blows my mind too. While I did change two settings, I believe it's almost all tWR. After I found this I tried setting other secondaries down to the Extreme recommendations for DDR4, and that didn't really change scores much if at all.



WayWayUp said:


> and its incredibly difficult in timespy exteme. thats why i focus on those 2 benches and "disregard" port royal. do you see what im saying now?


Yeah I do - personally I don't bother chasing the HOF rankings; I just want to be able to tweak my own setup for daily peak performance. What I can learn from others and their setups is which variables I can tweak to further improve my own, so I want as many details as I can get.

Personally, Port Royal has been the bane of my existence - be it a Titan Xp, a bunch of Ampere cards and now the 4090, I always land much further from the top of the leaderboard in Port Royal. Surprisingly, Speed Way hasn't been the same struggle.


----------



## bmagnien

yzonker said:


> Did you try hitting the windows key while it's running? I've had that work in some stuff.


No dice - might be related to fake fullscreen - but when I go into the legacy Valve display settings in game (not the new Nvidia settings menu) and make changes there, it crashes to desktop.


----------



## WayWayUp

But Portal RTX is full-blown path tracing, yeah? It's impressive we can even run this at all in 2022. In interviews from 10 years ago, developers were saying it might take 50 years to achieve this.
And yes, Portal isn't even that demanding a game, hence why they used it as the showcase.

Nvidia needs another ~240% increase in RT performance.

Maybe 2 generations from now, with the RTX 6090, we can have modern PC games fully path traced. But RT improvements need to keep outpacing raster by a large margin, like they did this generation.


----------



## Krzych04650

LtMatt said:


> Has anyone tried playing Dying Light 2, with 1.100v set and running at 2.9-3Ghz core clock? If so, have you monitored effective clock speed in HWINFO64 at the same time? As I notice that game seems to drop effective clock speed much lower than some other titles. Anywhere up to 100Mhz lower core clock than what MSI AB reports at times. This is with 600W strix BIOS and max power limit. Guess it's something to do with RT and the much higher power draw in this game as it gets close to 550W at times.


Some games do that. Effective clock is generally a bit of a mystery. You would think it would track temperature, power or the type of load, but there don't seem to be any patterns. For example, you could assume that RT-heavy titles do it - Dying Light 2 and Cyberpunk do - but then Control has a very high effective clock. Same with power draw: I've seen games with very high power draw run a higher effective clock than games drawing much less. I guess it depends on how, and which parts of, the die are being hit; there is no way for us to verify or test that at all.
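One cheap way to at least log the commanded clock alongside power and temperature per game is to poll nvidia-smi. Note that `clocks.gr` is the requested core clock, not the "effective clock" HWiNFO derives from actual cycles, so this only gives you a timestamped log to line up against HWiNFO's column. A rough sketch (function names are mine):

```python
import subprocess
import time

def parse_smi_line(line):
    """Parse one nvidia-smi CSV line like '2745 MHz, 512.34 W, 64'
    into (clock_mhz, power_w, temp_c)."""
    clk, pwr, temp = [f.strip() for f in line.split(",")]
    return int(clk.split()[0]), float(pwr.split()[0]), int(temp)

def poll(seconds=10, interval=1.0):
    """Print requested core clock, board power and temperature once per interval."""
    query = ["nvidia-smi",
             "--query-gpu=clocks.gr,power.draw,temperature.gpu",
             "--format=csv,noheader"]
    for _ in range(int(seconds / interval)):
        out = subprocess.check_output(query, text=True).strip()
        print(time.strftime("%H:%M:%S"), parse_smi_line(out))
        time.sleep(interval)

if __name__ == "__main__":
    poll()
```

Run it while the game is loaded, then diff the timestamps against HWiNFO's effective-clock log to see in which titles the two diverge.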


----------



## Belcebuu

Well, I managed to get the 4090 FE in the UK. Any recommendations, guys? Should I buy a new 12VHPWR cable, or is the one provided by Nvidia fine as long as I make sure it's fully seated?
I plan to do a bit of undervolting, which seems to be the best thing ever: losing maybe 8% performance but gaining much better power consumption.
Thanks


----------



## LtMatt

Krzych04650 said:


> Some games do that. Generally effective clock is a bit a mystery. One would think that it is going to be based off temperature, power or maybe the type of load, but it doesn't look like there are any patterns. For example you could get an assumption that RT heavy titles do that, like for example Dying Light 2 and Cyberpunk do, but then Control has very high effective clock. For power draw the same thing, I've seen games with very high power draw have higher effective clock than ones with much lower one. I guess it depends on how and which parts of the die are being hit, there is no way for us to verify or test that at all.


Good answer, thanks.


----------



## neteng101

yzonker said:


> No way 2 mem timings made 300pts in Speedway. Something else had to have changed.


Quick test on my other system, far from a 4090. Only tightened tWR from 16 to 12. Just need to wait for UL to approve the new drivers (527.56 just came out) and make that Legendary count!


----------



## bmagnien

In lieu of the in-game OSD, the next best thing: 1440p windowed, DLSS off, frame gen on, ultra settings - pushing the card to its max at around 110 fps.


----------



## ALSTER868

Is it worth paying an extra $150-170 for the Aorus Master over the Gaming OC? I've seen lots of feedback on the Gaming OC and nothing about the Aorus.
On paper the Aorus looks good, but will that translate into benefits over the already quite good Gaming OC?

Thanks


----------



## mirkendargen

ALSTER868 said:


> Is it worth to pay extra $150-170 for the Aorus Master over the Gaming OC? Seen lots of feedback on Gaming OC and nothing about Aorus.
> On paper Aorus looks good, but will that translate into benefits compared to quite good Gaming?
> 
> Thanks


Exact same PCB with a few more power stages filled in. Unknown how much if any better the cooler is.


----------



## ALSTER868

mirkendargen said:


> Exact same PCB with a few more power stages filled in. Unknown how much if any better the cooler is.


Yep, I know about the stronger VRM; it's close to the Strix in fact. But would that translate into better OC results at all? That's what I'm wondering. Cooling should be similar to the Gaming OC imo. There's no feedback on this card.


----------



## mirkendargen

ALSTER868 said:


> Yep, I know about stronger VRM, it is close in fact to Strix. But how would that get into better OC results if at all, that's why I'm wondering. Cooling should be likewise to G-OC imo. No feedback on this card.


Silicon lottery is going to be the determining factor in OC results, not the AIB brand/design. There are garbage-clocking Strixes and awesome-clocking Zotacs.


----------



## yzonker

neteng101 said:


> Quick test on my other system, far from a 4090. Only tightened tWR 16 to 12. Just need to wait for UL to update verified drivers (527.56 just came out) and make that Legendary count!
> 
> View attachment 2587534
> 
> 
> View attachment 2587535


Wow, that's really interesting. I still can hardly believe it. I'll have to test this for myself. I think I actually have tWR set fairly loose, but that may be as good as I could get it. Have to test it again. Thanks.


----------



## alasdairvfr

Hey, I have a new issue today. I saw some people getting good frame rates in certain games using Nvidia Surround on the 4090, so I tried spanning my 3x 4K screens. Two of them are 60 Hz and the other is 144 Hz. When I span all three, Surround defaults to 30 Hz. I've isolated one screen as the culprit: it runs at 60 Hz on its own, but in a 2x1 setup with either of the others it drops to 30 Hz.

The Samsung is running 60 Hz according to NVCP; the OS reports 59.997.

When I try to add the Samsung to the Surround group, the refresh drops to 30 Hz. Two monitors without the Samsung: 60 Hz. 3x1 is 30 Hz with no option to change it. Anyone seen this before? I think I had it working properly in the past; not sure if it's a driver issue. Any ideas?


----------



## KingEngineRevUp

PhuCCo said:


> I should receive my Alphacool FE block in a couple of days. Will post results


Can you take pictures along the way of the install process?


----------



## KingEngineRevUp

WayWayUp said:


> who cares about their power limits or curve. it tells you their average clocks for core. they run 2900Mhz average you run 3200Mhz average with the same effective memory speed.


The problem is, 3DMark doesn't tell us their average "effective clocks."

It's kind of hard to compare runs without that data. It still gives us a general idea, but not the whole story.


----------



## Cobra26

What do you guys think of this card:

MSI Suprim X RTX 4090?

Is it a good choice, in particular for build quality and the VRM used?
The consensus seems to be: don't get a Gigabyte.
From the research I've done, the MSI Suprim X is a pretty decent choice, correct?


----------



## EarlZ

I have a Gigabyte Aorus Master 4090. Does anyone know if it's possible to disable fan-stop while keeping a low fan speed like 30%? I've noticed that even at 40% fan speed the fans stop and then start up again in an endless cycle.


----------



## th3illusiveman

Tried Portal RTX. The game looks good, but nowhere near as good as it should for the hardware it demands. I think this is one of the worst games to showcase ray-traced lighting, since the whole game world is designed around uniform lighting.

You don't get the beautiful contrast and indirect lighting bounce effects you see in, for example, Metro Exodus Enhanced Edition or even Quake II RTX. The MSI OSD doesn't work, but my C1 has a game menu which was showing between 70-100 fps with frame gen on, and it played well enough.

The card was chewing up around 380 watts max with my undervolt.


----------



## jediblr

Cobra26 said:


> What do you guys' think of this card:
> 
> MSI Suprim X RTX 4090?
> 
> Is this a good choice in particular build quality and VRM used?


Take it - superb card, good VRM and memory, cool and quiet.


----------



## yzonker

neteng101 said:


> Quick test on my other system, far from a 4090. Only tightened tWR 16 to 12. Just need to wait for UL to update verified drivers (527.56 just came out) and make that Legendary count!
> 
> View attachment 2587534
> 
> 
> View attachment 2587535


Just as I expected on my machine, no change.

tWR = 24









I scored 10 884 in Speed Way: Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10 (www.3dmark.com)





tWR = 16









I scored 10 898 in Speed Way: Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10 (www.3dmark.com)





I forgot to screenshot tWR = 24, but that's where it was. I'm glad you mentioned this anyway as I think I had meant to go back and test it and never did.


----------



## GRABibus

kryptonfly said:


> Thanks  I have the latest version of Profil Inspector, it seems we almost have the same card, mine heats more around +8/10°C, starting +60°C it's hard to stabilize but the worst is GPU boost which decreases 30 mhz because of temp, I have power "normal" in NVCP. I've just tested the memtest vulkan for VRAM, I can do +1680mhz, doesn't pass +1700. Are temps and powers normal for this kind of test ?
> View attachment 2587496


I never did it 😊


----------



## Mad Pistol

th3illusiveman said:


> Tried Portal RTX, game looks good but nowhere near as good as it should for the hardware it demands. I think this is one of the worst games to try to use Raytraced lighting on since the whole game world design intent is around uniform lighting.
> 
> You don't get the beautiful contrast and indirect lighting bounce effects you see in Metro Exodus Enhanced edition for example or even Quake RTX. MSI OSD doesn't work by my C1 has a game menu which was showing between 70-100 fps with frame gen on and it played well enough.
> 
> Card was chewing up around 380 watts max w/ my undervolt.


Considering that all lighting, reflections, and shadows are being calculated in real time, it's actually very impressive. The issue is that much of it could be approximated by hand by an artist, and the original already looks pretty good.


----------



## Nizzen

*PG27AQDM*


----------



## Cobra26

jediblr said:


> take it, superb card, good vrm and mem , cool and quiet


Thanks man.
After lots of research, I think a good VRM is a must for a card that lasts long.
I've narrowed it down to either the MSI Suprim X 4090 or the Asus Strix 4090.


----------



## th3illusiveman

Nizzen said:


> *PG27AQDM*


Should be amazing for esports titles with that lightning-quick response time.

For me though, anything smaller than 38" is a no-go. My 48" screen is so immersive in racing games and flight sims.


----------



## Nizzen

th3illusiveman said:


> Should be amazing for esports titles with that lighting quick response time.
> 
> For me though, anything smaller than 38" is a no go for me. My 48" screen is so immersive in racing games and fligh sim.


I like 34" 3440x1440 the best; that's why I'm using the Alienware 34" OLED 175 Hz. Over 38" is too big for me. In most games I like ultrawide the best. Maybe I will upgrade the Asus 360 Hz monitor for Quake gaming.


----------



## Mad Pistol

Nizzen said:


> *PG27AQDM*


LG is releasing one as well (It's a WOLED panel).

Still waiting for a 4K 240hz model.


----------



## Nizzen

Mad Pistol said:


> Still waiting for a 4K 240hz model *and rtx 5090 😉


----------



## J7SC

Mad Pistol said:


> LG is releasing one as well (It's a WOLED panel).
> 
> Still waiting for a 4K 240hz model.


...that's how it goes, isn't it. For years I was hoping for a 4K 120 Hz OLED panel - I finally got the C1 48" OLED late last spring and love it! At the time, my water-cooled 3090 Strix was pushing all the pixels it could, but now with a well-running 4090 I could use a 4K OLED panel above 120 Hz.

---

Some benching today, along with older scores (not all of them submitted yet). The Time Spy Graphics would actually get me to 15th or so in the HoF, but the rest of the score isn't so hot as I only used one CCX. The TS-X is from a few days ago and is only a marginal improvement over my last submitted runs. The Speed Way 11104 is actually submitted at 3DMark but won't show in the HoF as SystemInfo was off... generally speaking, I get better final scores without SystemInfo - it takes too long between runs and the VRAM cools down. The rest of the Speed Way scores do have SystemInfo on, though. Needless to add, the last one won't get submitted - it had the 'disco lights' variety of artifacts... looked kind of cool, though.

...might do some more 'cold benching' (lowering ambient from 26°C to 18°C or so) for benchies that are more sensitive to core than VRAM... haven't gotten around to installing the heat pads on the VRAM area...


----------



## Mad Pistol

J7SC said:


> ...that's how it goes, isn't it. For years I was hoping for a 4K120Hz OLED panel - and finally got the C1 48 OLED late last spring - and love it ! At the time, my w-cooled 3090 Strix was pushing all the pixels it could nicely, but now with a well-running 4090, I could use a 4K OLED panel > 120 Hz .


Yep. Love my LG CX. 4K120 has never been better on my 4090... but I want MOAR!!!! 

1440p @ 240hz ain't gonna cut it. Gotta go the full 4K240.


----------



## newls1

Cobra26 said:


> Thanks man,
> Lot's of research for getting a card that lasts long i think good VRM is a must.
> Narrowed it down to either the MSI Suprim X 4090 or Asus Strix 4090.


Something wrong with the Gigabyte OC card?


----------



## Roacoe717

Swapped out the Gigabyte OC; it had problems. Ended up getting a Gaming X Trio. I wish they would have kept NVLink.


----------



## Demian1880

How do I get this score in TS Graphics???
Can anyone tell me the right settings?

3DMark Time Spy Graphics Score Hall of Fame (www.3dmark.com)


----------



## neteng101

yzonker said:


> Just as I expected on my machine, no change.


Sorry to send you down a rabbit hole - I retested 3 tWR settings (24, 16, 12) and got a measly 8-point spread in Speed Way scores between the 3 runs. Basically no change. At least this time I have a smoking gun for how I got the low result earlier today: booted up, updated drivers, ran the test without restarting. Then I changed tWR in the BIOS and the next run was much higher. I still don't know exactly what changed on my main desktop that affected the scores.


----------



## Desecrated

Hello everyone,

Does anyone have the 600 W BIOS for the AORUS GeForce RTX 4090 XTREME WATERFORCE (rev. 1.1)? I tried to look for it, but only found 95.02.18.40.2D (older) and 95.02.18.80.A5 (newer).
I'm currently on 95.02.18.80.A5, which is only 500 W for some stupid reason...

I'd rather not flash anything from another brand onto it, just in case of warranty issues.
I'd appreciate it if anyone can dump their rev. 1.1 BIOS with 600 W.
Thanks in advance!
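Before hunting for BIOS dumps, it's worth confirming what the flashed vBIOS actually reports as its limits; `nvidia-smi` exposes the current, default and maximum board power. A small sketch, assuming `nvidia-smi` is on PATH (the helper names are mine):

```python
import subprocess

def parse_limits(csv_line):
    """Parse '450.00 W, 450.00 W, 600.00 W' into floats (current, default, max)."""
    return tuple(float(f.strip().split()[0]) for f in csv_line.split(","))

def power_limits():
    """Query the board power limits of GPU 0 via nvidia-smi."""
    out = subprocess.check_output(
        ["nvidia-smi",
         "--query-gpu=power.limit,power.default_limit,power.max_limit",
         "--format=csv,noheader"],
        text=True).strip().splitlines()[0]
    return parse_limits(out)

if __name__ == "__main__":
    cur, default, mx = power_limits()
    print(f"current {cur} W / default {default} W / max {mx} W")
```

On a 500 W-capped vBIOS, `power.max_limit` should read 500 W regardless of what the cooler could handle, so this is a quick way to verify which BIOS you actually ended up on after a flash.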


----------



## yzonker

neteng101 said:


> Sorry to send you down a rabbit hole - I retested 3 tWR settings 24, 16, 12 and got a measley 8 point spread in Speedway scores between the 3 runs. Basically no change. At least this time I have a hot smoking suspect on how I got the low results earlier today - booted up, updated drivers, ran the test without restarting. Then I changed tWR in BIOS, next run was much higher. I still don't know exactly what changed on my main desktop that affected the scores.


Oh no problem. Like I mentioned, I had tWR set too loose anyway and needed to fix it. I'm stability testing now.


----------



## kryptonfly

GRABibus said:


> I never did it 😊


You should  (all stock, except VRAM +1700 obviously; I also set 100% fan speed, but that's up to you. I checked temps & power with GPU-Z at the same time).
Releases · GpuZelenograd/memtest_vulkan
Stupid question, but how do you post your links here? Mine are just plain flat text, but yours and others' show an icon with a description. You see what I mean?


----------



## Nizzen

Demian1880 said:


> How to get this score in TS graphics???
> Can anyone tell me the good setting for me?
> View attachment 2587609
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3DMark Time Spy Graphics Score Hall of Fame (www.3dmark.com)


You can't


----------



## Betroz

Nizzen said:


> I like 34" 3440x1440 the best, that's why I'm using Alienware 34" Oled 175hz. Over 38" is too big for me. Most games, I like ultrawide the best. Maybe I will upgrade the Asus 360hz monitor for Quake gaming


I hope you won't get any burn-in on that OLED monitor... I have the LG 38GN950-B and yes, it is a little big - or should I say too wide, because the vertical size is great with the 1600 pixels.

What settings in Afterburner are you using for your 4090 cards, if I may ask?


----------



## J7SC

ALSTER868 said:


> Is it worth to pay extra $150-170 for the Aorus Master over the Gaming OC? Seen lots of feedback on Gaming OC and nothing about Aorus.
> On paper Aorus looks good, but will that translate into benefits compared to quite good Gaming?
> 
> Thanks


...the only review of the Aorus Master I have seen is by der8auer (see vid below). The Gigabyte Gaming OC and Aorus Master seem to have identical PCBs (the same water-block models fit and are tagged for both), but the Master has a different stock cooler which includes an LCD screen. The warranty card included with my Gaming OC actually only said 'Aorus'...


----------



## Nizzen

Betroz said:


> I hope you won't get any burn-in on that OLED monitor... I have the LG 38GN950-B and yes it is little big - or should I say too wide, cause the vertical size is great with the 1600 pixels.
> 
> What settings in Afterburner are you using for your 4090 cards if I may ask?


Stock 🤣🤟


----------



## Betroz

Nizzen said:


> Stock 🤣🤟


I thought you had a waterblock and were running 3100 core / +1700 mem by now 
As for me, I am CPU limited, so... yeah.


----------



## bsch3r

Is the 1000 W XOC vBIOS for the 4090 Strix available for download anywhere in the meantime?


----------



## Nizzen

bsch3r said:


> Is this 1000W xoc vbios for the 4090 Strix somewhere out to download in the meantime?


No, only for Strix users with the Elmor tool connected.


----------



## bsch3r

Nizzen said:


> No, only for strix users with Elmor tool connected


Well, I have a Strix 4090 and the Elmor tool, but no XOC vBIOS...


----------



## J7SC

neteng101 said:


> Quick test on my other system, far from a 4090. Only tightened tWR 16 to 12. Just need to wait for UL to update verified drivers (527.56 just came out) and make that Legendary count!
> 
> View attachment 2587534
> 
> 
> View attachment 2587535


...yes, there goes another secret  ...I actually run *tWR at 10* on my DDR4 kit (4x Samsung B-die). I have run that kit at DDR4-4000, but only when raising SoC voltage more than I like and with loosened timings... no net gain from the extra bandwidth. I'm actually undervolting the RAM a bit with the daily settings below.











Nizzen said:


> No, only for strix users with Elmor tool connected


...I'd just like a vBIOS that puts a bit more juice into the VRAM to warm it up in a water-cooled setup... got to check Elmor's catalogue... oh how I miss the EVBot days.


----------



## pat182

Portal RTX makes my Strix go all-in, pulling nearly 600 watts, and yet it stays around 65°C on the stock cooler with the stock fan profile lol. This card is a beast.


----------



## Faltzer

Desecrated said:


> Hello everyone,
> 
> Does anyone have the AORUS GeForce RTX 4090 XTREME WATERFORCE (rev. 1.1) bios with the 600W? I tried to look for it, but only found 95.02.18.40.2D (older) and 95.02.18.80.A5 (newer)
> Currently on 95.02.18.80.A5 which is only 500w for some stupid reason...
> 
> I rather not flash anything else from any other brand on it due to the warranty just in case.
> Appreciate it if anyone can dump their rev. 1.1 with 600w.
> Thanks in advance!


I am also interested in this. I did flash a 600W bios from the Strix onto my 4090 Waterforce; there were some gains which I am currently trying to quantify. The fans, however, stopped spinning at idle, as opposed to the default 30% spin of the stock Waterforce vbios.


----------



## yzonker

Anybody try this bios? 666w. It may be 2x16pin though.

VGA Bios Collection: GALAX RTX 4090 24 GB | TechPowerUp


----------



## Nizzen

yzonker said:


> Anybody try this bios? 666w. It may be 2x16pin though.
> 
> VGA Bios Collection: GALAX RTX 4090 24 GB | TechPowerUp


It IS 2x 16 pin 
You try it and report back to us


----------



## Nizzen

Error 404


----------



## WayWayUp

AMD Radeon RX 7900 XTX & RX 7900 XT 3DMark GPU Benchmarks Leak, Trading Blows With The NVIDIA RTX 4080


The first 3DMark GPU benchmarks of AMD's Radeon RX 7900 XTX & RX 7900 XT graphics cards based on RDNA 3 architecture have leaked out.




wccftech.com





Disappointing result. The 4080 is so cut down, with a narrow bus; AMD's halo card should beat it, right? AMD needs to go back to the drawing board.



Not sure we will get a 4090 Ti now if the 4090 is so far ahead. If we do get something, it might not be until Q1 2024.


----------



## mouacyk

WayWayUp said:


> AMD Radeon RX 7900 XTX & RX 7900 XT 3DMark GPU Benchmarks Leak, Trading Blows With The NVIDIA RTX 4080
> 
> 
> The first 3DMark GPU benchmarks of AMD's Radeon RX 7900 XTX & RX 7900 XT graphics cards based on RDNA 3 architecture have leaked out.
> 
> 
> 
> 
> wccftech.com
> 
> 
> 
> 
> 
> disappointing result. the 4080 is so cut down and narrow bus. amd's halo card should beat it right? amd needs to go back to the drawing board
> 
> 
> 
> not sure we will get a 4090ti now, if the 4090 is so so far ahead. if we do get something it might not be until q1 2024


would be sad if true


----------



## J7SC

yzonker said:


> Anybody try this bios? 666w. It may be 2x16pin though.
> 
> VGA Bios Collection: GALAX RTX 4090 24 GB | TechPowerUp


I need to find one w/ 777 W and 1x 16pin


----------



## PLATOON TEKK

Suprim x doesn’t seem to have an i2c header, unlike the Strix. Good to know.


----------



## J7SC

PLATOON TEKK said:


> View attachment 2587770
> 
> Suprim x doesn’t seem to have an i2c header, unlike the Strix. Good to know.


Nice VRM, though. What do you think? Any experience with the Suprim X 600W bios on other 4090 models?


----------



## PLATOON TEKK

J7SC said:


> Nice VRM, though. What do you think / have any experience with the Suprim X 600W bios on other 4090 models ?


It has the best power delivery of all the 4000-series cards; shame I can't easily add voltage with the EVC2. I'm about to mount the Phanteks block and might use the Elmor hot plate to keep the VRMs warm. Will let you know how it goes.

It has the right controller, just lacks the damn headers. Shame, really.


----------



## Sheyster

yzonker said:


> Anybody try this bios? *666w*.


 

Yeah, think I'll skip this one. Not necessarily for the satanic max PL, but the fact it's a dual 12VHPWR card.


----------



## yzonker

Sheyster said:


> Yeah, think I'll skip this one. Not necessarily for the satanic max PL, but the fact it's a dual 12VHPWR card.


Yea it may limit at ~333w. I might try it though when I have time.


----------



## newls1

Would flashing my Giga Gaming OC to this 666W bios NOT be advised because it'll be looking for 2x 16-pins?


----------



## bmagnien

newls1 said:


> Would flashing my Giga Gaming OC to this 666w bios NOT be advised casue itll be looking for 2 16pins?


I advise it  Post your results


----------



## newls1

bmagnien said:


> I advise it  Post your results


I'm stuck at work!


----------



## yzonker

newls1 said:


> Would flashing my Giga Gaming OC to this 666w bios NOT be advised casue itll be looking for 2 16pins?


I doubt it'll soft brick your card, but may or may not work properly. Just have to test.


----------



## mirkendargen

newls1 said:


> Would flashing my Giga Gaming OC to this 666w bios NOT be advised casue itll be looking for 2 16pins?


I highly doubt it would physically damage your card, worst case you'd have to use your iGPU to flash it back.


----------



## newls1

The more I think about it, the more I think it will do nothing at all for performance, because I'm voltage limited, NOT wattage limited...


----------



## yzonker

newls1 said:


> the more i think about it, the more I think it will do nothing at all for perofrmance cause im voltage limited NOT wattage limited..


It would have to have some other benefit. Better core/mem OC, higher effective clock, etc...


----------



## Faltzer

I would like to share some research that I have done on the Gigabyte 4090 Waterforce.

I have tried all the vbioses; the best one was from Gigabyte itself, but the Gigabyte OC version from the other card they sell.

It yielded the best results, about 550 W usage at max (tried Unigine Heaven and the 3DMark tests).

It also gave the best overclockability; with the other vbioses I got artifacts and crashes at certain overclocks, while with the Gigabyte OC I managed to stably use +190 core / +1390 mem.

I could go higher, but then it glitches (artifacts @ Heaven benchmark).


----------



## wirx

Can you share a link or tell us which Gigabyte BIOS exactly it was?


----------



## Faltzer

wirx said:


> Can you share link or tell which gigabyte BIOS exactly it was?











Gigabyte RTX 4090 VBIOS


24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory




www.techpowerup.com


----------



## J7SC

Faltzer said:


> I would like to share some research that I have done on the gigabyte 4090 waterforce.
> 
> I have tried all the vbioses, the best one was from gigabyte itself, but the gigabyte OC version of the other card they sell.
> 
> It yielded the best results, about 550wat usage at max (tried unigine heaven and the 3dmark tests).
> 
> It also gave the best overclockability, with the other vbioses I got artifacts and crashes at certain overclocks, with the gigabyte OC I managed to stabely use +190 core + 1390 mem.
> 
> I could go higher, but then it glitches (artifacts @Heaven benchmark).
> 
> 
> View attachment 2587793


After looking at some memtest_vulkan results (mind you, very limited data, so grain of salt), it seems that the Gigabyte-G-OC bios uses tighter memory timings at a fixed frequency compared to for example the Strix vbios on the same card.

---

Trying to take 5K shots and fitting it all into a collage, following some minor visual updates...


----------



## Faltzer

J7SC said:


> After looking at some memtest_vulkan results (mind you, very limited data, so grain of salt), it seems that the Gigabyte-G-OC bios uses tighter memory timings at a fixed frequency compared to for example the Strix vbios on the same card.
> 
> ---
> 
> Trying to take 5K shots and fitting it all into a collage, following some minor visual updates...
> View attachment 2587810


Nice PC you got there. The memtest_vulkan results you stumbled upon, what do they mean for us?


----------



## J7SC

Faltzer said:


> Nice PC you got there, the memtest vulkan results you stumpled upon, what do they mean for us?


..."different memory timings'...based on the limited results, the Giga-G-OC vbios has _slightly_ higher GB/s at the same frequency.


----------



## Riadon

If only we could tweak VRAM memory timings like normal RAM. At least with locking voltage they have the excuse of "we don't want users to accidentally damage the card."


----------



## bmagnien

for those like myself unable to enable AB OSD in Portal RTX, download the new RTSS beta: MSI AB / RTSS development news thread

Worked like a charm for me.


----------



## KedarWolf

Faltzer said:


> Gigabyte RTX 4090 VBIOS
> 
> 
> 24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory
> 
> 
> 
> 
> www.techpowerup.com


Might want to install the BIOS, and then run the BIOS update tool from the Gigabyte support site to get the V2 BIOS.


----------



## SilenMar

I was just wondering if the engineers at Gigabyte have tuned the VRAM timings so tight that it results in a total shutdown once the PC goes into sleep mode.

Each time the PC powers back on, there is a good chance the GPU didn't recover from it.

During high load there is also a good chance the GPU will reset to a safe mode with lower than 1500 MHz frequency. Only disabling and re-enabling the GPU in Device Manager can make it function again.


----------



## J7SC

KedarWolf said:


> Might want to install the BIOS, and then run the BIOS update tool from the Gigabyte support site to get the V2 BIOS.


Do you / does anyone know what the differences are between the V1 and V2 ?


----------



## Faltzer

J7SC said:


> Do you / does anyone know what the differences are between the V1 and V2 ?


I know that with the original Waterforce firmware, the official bios update decreased the wattage and probably also the performance. Here it says increased compatibility, but they would never write in the notes stuff that deters users (like decreased watts).


----------



## Nico67

J7SC said:


> Do you / does anyone know what the differences are between the V1 and V2 ?


Asus V1 and V2 are the same, V2 updater just works on more platforms. V2.1 is the ECC enable fix.


----------



## WayWayUp

Can someone explain to me the secret sauce for speedway??
Obviously it’s benefiting from something else that I’m not aware of. Rebar?
It doesn’t match my performance in any other benchmark


----------



## mouacyk

WayWayUp said:


> Can someone explain to me the secret sauce for speedway??
> Obviously it’s benefiting from something else that I’m not aware of. Rebar?
> It doesn’t match my performance in any other benchmark


It's a fickle SOB


----------



## mirkendargen

J7SC said:


> Do you / does anyone know what the differences are between the V1 and V2 ?


V2 already has that black screen on boot fix that Nvidia released a BIOS patcher for, it's possible that's the only change. You can tell because Nvidia's patcher will report the BIOS is already patched.


----------



## Panchovix

WayWayUp said:


> Can someone explain to me the secret sauce for speedway??
> Obviously it’s benefiting from something else that I’m not aware of. Rebar?
> It doesn’t match my performance in any other benchmark


It's really, really VRAM intensive, though I think your card had a good VRAM OC? RAM speed also kinda bottlenecks it. Also, in my testing, any small CPU "throttle" will get you less FPS in Speed Way.

More than that, if you have good VRAM you will easily be near the top, since that's where the bottleneck is (you can even try higher core clocks than in Time Spy/Fire Strike and it won't crash).


----------



## J7SC

Nico67 said:


> Asus V1 and V2 are the same, V2 updater just works on more platforms. V2.1 is the ECC enable fix.


Tx. On the Giga-G-OC original bios, ECC works and no black screen on my setup, so I guess I'll keep it for now.

BTW, the latest NVidia driver seems to allow for higher VRAM speed settings...only had a few test runs, but in both cases, plus one if not two speed bins for the VRAM


----------



## Panchovix

J7SC said:


> Tx. On the Giga-G-OC original bios, ECC works and no black screen on my setup, so I guess I'll keep it for now.
> 
> BTW, the latest NVidia driver seems to allow for more VRAM speed setting...only had a few test-runs but in both cases, plus one if not two speed bins for the VRAM


How many MHz do you use for your VRAM bins? I mean, that makes me hopeful (a little), but I know I crashed the other day anyway with +1150MHz instead of +1100MHz lol


----------



## J7SC

Panchovix said:


> How much Mhz do you use for your bins on VRAM? I mean, that makes me hopeful (a little), but I know I crashed the other day anyways with +1150Mhz instead of +1100Mhz lol


...really depends on temps - I did 'such a good job' with thermal putty on VRAM for the waterblock that VRAM is usually just in the high 30s to 40 C range, which is too low. By kicking up ambient and/or running VRAM intensive apps right before, I can manage... usually at least + ~1500 or so, a bit more or less depending on where the most efficiency is re. the specific app in question.


----------



## yzonker

Well, may not be useful, but Vrel in the Kombustor 4K donut. Can't hit the power limit. lol

Edit: 2nd image


----------



## Panchovix

J7SC said:


> ...really depends on temps - I did 'such a good job' with thermal putty on VRAM for the waterblock that VRAM is usually just in the high 30s to 40 C range, which is too low. By kicking up ambient and/or running VRAM intensive apps right before, I can manage... usually at least + ~1500 or so, a bit more or less depending on where the most efficiency is re. the specific app in question.


I tried and no luck, and probably my VRAM is also degraded now haha (I have been doing AI stuff like a lot, every day, for more than a month); at +1100, Vulkan memtest crashes when before it was stable  (Though now I'm using the Strix VBIOS instead of the TUF VBIOS, but I doubt they have different memory timings)
In games it seems fine, but welp, prob gonna have to see if someone here in Chile knows how to repair GDDR6X VRAM (I don't have warranty anymore)


----------



## Sheyster

Faltzer said:


> Gigabyte RTX 4090 VBIOS
> 
> 
> 24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory
> 
> 
> 
> 
> www.techpowerup.com


FWIW, that's the original GB-OC BIOS, there is a newer F2 version out there. I think it has been shared, somewhere in this very long thread!


----------



## Sheyster

KedarWolf said:


> Might want to install the BIOS, and then run the BIOS update tool from the Gigabyte support site to get the V2 BIOS.


I believe this is the F2 (V2) version:









Gigabyte RTX 4090 VBIOS


24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory




www.techpowerup.com


----------



## J7SC

Panchovix said:


> I tried and no luck, and probably now my VRAM is also degraded haha (have been doing AI stuff like a lot every day for more than 1 month), at +1100 Vulkan memtest crashes when before it was stable  (Though now I'm using the Strix VBIOS instead of the TUF VBIOS, but I doubt they have different memory timings)
> On games seems fine, but welp, prob gonna have to see if someone here in Chile knows to repair GDDR6X VRAM (I don't have Warranty anymore)


...you could get your hands on some 24 Gbps (instead of 21 Gbps) Micron GDDR6X and get the chap below to swap it for you...if only I could get my hands on an RTX 6000 Ada, I would get both the VRAM and chip 'upgraded'


----------



## Panchovix

J7SC said:


> ...you could get your hands on some 24 GB/s (instead of 21 GB/s) Micron GDDE6X and get the chap below to swap it for you...if only I could get my hands on a RTX 6000 Ada, I would get both the VRAM and chip 'upgraded'


If only I knew where to buy them D:

I guess this is the 21 Gbps GDDR6X that the 4090 uses: MT61K512M32KPA-21

And this is the 24 Gbps one: MT61K512M32KPA-24

No idea where to buy enough chips for one card though lol


----------



## bmagnien

yzonker said:


> Well may not be useful but Vrel in Kombustor 4k donut.  Can't hit the power limit. lol
> 
> Edit: 2nd image
> 
> View attachment 2587848
> 
> View attachment 2587849


not gonna show us your max hotspot temp at that 676w peak???


----------



## yzonker

bmagnien said:


> not gonna show us your max hotspot temp at that 676w peak???


Low 60's with 50C core.


----------



## yzonker

Galax bios is definitely faster in TSE for me. Easily my highest graphics score. Something weird going on with my cpu score though. Not the bios as I tried the Strix again with the same low result. Something unrelated I think.

Edit: and no chiller or heater. Just ambient









I scored 20 216 in Time Spy Extreme


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## Panchovix

yzonker said:


> Galax bios is definitely faster in TSE for me. Easily my highest graphics score. Something weird going on with my cpu score though. Not the bios as I tried the Strix again with the same low result. Something unrelated I think.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 20 216 in Time Spy Extreme
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


It says the result is hidden, but damn, with you already saying it's better, I trust you lol.

Is it like the other XOC VBIOS with no downclocking at idle, etc., or no issues?


----------



## yzonker

Panchovix said:


> It says the result is hidden, but damn with already you saying that it's better, I trust lol.
> 
> It is like the other XOC VBIOS with no downclocking at idle, etc; or no issues?


Oh, here's the problem. And I fixed my CPU issue. Now if 3DMark would put the Galax in their database.


----------



## Panchovix

yzonker said:


> Oh, here's the problem. And I fixed my CPU issue. Now if 3DMark would put the Galax in their database.
> 
> View attachment 2587858


WOW, amazing score man! Does it downclock at idle? If yes, I'm going to flash it now LOL. Though, do games work well and such?


----------



## yzonker

Aaaah yea, suck it down Asus/Giga/MSI. No artifacts, no chiller, no heater. lol +210/+1750


----------



## yzonker

Panchovix said:


> WOW amazing score man! Does it downclock at idle? If yes I'm going to flash it now LOL. Though, does games work good and such?


I haven't done anything other than run benchmarks yet. And I have "prefer high performance" set, so I can't tell if it downclocks either. I'll go back to my normal OS and check. Give me a minute. No point in running more benches with it not recognized by 3DMark.

Edit: yes it idles. Just a normal bios with some magic apparently...


----------



## Sheyster

yzonker said:


> I haven't done anything other than run benchmarks yet. And I have prefer high performance set, so can't tell if it downclocks either. I'll back to my normal OS and check. Give me a minute. No point in running more benches with it not recognized by 3DMark.
> 
> Edit: yes it idles. Just a normal bios with some magic apparently...


I'm tempted to try it out for gaming at default settings. I've already hit 498w with the ASUS Strix V2 BIOS at defaults during a 2 hour MW2/WZ2 session.


----------



## mirkendargen

That Galax BIOS doesn't mess around...


----------



## Panchovix

yzonker said:


> I haven't done anything other than run benchmarks yet. And I have prefer high performance set, so can't tell if it downclocks either. I'll back to my normal OS and check. Give me a minute. No point in running more benches with it not recognized by 3DMark.
> 
> Edit: yes it idles. Just a normal bios with some magic apparently...


Really thanks man! Well, time to install another VBIOS lol

EDIT: 


mirkendargen said:


> That Galax BIOS doesn't mess around...
> 
> View attachment 2587866


But I guess I should be cautious; I'm on air, so I can expect 90-100°C on the hotspot at 660W+


----------



## yzonker

No point in running benchmarks at all until that bios is recognized. Gonna just have to run everything again anyway.


----------



## J7SC

mirkendargen said:


> That Galax BIOS doesn't mess around...
> 
> View attachment 2587866


...the Galax for single or dual 12VHPWR ?


----------



## mirkendargen

J7SC said:


> ...the Galax for single or dual 12VHPWR ?











GALAX RTX 4090 VBIOS


24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory




www.techpowerup.com





HOF, so dual. Seems to work fine though, and my PSU backs up the power numbers so it isn't halved or something goofy.


----------



## yzonker

mirkendargen said:


> GALAX RTX 4090 VBIOS
> 
> 
> 24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory
> 
> 
> 
> 
> www.techpowerup.com
> 
> 
> 
> 
> 
> HOF, so dual. Seems to work fine though, and my PSU backs up the power numbers so it isn't halved or something goofy.


Only a single connector shows up in GPUZ. Probably why it works.


----------



## mirkendargen

Yeah, either this BIOS or the new Nvidia driver is a winner (I haven't benched before now on the new driver). My effective clocks at 1.1V were up to 80-100MHz lower than my reported clocks with all previous BIOSes; with the Galax one they are only 20-30MHz lower.


----------



## yzonker

mirkendargen said:


> Yeah either this BIOS or the new Nvidia driver are a winner (I haven't benched before now on the new driver). My effective clocks at 1.1V were up to 80-100Mhz lower than my reported clocks with all previous BIOSes, with the Galax one they are only 20-30Mhz lower.


The runs I posted were on the older driver.


----------



## KedarWolf

The HOF BIOS boosts past the limit you set in Afterburner; at 3150 it was boosting to 3190 and would crash my benchmark.

At 3135 in Afterburner, the bench finished though.

Got my best-ever EZBench with it.

Default settings, 1080p, sound off, 7950X, ASUS Strix OC RTX 4090 with the GALAX HOF OC Lab Plus 666W BIOS, 16-pin power connector fed by three cables to the PSU.

130060


----------



## J7SC

mirkendargen said:


> GALAX RTX 4090 VBIOS
> 
> 
> 24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory
> 
> 
> 
> 
> www.techpowerup.com
> 
> 
> 
> 
> 
> HOF, so dual. Seems to work fine though, and my PSU backs up the power numbers so it isn't halved or something goofy.


...nice ! My Seasonic Pr.Plt.PX1300 should handle it fine. 

...had these on my back-up thumb-drive (not subbed). Not as crazy as the bugged PortRoyal 30,337 (also not subbed) but they could make a nice desktop background . I wonder what crazy numbers the Galax will generate...


----------



## Panchovix

KedarWolf said:


> HOF BIOS boosts past the limit you set it Afterburner, at 3150 it was boosting to 3190 and would crash my benchmark.
> 
> At 3135 in Afterburner, the bench finished though.
> 
> Got my best-ever EZBench with it.
> 
> Default settings, 1080p, sound off, 7950x, ASUS Strix OC RTX 4090 with GALAX HOF OC Lab Plus 666w BIOS on 3 to PSU 16-pin power connector.
> 
> 130060
> 
> View attachment 2587867


So basically about ~195W per cable, so not that bad at all. Pretty nice score there mate!
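The per-cable figure above is just the total draw divided over the adapter's PSU-side cables. A minimal sketch, assuming an even split (real current sharing is only approximately even) and a ~585 W total draw as the worked number:

```python
# Back-of-the-envelope check of the per-cable estimate: total card draw
# divided evenly across the PSU-side cables of the 16-pin adapter.
# The even split and the 585 W total are assumptions for illustration.
def watts_per_cable(total_watts: float, n_cables: int) -> float:
    """Average load per PSU cable, assuming an even split."""
    return total_watts / n_cables

print(watts_per_cable(585, 3))  # → 195.0
```

Each 8-pin PSU cable is typically rated for well above that, which is why a three-cable adapter is comfortable even near the 666 W cap.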


----------



## bmagnien

holy moly...683w...11.7c hot spot delta


----------



## J7SC

mirkendargen said:


> Yeah either this BIOS or the new Nvidia driver are a winner (I haven't benched before now on the new driver). My effective clocks at 1.1V were up to 80-100Mhz lower than my reported clocks with all previous BIOSes, with the Galax one they are only 20-30Mhz lower.


...the new NVidia driver definitely allows for higher VRAM speeds (2 to 3 steps) on my setup, but it is not really faster overall (nor slower). They might have just moved the ECC speeds up with it, but I'm just guessing. 

My biggest challenge (and per your earlier HWInfo re. Galax, yours too) is to get VRAM temp up. VRAM likes it at 25 C ambient and up, while the core likes it at 20 C ambient and below - and they cohabitate next to each other on the same PCB . I was hoping the Galax would put more heat into the VRAM.


----------



## mirkendargen

mirkendargen said:


> Yeah either this BIOS or the new Nvidia driver are a winner (I haven't benched before now on the new driver). My effective clocks at 1.1V were up to 80-100Mhz lower than my reported clocks with all previous BIOSes, with the Galax one they are only 20-30Mhz lower.


The higher effective clock speeds netted me about 1% better than my previous best (non-bugged) PR score. Nothing amazing to write home about, but definitely a tangible improvement.


----------



## yzonker

J7SC said:


> ...the new NVidia driver definitely allows for higher VRAM speeds (2 to 3 steps) on my setup, but it is not really faster overall (nor slower). They might have just moved the ECC speeds up with it, but I'm just guessing.
> 
> My biggest challenge (and per your earlier HWInfo re. Galax, yours too) is to get VRAM temp up. VRAM likes it at 25 C ambient and up, while the core likes it at 20 C ambient and below - and they cohabitate next to each other on the same PCB . I was hoping the Galax would put more heat into the VRAM.


I don't think an XOC bios will help. These new chips just don't pull any significant power. You can easily see the difference by locking the core at 1100mV on both the 4090 and the 3080 Ti/3090. The 4090 sees a small ~2C increase in the VRAM temp; closer to 10C on the 3080 Ti/3090. I tested this a while back with my 3080 Ti and that's what I found anyway.


----------



## J7SC

yzonker said:


> I don't think an XOC bios will help. These new chips just don't pull any significant power. You can easily see the difference by locking the core at 1100mv on both the 4090 and 3080ti/3090. The 4090 sees a small ~2C increase the VRAM temp. Closer to 10C on the 3080ti /3090. I tested this a while back with my 3080ti and that's what I found anyway.


...I was thinking about the Galax vbios re. VRAM temps, but if the driver would do it...I'd take it. Testing some ECC now w/ memtest_vulkan...for now just speeds (and speed losses), but when I get around to it, I'll check if ECC brings relatively higher operating temps compared to non-ECC at the same ambient. For now, ECC and non-ECC temps look to be about the same...


----------



## bmagnien

This passed my naked-eye artifact test multiple times. It's right in line with @yzonker 's +210/+1750; this run was +240/+1690. That being said...a +240/+1700 run that had visible artifacts left and right scored 29,737 lol. I feel like naked-eye artifact testing can't really be trusted either...even if you were to believe a random internet stranger saying 'this passed my naked-eye artifact test multiple times.' But this new driver/Galax combo is definitely producing repeatable improvements:


----------



## J7SC

...I guess it might get worse before it gets better ?
As promised, an update on the ECC-checked / NVpanel runs vs the same speeds posted before - note the 23 GB vs 24 GB in the headers. The ECC results on the right are based on the new driver. While it can go higher, these were the three speeds I posted before when asking @Sheyster for his runs with the Giga-G-OC and Asus Strix V2. VRAM temps did go up in the last two ECC runs by 2 C, so that's something...

EDIT: I was thinking about the Galax XOC...with 2x 12VHPWR connectors it may very well be an actual 1320W-cap vbios for those Galax HOF OC Lab cards...the previous gen had a (theoretical) max of 1000 W...so on cards that only have 1x 12VHPWR, it would read half of that. BTW, the Galax HOF bios shows as 'approved' on 3DM...


----------



## schoolofmonkey

mirkendargen said:


> GALAX RTX 4090 VBIOS
> 
> 
> 24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory
> 
> 
> 
> 
> www.techpowerup.com
> 
> 
> 
> 
> 
> HOF, so dual. Seems to work fine though, and my PSU backs up the power numbers so it isn't halved or something goofy.


And here I was wondering what BIOS to try on my Galax 4090 SG.
I always worry about killing something power-wise though, with only an 18×50A (900A) GPU stage and a 4×50A (200A) VRAM stage...
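Those stage ratings can be turned into a rough power ceiling: phases times per-phase current times rail voltage. A minimal sketch; the ~1.1 V core voltage is an assumption, and phase ratings are thermal maxima under ideal conditions, so treat the result as an upper bound rather than a safe continuous limit:

```python
# Rough VRM headroom estimate from phase count and per-phase rating.
# Upper bound only: real sustained capability is lower due to thermals
# and derating. The 1.1 V core voltage is an assumed typical load value.
def vrm_ceiling_watts(phases: int, amps_per_phase: float, volts: float) -> float:
    """Theoretical max power a VRM stage can deliver at a given voltage."""
    return phases * amps_per_phase * volts

print(round(vrm_ceiling_watts(18, 50, 1.1)))  # GPU core stage → 990
```

By that crude measure, even the SG's 18-phase core stage sits above the 666 W bios cap on paper, though with much less margin than the 24-phase boards.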


----------



## B1gD4ddy

DELETE


----------



## B1gD4ddy

WayWayUp said:


> what clocks do you get with +200? if its over 3k you cant really call it a trash card


2920-2940 in Borderlands 3, 4K Badass.

Maybe I should test the 666W Galax bios on my Inno3D X3 OC.


----------



## J7SC

schoolofmonkey said:


> And here I was wondering what BIOS to try on my Galax 4090 SG.
> I always worry about killing something power wise though with only a GPU Stage 18×50A (900A) and VRAM Stage 4×50A (200A)...


...what is the Power limit on the stock Galax 4090 SG ? 600W ? If you don't want to go for the Galax HoF or even the Strix V2, you might try the stock oc (2nd bios) from the Gigabyte Gaming OC (20+4 stages), or the similar one from the Asus TUF OC.


----------



## B1gD4ddy

What's the difference between Strix V1 and V2?


----------



## schoolofmonkey

J7SC said:


> ...what is the Power limit on the stock Galax 4090 SG ? 600W ? If you don't want to go for the Galax HoF or even the Strix V2, you might try the stock oc (2nd bios) from the Gigabyte Gaming OC (20+4 stages), or the similar one from the Asus TUF OC.


Stock power limit of the SG is 510W.
Galax kinda skimped here: the SG doesn't have a dual BIOS, only the one, so no fallback...


----------



## Krzych04650

Holy **** this Galax 666W BIOS is so good. It does not have any of those phantom limiters that make 600W+ cards power limit a lot sooner; it just works as you would expect.









Also it is not limited to 2400 RPM like Neptune, I can run full 3200 RPM.


----------



## Faltzer

Krzych04650 said:


> Holy **** this Galax 666W BIOS is so good. It does not have any of those phantom limiters that make 600W+ cards power limit a lot sooner, it just works as you would expect.
> View attachment 2587879
> 
> 
> Also it is not limited to 2400 RPM like Neptune, I can run full 3200 RPM.


Interesting! Is that not a 2x 16-pin card or something? I mean, can the rest of us 4090 users simply use it without any problem (on a 4090 Waterforce, for example)?


----------



## Krzych04650

Faltzer said:


> interesting! is that not a 2x16 pins card or something? I mean, can we other 4090 users simply use it without any problem? (on a 4090 waterforce for example)


Works fine for me so far, and for some others too, from what I've read in the last few pages.


----------



## newls1

Can someone be kind enough to point me to step-by-step instructions on how to flash my bios?


----------



## Krzych04650

newls1 said:


> can someone be kind enough to point me to a step by step instructions on how to flash my bios?


Download nvflash and then open CMD with administrator privileges:

cd C:\nvflash (or wherever your folder is located and whatever it is called)
nvflash64 --protectoff
nvflash64 -6 BIOS.rom (you can rename that .rom file as you want)

The flashing itself only takes like 3 seconds, but you still don't want this process interrupted in any way, so quit all apps etc. before doing it. There may be some extra safety steps that I don't take; I don't know for sure, wait for more answers.
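For anyone scripting this, the sequence above can be sketched as a small wrapper that assembles the nvflash64 invocations in order. The `--save` backup step is my addition, not part of the steps quoted above, and the file names are placeholders; nvflash64 still has to be run from an elevated prompt:

```python
# Hypothetical helper that lists the nvflash64 steps in flashing order.
# "--save" (dump the current vbios first) is an added precaution; the
# rom file names here are placeholders, not real BIOS files.
def nvflash_commands(bios_rom: str, backup_rom: str = "backup.rom") -> list[list[str]]:
    """Return the nvflash64 invocations, in order, for a BIOS flash."""
    return [
        ["nvflash64", "--save", backup_rom],  # back up the current vbios
        ["nvflash64", "--protectoff"],        # disable EEPROM write protection
        ["nvflash64", "-6", bios_rom],        # flash, confirming past the ID mismatch
    ]

for cmd in nvflash_commands("galax_666w.rom"):
    print(" ".join(cmd))
```

Keeping the backup rom means a cross-vendor flash that misbehaves can be reverted with the same tool (worst case from the iGPU, as noted earlier in the thread).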


----------



## phillyman36

How do most of you feel about Zotac 4090 cards? Yes, or pass for something else?


----------



## jootn2kx

Krzych04650 said:


> Holy **** this Galax 666W BIOS is so good. It does not have any of those phantom limiters that make 600W+ cards power limit a lot sooner, it just works as you would expect.
> View attachment 2587879
> 
> 
> Also it is not limited to 2400 RPM like Neptune, I can run full 3200 RPM.


Interesting. Have you noticed any difference or gain in gaming compared to other BIOSes?


----------



## Gogoo

Hello folks, I've been following this thread on and off and it's time to upgrade my GPU. I'm thinking of getting the MSI Liquid version as it's slim and I've seen overall good things about it. I wanted to ask about the vbios: if I flash a 600W one from the Suprim X air card, would it mess with the RPMs on the AIO? Is there any downside to doing so?


----------



## Krzych04650

Since the Galax 666W BIOS now allows a proper 600W+ limit without phantom limiters, I re-ran some of the tests from my 350W vs 450W vs 630W 30-game test post. There were 8 games that still reported a power limit with the 630W BIOS, because it wasn't really anywhere near 630W in reality, more like 560W max. Most were power limited only by a whisker, so the scores did not change much, but some went up from 560W to 620-630W, and the average performance for 600W+ went up by like 0.5%. I will post screenshots only for those 8 games here with an updated Excel table; everything else in the original post still stands. I've also added one new game, Portal RTX.


Spoiler: 350W vs 450W vs 666W


Some games did manage to cross the 600W point. Path of Exile in particular still reports a power limit at 623W draw, but the clock is a full 3000 [email protected], so it doesn't matter. It gained almost 2% over the previous BIOS. LOTRO also went up from 570W to 630W. I increased the resolution for Quake RTX to make it a bit harder; it is above 600W now as well. The remaining 6 games that reported a power limit before had insignificant changes, with 10-20W differences and 0.1-0.2% gains.


----------



## Laithan

Krzych04650 said:


> Since Galax 666W BIOS now allows for proper 600W+ limit without phantom limiters I did re-run some of the tests from my 350W vs 450W vs 630W 30 Game Test post.
> View attachment 2587888


Thanks for running all these tests. If I may ask, what app are you using to create that OSD?


----------



## StreaMRoLLeR

Asking Strix owners: how is the coil whine? The Suprim has those new HCI caps which make the coil whine unbearable. Could someone upload a basic 15-sec coil-whine video of their Strix?


----------



## newls1

Krzych04650 said:


> Download nvflash and then open CMD with administrator privileges
> 
> cd C:\nvflash (or wherever your folder is located and however your folder is called)
> nvflash64 --protectoff
> nvflash64 -6 BIOS.rom (you can rename that .rom file as you want)
> 
> The flashing itself only takes like 3 seconds, but still you don't want to have this process interrupted in any way, so quit all apps and etc before doing it. There may be some extra safety steps that I do not take, I don't know for sure, wait for more answers.


thank you so much. Have you flashed this 666 BIOS and noticed any positive effects?

**EDIT... never mind that question! I see your replies above


----------



## Faltzer

J7SC said:


> Tx. On the Giga-G-OC original bios, ECC works and no black screen on my setup, so I guess I'll keep it for now.
> 
> BTW, the latest NVidia driver seems to allow for more VRAM speed setting...only had a few test-runs but in both cases, plus one if not two speed bins for the VRAM


I have applied the bios update via their official website. It seems to cause no harm.
Wattage is still capped at 600.
I previously said that my card runs okay at +190 core / +1390 memory OC, but I have decreased it to +185 / +1350 due to artifacts.

I am afraid to use the Galax 660W BIOS, as my 12VHPWR cable has a big 600W label written on it. I happen to have a Gigabyte PSU which supplied me with an original cable, so I assume it fits perfectly with the Waterforce GPU. The stock BIOS was really capped at 500W, so the extra 100 watts is the maximum I am willing to go in order not to risk anything.


----------



## dr/owned

Alphacool block finally showed up...think whoever is running [email protected] is dead or something because they never replied.

Anyways...*it's a much better block than the Phanteks one.* You can tell they spent a lot more time thinking about the hydraulics of it...put a thicker copper base plate to allow them to do a lot of CNC work to ensure flow is adequate around the entire block. CNC machining looks flawless. Chrome instead of nickel (blue-ish tint instead of yellow) is also much more durable. Finer fin structure. A jet plate and acrylic cut to "even" the flow into the full length of the jet plate. I see a delta of 24C with 600W of furmark load. 17C with PR. Hotspot +9C. Kryonaut + Arctic TP3 pads, all assembled with the block and pcb baked at 170F.

For comparison, the Phanteks block would be at +35-40C in Furmark. When I disassembled it, it seemed like the top half of the die wasn't getting max pressure, but it wasn't blatantly bad like I've seen with too-thick thermal pads (where the paste didn't even spread to cover the entire die).


----------



## Krzych04650

Laithan said:


> Thanks for running all these tests. If I may ask, what app are you using to create that OSD?


It is RTSS Overlay Editor. You go into Setup>Plugins>OverlayEditor.dll>Setup. You can add sensors from various software and arrange things the way you want.



StreaMRoLLeR said:


> Asking for strix owners, how is the coil-whine ? Suprim have those new HCI caps which make a CW unbearable. May someone upload basic 15 sec cw videos of their strix ?


Asus and MSI are known to have disastrous coil whine. You have to go with Gigabyte or with reference boards like Inno3D/Zotac/PNY if coil whine is a concern.



phillyman36 said:


> How do most of you fell about Zotac 4090 cards? Yes or pass for something else?


They are okay in terms of cooling and looks, and have slightly customized reference boards that are known for very little coil whine compared to some others like Asus or MSI, so Zotac is pretty good. Also, it is not as in-demand as the more popular MSI/Asus/Gigabyte brands, so it is much easier to find one close to MSRP.


----------



## dk_mic

EKWB block for MSI is finally shipping for me (ordered 27th October), should be here next week.
They have silently removed the "Custom-made Dual-Slot GPU Bracket" from the product description.
Contacted support, let's see how this turns out. Can't vertically mount the original bracket and this is something I wanted to try for the first time.
Have some Arctic MX-6 and TP-3 ready, will probably just use the EKWB pads for the memory.. but then my memory only goes to +1000 

Can't wait to put it under water, I am getting really high temps under moderate loads on air.. have a suspicion the thermal paste job is really bad, just as it was on my 2080 Ti gaming x trio 


Spoiler: 2080 ti factory paste


----------



## jootn2kx

dr/owned said:


> Alphacool block finally showed up...think whoever is running [email protected] is dead or something because they never replied.
> 
> Anyways...*it's a much better block than the Phanteks one.* You can tell they spent a lot more time thinking about the hydraulics of it...put a thicker copper base plate to allow them to do a lot of CNC work to ensure flow is adequate around the entire block. CNC machining looks flawless. Chrome instead of nickel (blue-ish tint instead of yellow) is also much more durable. Finer fin structure. A jet plate and acrylic cut to "even" the flow into the full length of the jet plate. I see a delta of 24C with 600W of furmark load. 17C with PR. Hotspot +9C. Kryonaut + Arctic TP3 pads, all assembled with the block and pcb baked at 170F.
> 
> For comparison, the Phanteks block would be like +35-40C in Furmark. When I disassembled it, it seemed like the top half of the die wasn't getting max pressure but it wasn't blatantly bad like I've seen with too-thick of thermal pads (where the paste didn't even spread to cover the entire die).


Cool! Just to be sure, do you mean this Alphacool block? Eisblock GPU Kühler für RTX 4090 | Nvidia Fullsize | Grafikkarten Wasserkühler | Shop | Alphacool - the cooling company


----------



## RaMsiTo

Galax 666W BIOS, Suprim X stock clocks.


----------



## bloot

Hi, I've seen some HWiNFO and GPU-Z screenshots in this thread with a lowish 11.6V on the 16-pin voltage sensor; what would be a safe value? I bought an FE card and have seen it as low as 11.964V in HWiNFO and 12V in GPU-Z when playing Portal with RTX, but not lower, for now.
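For a rough yardstick: the usual ATX tolerance on a 12 V rail is ±5%, i.e. anything down to ~11.4 V is within spec. That's the general ATX figure, not something NVIDIA publishes specifically for the 16-pin sensor, so treat it as an assumption. A quick checker:

```python
def rail_ok(volts, nominal=12.0, tol_pct=5.0):
    """True if a rail reading sits inside the +/-5% window (11.4-12.6 V for 12 V)."""
    return abs(volts - nominal) <= nominal * tol_pct / 100

print(rail_ok(11.964))  # the lowest reading mentioned above
print(rail_ok(11.2))    # a reading this low would be worth investigating
```

By that yardstick even the 11.6 V screenshots are still inside the window, though sag under load also depends on where the sensor sits relative to the connector.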


----------



## RaMsiTo

Suprim X with the Galax 666W BIOS, GPU stock.

CP2077, 4K, RT insane, DLSS off


----------



## Blameless

Krzych04650 said:


> Also it is not limited to 2400 RPM like Neptune, I can run full 3200 RPM.


What model is your 4090?


----------



## jootn2kx

Flashed the Galax 666 BIOS on my Gainward card, and the first thing I noticed is that it gives a bit more framerate per watt. For example, I can set my power limit to 70% and it will consume about 380W, but I saw higher frame rates in most games and better scores in benchmarks compared with the stock BIOS at around the same settings/watts/voltages.

Regardless, at 380W in my tests the temperature is 2 degrees higher than [email protected], so it's pushing the card harder in some way despite the same consumption.
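One way to put numbers on "more framerate per watt" is to compare frames per watt at matched settings. The FPS figures below are placeholders for illustration, not measured results from this post:

```python
def fps_per_watt(avg_fps, avg_watts):
    """Simple efficiency metric for comparing two BIOSes at matched settings."""
    return avg_fps / avg_watts

# hypothetical matched-settings comparison at a 70% power limit (~380 W)
stock = fps_per_watt(110, 380)
galax = fps_per_watt(115, 380)
print(f"efficiency gain: {100 * (galax / stock - 1):.1f}%")
```

At a fixed power draw the efficiency ratio reduces to the FPS ratio, which is why capping both BIOSes to the same wattage is the cleanest way to compare them.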


----------



## Benni231990

Holy ****ing S.H.I.T, the Galax BIOS is the best BIOS I've ever seen. My average boost clock is 100MHz higher and now I'm in the 29K Port Royal club


----------



## Krzych04650

Blameless said:


> What model is your 4090?


Inno3D X3 OC.


----------



## Benni231990

Why can't I upload my score to 3DMark? It always says "GPU not detected" with the Galax BIOS. Why? In 3DMark I can see GPU: NVIDIA 4090









I scored 29 141 in Port Royal


Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## yzonker

Benni231990 said:


> why i cant upload my score to 3DMark it allways say GPU not detected with the galaxy bios? why? in 3DMark i can see GPU : Nvidia 4090
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 141 in Port Royal
> 
> 
> Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


I don't think the card is officially released so it's not in the 3DMark database.


----------



## Krzych04650

Benni231990 said:


> why i cant upload my score to 3DMark it allways say GPU not detected with the galaxy bios? why? in 3DMark i can see GPU : Nvidia 4090
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 141 in Port Royal
> 
> 
> Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com





yzonker said:


> I don't think the card is officially released so it's not in the 3DMark database.


Yea same for me, it says Generic VGA.


----------



## Benni231990

What bullshit is that? The BIOS is official and validated


----------



## yzonker

jootn2kx said:


> Flashed the galax 666 bios on my gainward card and the first thing I noticed it gives a bit more framerates per watt.
> For example I can set my powerlimit to 70% and it will consume about 380watt but I saw higher frame rates in most games en better scores in benchmarks compared with the stock bios around the same settings/watts and voltages.
> 
> Regardsless 380watt in my test the temperature is 2 degrees more than [email protected]
> So that means it's pushing the card more in some kind of way regardless the consumption


My guess is that it's running one or more voltages higher. That's a significant amount of performance for just better optimization alone.


----------



## Benni231990

The Galax BIOS is the best BIOS to flash.

I now have over 650 more points in Port Royal with the HOF BIOS; before, with the 600W MSI Liquid BIOS, I had 28400-28500


----------



## yzonker

I guess the Galax bios will be added, although there are already Galax HOF scores in the 3DMark database. 









I scored 31 932 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





But that's probably on an XOC bios.


----------



## J7SC

Krzych04650 said:


> Yea same for me, it says Generic VGA.


...it could also be the new ECC and 3DM version updates required as of December 10th; we'll see. As I mentioned earlier, this Galax bios could very well be the XOC bios capped at 1320 W (for 2x 12VHPWR)...with 1x 12VHPWR, it would be around 660W.
---

Re. your updated game tests w/Galax (tx!), did you notice any changes re. your *VRAM* speed, temps, fps ?


----------



## Panchovix

The Galax HOF VBIOS definitely seems to give better performance per clock compared to other VBIOSes.

Haven't tested much since here in Chile ambient temps are about 33°C and my room is like 38°C, so I can't keep temps in check at 660W on air, but even at lower limits the card seems to have a slight advantage; the effective clocks are closer to the reported clocks.


----------



## Krzych04650

J7SC said:


> Re. your updated game tests w/Galax (tx!), did you notice any changes re. your *VRAM* speed, temps, fps ?


VRAM speed not really, but I did see quite a big performance boost in some cases. For example, I couldn't really break past 28 000 in Port Royal and now I'm doing 28 800 even though seemingly nothing changed. All the previous BIOSes I've used had those phantom limiters and generally felt kind of anemic and software limited; with this one, what you dial in is what you get, with only minor fluctuations. All the effective clock and power limit weirdness is gone now.


----------



## jootn2kx

Lol what kind of witchcraft is this galax bios?  Did some further testing it's significantly better performance in all my games now compared to any other bios I tried.

Edit: Say hello to 29K score in 3dmark


----------



## Benni231990

this bios is a ****ing Beast


----------



## matique

Received my Alphacool 4090 FE block, purchased from taobao for USD$150.










I think it's quite a handsome block, a little bit Optimus-esque. I'm glad they ditched their multiple O-rings for a more symmetrical design. Channels are well milled.










There are some machining marks that can be seen, but the surface is absolutely smooth to the touch. It's chrome plated as well, very shiny. The die area is super flat too, checked with a razor blade.


The block is well packaged with everything you need to get it up and running. I forgot to photograph the backplate though. The front 1mm pads seem to be the Alphacool Apex Soft 7W/mK pads, which is pretty good. They've also provided their Subzero paste, which should work pretty well. I love that the pads are pre-cut. Everyone pre-cuts their pads; EK, why don't you do the same?

Pushed the card in Furmark with the 600W PL of the FE. As usual, it won't go the full 600W due to Vrel:









_550w load, 23K core-coolant delta, 10K core-hotspot delta._









_310w load, 13K core-coolant delta, 7K core-hotspot delta. _

It seems like with the 4000 series, 10K core-coolant deltas above 300W are a thing of the past, unless perhaps you use LM. I suspect this is due to the higher density of the newer die. It's definitely not a contact issue, as the core-hotspot deltas are excellent.









_14K core-coolant delta, 7.5K core-hotspot delta at 375w._

In game, temps are pretty well managed. It's not that much cooler than air cooling, but at least now if I wanna bench something I don't need to run the fans on max to keep it below 60C. All in all, I'm quite pleased with the Alphacool block. If the Heatkiller FE block is more aesthetically pleasing I'd defo still grab that, but I'd be happy to keep the Alphacool block. Good job, Alphacool.
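As a side note, the three load points reported in this post work out to a fairly constant core-to-coolant thermal resistance, which fits the die-density theory. A quick sanity check using only the deltas and wattages quoted above:

```python
def k_per_w(delta_k, load_w):
    """Core-to-coolant thermal resistance estimate in K/W."""
    return delta_k / load_w

# (delta in K, load in W) pairs taken from the post above
for delta, load in [(23, 550), (14, 375), (13, 310)]:
    print(f"{load} W -> {k_per_w(delta, load):.3f} K/W")
```

All three land around 0.04 K/W, i.e. the delta scales almost linearly with load, which is what you'd expect if contact and flow are fine and the bottleneck is conduction through the die itself.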


----------



## Benni231990

With this BIOS, the Top 100 in 3DMark will be 29K+ only


----------



## tryout1

Panchovix said:


> I tried and no luck, and probably now my VRAM is also degraded haha (have been doing AI stuff like a lot every day for more than 1 month), at +1100 Vulkan memtest crashes when before it was stable  (Though now I'm using the Strix VBIOS instead of the TUF VBIOS, but I doubt they have different memory timings)
> On games seems fine, but welp, prob gonna have to see if someone here in Chile knows to repair GDDR6X VRAM (I don't have Warranty anymore)


I'm personally not 100% sure that Vulkan memtest actually works flawlessly. When I heard about it, I tried the test at completely stock settings and it errored out after 10 mins; then I ran it again with +1000 and it was fine for about 20 mins until I quit. The difference was that when I was running the Vulkan test at stock I was watching some Twitch on the side, so maybe that was the culprit.


----------



## WayWayUp

Can't flash another BIOS with the FE, correct?


----------



## Panchovix

tryout1 said:


> I'm personally not 100% sure if that vulkan memtest actually works flawlessly, when i heard about it i tried the test at complete stock settings and it errored out after 10 mins, then i did the test again but with +1000 and it was fine for about 20 mins till i quit. The difference was that when i was running the vulkan test at stock i was watching some twitch on the side so maybe that was the culprit.


Hmm I see, yeah I will need more testing I guess. Yesterday I was playing MW2 and CP2077 at 4K for some hours with VRAM at +1100 and no issues, but as soon as I try the Vulkan test it crashes my PC lol




WayWayUp said:


> cant flash another bios with the FE correct?


Sadly not, same as Ampere I think.


----------



## newls1

Little update... I flashed to the HOF 666 BIOS and can confirm the same MSI AB settings from my prior OC are no longer stable, as it does more clock speed per (+ offset) you set. Using my stock Gigabyte OC BIOS my OC was
+240core
+1450 Mem
1.1v (obviously)
this gave me 3045Mhz (3040Mhz Actual speed) during benchmark

3 back to back Port Royal bench stable on this 666 HOF Bios is:
+220 Core
+1450 Mem
1.1v
this gave me 3075Mhz (3060Mhz actual) speed during benchmark

Score increased 300 points. 

Here is a HWInfo64 shot of temps and voltages and Port Royal Score


----------



## KedarWolf

The HOF BIOS is incredible!!

This is my best run before, zero artefacts, totally legit run.









I scored 29 200 in Port Royal


AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





Just ran the HOF, no artefacts, totally legit run.

29 727









I scored 29 727 in Port Royal


AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## KedarWolf

KedarWolf said:


> The HOF BIOS is incredible!!
> 
> This is my best run before, zero artefacts, totally legit run.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 200 in Port Royal
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Just ran the HOF, no artefacts, totally legit run.
> 
> 29 727
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 727 in Port Royal
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> View attachment 2587945


Raised my core to 3150 and kept memory at +1648, as high as I can go with no artefacts.


----------



## newls1

I get an instant crash @ 3090MHz. You must have a great chip to be able to do 3150


----------



## LtMatt

KedarWolf said:


> Raised my core to 3150, and kept memory at 1648, as high as I can go no artefacts.
> 
> View attachment 2587947


The results are hidden. Can you share a screenshot showing the average core clock in the run?

This is the best I've managed so far on stock cooling without loads of artifacts.


----------



## tryout1

Hehe, laughs in 2955MHz max before crash. Everything over 3000MHz is great already; for benchmarking going further is preem, but so far it seems like it stops scaling in games after about 2900MHz.


newls1 said:


> I get instant crash @ 3090Mhz . You must have a great chip to be able to do 3150


----------



## Nizzen

KedarWolf said:


> Raised my core to 3150, and kept memory at 1648, as high as I can go no artefacts.
> 
> View attachment 2587947


----------



## LtMatt

Nizzen said:


> View attachment 2587954


Even the screenshot is showing artifacts.


----------



## Nizzen

LtMatt said:


> Even the screenshot is showing artifacts.


HDR....

Port Royal is a joke, even with 1000w and 666w bios


----------



## Miao

hi guys, 4090FE just delivered here.
I have a 13900K/Z790 EXT rig to build with, but I'm waiting for the blocks (Magnitude & Vector); atm I put her in a 9900K build, so no benches for now.
BTW, does anyone know what's new in BIOS 95.02.20.00.03? It's the only one different from the .01 of the FE; my brand new card came with .01 too, so...


----------



## J7SC

LtMatt said:


> The results are hidden. Can you share a screenshot showing the average core clock in the run?
> 
> This is the best I've managed so far on stock cooling without loads of artifacts.
> View attachment 2587950


To show in the HoF, you now need to have ECC checked in NVP and also have the latest version of the 3DM app + SystemInfo.

@Nizzen - you beat my unsubbed 30337 '''score''' in PR


----------



## LtMatt

J7SC said:


> To show in HoF, you now need to have the EEC checked in NVP and also have the latest version of 3DM app + SystemInfo.
> 
> @Nizzen - you beat my unsubbed 30337 '''score''' in PR


Thanks, but I didn't realise this. God damn, I wasted hours on those scores for nothing lol.



Nizzen said:


> HDR....
> 
> Port Royal is a joke, even with 1000w and 666w bios


Any reason for enabling HDR? Does 3DMark even support it?


----------



## Krzych04650

Looks like Galax 666W BIOS can still get power limited earlier than 666W.








1 next to mV indicates power limit as reported by MSI Afterburner.

This game is just so well optimized, basically a flat frametime graph at all times and pushing the GPU hard on power draw.

Path of Exile does a similar thing, power limiting at ~620W








This is enough for staying at 1100mV at all times though.


----------



## Laithan

tryout1 said:


> I'm personally not 100% sure if that vulkan memtest actually works flawlessly, when i heard about it i tried the test at complete stock settings and it errored out after 10 mins, then i did the test again but with +1000 and it was fine for about 20 mins till i quit. The difference was that when i was running the vulkan test at stock i was watching some twitch on the side so maybe that was the culprit.


IMO the quickest way myself and others have found to identify memory stability is with Heaven 4.0 (max settings). It usually shows in the first scene.


----------



## newls1

J7SC said:


> To show in HoF, you now need to have the EEC checked in NVP and also have the latest version of 3DM app + SystemInfo.
> 
> @Nizzen - you beat my unsubbed 30337 '''score''' in PR


when did that start?


----------



## tryout1

Laithan said:


> IMO the quickest way myself and others have found to identify memory stability is with Heaven 4.0 (max settings). It usually shows in the first scene.


Ah, well, I have no memory OC issues; it was more about his question. Personally I always quick-check with Superposition 8K as it's super sensitive: in my case +1600 makes my PC reboot instantly the second it finishes loading. +1500 is the absolute max I can run without any issues, but with the undervolt I use atm (0.925V ~2625MHz) I didn't see any perf gains over +1000MHz, and it's almost stock performance while drawing like 100-150W less.
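For anyone curious why an undervolt saves that much, the first-order CMOS rule of thumb P ∝ V²·f gets you in the right ballpark. Note the 450 W / 1.05 V / 2750 MHz baseline below is an assumed stock-ish operating point for illustration, not a figure from the post:

```python
def scaled_power(p0, v0, f0, v1, f1):
    """First-order CMOS dynamic-power estimate: P scales with V^2 * f.

    A rough rule of thumb only; it ignores static leakage and VRM losses.
    """
    return p0 * (v1 / v0) ** 2 * (f1 / f0)

# assumed stock-ish 450 W @ 1.05 V / 2750 MHz vs the 0.925 V / 2625 MHz undervolt
est = scaled_power(450, 1.05, 2750, 0.925, 2625)
print(round(est))  # roughly 120 W below the assumed 450 W baseline
```

The voltage term dominates: a ~12% drop in voltage alone cuts dynamic power ~22%, which is why mild undervolts shed so many watts for so little clock speed, consistent with the 100-150 W savings reported above.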


----------



## newls1

newls1 said:


> Little update.. I flashed to the HOF 666 BIOS and can confirm the same MSI AB settings for my prior OC are no longer stable as it does more clock speed per (+ offset) you set. Using my Stock Gigabyte OC bios my OC was
> +240core
> +1450 Mem
> 1.1v (obviously)
> this gave me 3045Mhz (3040Mhz Actual speed) during benchmark
> 
> 3 back to back Port Royal bench stable on this 666 HOF Bios is:
> +220 Core
> +1450 Mem
> 1.1v
> this gave me 3075Mhz (3060Mhz actual) speed during benchmark
> 
> Score increased 300 points.
> 
> Here is a HWInfo64 shot of temps and voltages and Port Royal Score


I can't get FC6 to run @ 3160MHz anymore... @ +220 I get 3045 and at +225 it goes to 3090! What can I do to get 3060 back?


----------



## J7SC

newls1 said:


> when did that start?


...yesterday I think, depending on which time zone of the planet you're in.

------
Core clocks aren't really the challenge for my card (below is with the Giga-G-OC stock vbios; I also posted 3210 MHz / light load before), but keeping VRAM warmish is. I pick up another few speed bins on VRAM if it reaches 48-50 C.

The new driver and the Galax vbios should help, if anything even just by raising overall temps a bit. I could of course order a backplate heater from ElmorLabs for under $30, but I'm not that committed to benching anymore - 'they say' that there are other uses for a 4090  ...still, my not-slow (around 2850 MHz) 6900XT scores about 1/3rd in Speedway of what the 4090 does🥴


----------



## KedarWolf

LtMatt said:


> The results are hidden. Can you share a screenshot showing the average core clock in the run?
> 
> This is the best I've managed so far on stock cooling without loads of artifacts.
> View attachment 2587950


----------



## yzonker

J7SC said:


> To show in HoF, you now need to have the EEC checked in NVP and also have the latest version of 3DM app + SystemInfo.
> 
> @Nizzen - you beat my unsubbed 30337 '''score''' in PR


You're losing me here. This has something to do with the Galax bios, or any run will not show up without ECC now? If so, then all of the others need to be deleted, otherwise it's pointless. I must be missing something?


----------



## Papusan

Hi. I have the MSI 4090 Suprim X Liquid. Which of the 600W unverified OC BIOS ROMs should I select from *TechPowerUp*? The 600W VBIOS version 95.02.18.00.CD is listed with several different .rom files. Which one should I download?

Any drawback with 95.02.18.00.CD? Or is there a better OC vBIOS for the 4090 Suprim X Liquid?

And can the latest NVFlash from the first page be used, or do I need a custom NVFlash version to flash an unverified vBIOS? I expect the vBIOS is from another 4090 card, so no need for a special procedure for the flashing.

Thanks.

Edit. The different .rom for 600w Version: 95.02.18.00.CD









And these two had an extra note below.

251808.rom
MSI, Suprim Liquid X, NVIDIA GeForce RTX 4090, 600w bios with nvidia fw patch 1.2 (2235 / 1313).

251573.rom
MSI, Suprim Liquid X, NVIDIA GeForce RTX 4090, This is a 600w that my card shipped with, I didn't realize I had a different bios till I watched online reviews of my card that only had 520 watt bios. (2235 / 1313)


----------



## J7SC

yzonker said:


> You're losing me here. This has something to do with the Galax bios, or any run will not show up without ECC now? If so, then all of the others need to be deleted, otherwise it's pointless. I must be missing something?


...this has been on their site for weeks...











Papusan said:


> Hi. I have the Msi 4090 Suprim X Liquid. What of the 600w unverified OC bios rom to select from *Techpowerup*? The 600w VBIOS Version: 95.02.18.00.CD is listed with several different .rom files. Which one to download?
> 
> Any drawback with 95.02.18.00.CD? Or is there an better OC vbios for 4090 Suprim X Liquid?
> 
> And can latest Nvflash from first page be used or do I need a custom Nvflash version to flash an unverified vbios? I expect the vbios is from another 4090 card so no need for a special procedure for the flashing.
> 
> Thanks.


@Papusan ...check > here


----------



## yzonker

J7SC said:


> ...this has been on their site for weeks...
> View attachment 2587994
> 
> 
> 
> 
> @Papusan ...check > here


Huh, that's been there for more than a year?


----------



## mirkendargen

There's no way they'd force ECC to be enabled for HOF submissions without clearing the existing HOF. There would just never be any new entries until the next gen of cards.


----------



## yzonker

I submitted a ticket asking about the "Generic VGA" label we're getting, whether the Galax HOF is officially in their database, and if/when it will be fixed. We'll see if I get anything...


----------



## J7SC

yzonker said:


> Huh, that's been there for more than a year?


...might be a date mix-up with this:











mirkendargen said:


> There's no way they'd force ECC to be enabled for HOF submissions without clearing the existing HOF. There would just never be any new entries until the next gen of cards.


...happened several times before...


----------



## StreaMRoLLeR

Papusan said:


> Hi. I have the Msi 4090 Suprim X Liquid. What of the 600w unverified OC bios rom to select from *Techpowerup*? The 600w VBIOS Version: 95.02.18.00.CD is listed with several different .rom files. Which one to download?
> 
> Any drawback with 95.02.18.00.CD? Or is there an better OC vbios for 4090 Suprim X Liquid?
> 
> And can latest Nvflash from first page be used or do I need a custom Nvflash version to flash an unverified vbios? I expect the vbios is from another 4090 card so no need for a special procedure for the flashing.
> 
> Thanks.
> 
> Edit. The different .rom for 600w Version: 95.02.18.00.CD
> View attachment 2587996
> 
> 
> And these two had an extra note below.
> 
> 251808.rom
> MSI, Suprim Liquid X, NVIDIA GeForce RTX 4090, 600w bios iwth nvidia fw patch 1.2 (2235 / 1313).
> 
> 251573.rom
> MSI, Suprim Liquid X, NVIDIA GeForce RTX 4090, This is a 600w that my card shipped with, I didn't realize I had a different bios till I watched online reviews of my card that only had 520 watt bios. (2235 / 1313)


How is the coil whine on your card, brother? Could you upload a short video sample?


----------



## newls1

Damn, FINALLY! Thanks to the 666 BIOS I finally reached 29K... Is this a good score for a 3060MHz GPU, +1500 mem, 13900K @ 5.85?


----------



## Krzych04650

newls1 said:


> Damn FINALLY! Thanks to the 666 BIOS, I finally reached 29k.. Is this a good score For 3060Mhz GPU, +1500mem, 13900k @ 5.85


Looks about right, I am getting 28 880 with 3030 MHz and +1350 mem.


----------



## J7SC

mirkendargen said:


> There's no way they'd force ECC to be enabled for HOF submissions without clearing the existing HOF. There would just never be any new entries until the next gen of cards.





yzonker said:


> I submitted a ticket asking about the "Generic VGA" label we're getting, whether the Galax HOF is officially in their database, and if/when it will be fixed. We'll see if I get anything...


As discussed before in this thread, it goes back to the HWBot complaints (see excerpt below, updated to Dec.8th, 2022). All that is a bit confusing now as to what will happen where...pls let us know what they (3DM support) say.


----------



## bmagnien

J7SC said:


> ...might be a date mix-up with this:
> View attachment 2587997
> 
> 
> 
> 
> ...happened several times before...


@J7SC @yzonker The outcome of this update is below: just a notation on the score details page noting whether ECC was on or off. AFAICT this is the extent of the ECC integration; it's not being used to filter or qualify results. It is not shown on scores submitted prior to this SystemInfo update though... so it's unclear whether they'd be able to check retroactively should they wish to implement that requirement.


----------



## coelacanth

phillyman36 said:


> How do most of you fell about Zotac 4090 cards? Yes or pass for something else?


I have the Trinity OC. No coil whine and fans max out around 30% so it's very quiet. 3 year warranty with product registration.


----------



## J7SC

bmagnien said:


> @J7SC @yzonker The outcome from this update is the below. Just a notatioon on the score details page noting whether ECC was on or off. AFAICT - this is the extent of ECC integration, its not being used to filter or qualify results. It is not showing on scores submitted prior to this sysinfo update though...so it's unclear if they'd be able to retroactively check should they wish to implement that requirement.
> 
> View attachment 2588021


...likely then that HWBot will require ECC proof to be on; just not sure what 3DM will require for HoF, if so /when...


----------



## newls1

newls1 said:


> I cant get FC6 to run @ 3160MHz anymore... @ +220 i get 3045 and +225 it goes to 3090! What can I do to get 3060 back?


Update on this... I've always done increments of +5 for core clock increases in MSI AB, but to see if I could get back the 3060MHz speed I had on the Giga OC BIOS (stock), I noted the following behavior with this HOF 666 BIOS:
+220 got me 3045 and +225 got me 3090, so I did +222 and now I get 3075. So far I'm happy with that. Why was I getting 3060 @ +240 on the old BIOS, while this new BIOS is faster at +222? And has anyone ever used such odd numbers for their core OC in MSI AB? I thought it was a standard rule to use numbers like +110, +115, +200 etc... not 222!


----------



## mirkendargen

newls1 said:


> update on this... I've always done core clock increases in increments of +5 in MSI AB, but to see if I could get back the 3060 MHz I had on the stock Giga OC bios, I tried the following with this HOF 666 bios:
> +220 got me 3045, +225 got me 3090, so I did +222 and now I get 3075. So far I'm happy with that. Why was I getting 3060 @ +240 on the old bios, but this new bios is faster at +222? And has anyone ever used such odd numbers for their core OC in MSI AB? I thought it was standard to use numbers like +110, +115, +200 etc... not +222!


If you don't use a multiple of 15, it just rounds to the nearest multiple of 15. No point changing the core in anything other than increments of 15.
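For what it's worth, that snapping can be sketched in a few lines (nearest-bin rounding is my assumption here; the driver may quantize differently, and `snap_offset` is just a hypothetical helper, not anything from Afterburner itself):

```python
def snap_offset(offset_mhz: int, step: int = 15) -> int:
    """Quantize a requested core-clock offset to a 15 MHz bin."""
    return round(offset_mhz / step) * step

# Under this assumption, +222 is not a real bin and lands on +225;
# only multiples of 15 are distinct requests.
print(snap_offset(222))  # 225
print(snap_offset(210))  # 210
```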


----------



## motivman

The GALAX bios is amazing on my 4090 Gaming OC; however, my memory overclock has been negatively affected. I can no longer run +1500 with low temps. The good news though is that my effective clock is way higher now compared to before.


----------



## 8472

StreaMRoLLeR said:


> Asking for Strix owners: how is the coil whine? The Suprim has those new HCI caps which make the coil whine unbearable. Could someone upload a basic 15 sec coil whine video of their Strix?


FWIW, I'm on a similar mission to find a 4090 without coil whine. Cyberpunk and MW2 seem to make coil whine the loudest. 

My issue is aesthetics and noise. I don't particularly care for the look of the Gigabyte OC but I may not have a choice. I prefer the card to be inaudible on air, and I know that minus the coil whine Asus and MSI can do that, but I'm not sure about Gigabyte. I've heard that PNY, Zotac, etc. have relatively loud fans as well.


----------



## Sheyster

Krzych04650 said:


> Also it is not limited to 2400 RPM like Neptune, I can run full 3200 RPM.


The fans were spinning a hair under 3300 RPM (~3290-3298) at 100% on my GB-G-OC card. I used it for a few hours of MW2 gaming (default settings), no issues at all.


----------



## dr/owned

With the Galax bios I could get about 750W out of it before it hit the Pwr limit. HWinfo says about 790W peak total.

It seems fine on the Trio X...all 3 DP work (unlike the Strix that kills a DP)


----------



## yzonker

The generic VGA issue appears common for new cards. 



3dmark generic vga site:steamcommunity.com - Google Search


----------



## Panchovix

Sheyster said:


> The fans were spinning a hair under 3300 RPM (~3290-3298) at 100% on my GB-G-OC card. I used it for a few hours of MW2 gaming (default settings), no issues at all.


Yep, I find the 3300 RPM a nice thing tbh; my TUF's max was 2900-3000 RPM I think.

The fans would probably last "less" running above 3000 RPM I guess, but not an issue IMO.


----------



## energie80

can you guys post some mw2 benchies at 1440p competitive settings?
thanks


----------



## MrB123

energie80 said:


> can you guys post some mw2 benchies at 1440p competitive settings?
> thanks


sure but what is the "competitive settings"?


----------



## Nizzen

MrB123 said:


> sure but what is the "competitive settings"?


Everything low. It means max fps at 1440p.


----------



## Panchovix

Nizzen said:


> Everything low. Max fps in 1440p it means.


Is there a CPU that doesn't bottleneck a 4090 with everything on low?


----------



## J7SC

yzonker said:


> The generic VGA issue appears common for new cards.
> 
> 
> 
> 3dmark generic vga site:steamcommunity.com - Google Search


...good thing then that I know someone who hoarded S3 ViRGE cards 
---
I'll have to get a support ticket from 3DM as I cannot update to SystemInfo 5.55 on my Win 10 Pro machine w/o getting into an endless loop (it worked fine on another machine running Win 11 Pro)...this likely happened when I reverted back to Win 10 Pro from 11 Pro on the system w/ the 4090.


----------



## Panchovix

Man, this HoF VBIOS surely is something; now I can do 3030-3015 MHz on everything without artifacts, even WZ2, which was very sensitive above 2985 MHz for me before.

And effective clocks are way, way higher, about 3002Mhz at 3015 reported clocks, and 3012Mhz at 3030Mhz reported lol



Spoiler: Warzone 2 pics, 1440p render at 1.5x (4K)







































Temps (ambient temp is about 30°C)


----------



## motivman

Panchovix said:


> Man, this HoF VBIOS surely is something; now I can do 3030-3015 MHz on everything without artifacts, even WZ2, which was very sensitive above 2985 MHz for me before.
> 
> And effective clocks are way, way higher, about 3002Mhz at 3015 reported clocks, and 3012Mhz at 3030Mhz reported lol
> 
> 
> 
> Spoiler: Warzone 2 pics, 1440p render at 1.5x (4K)
> 
> 
> 
> 
> View attachment 2588044
> 
> View attachment 2588046
> 
> View attachment 2588047
> 
> View attachment 2588045
> 
> 
> 
> 
> Temps (ambient temp is about 30°C)
> View attachment 2588048


Too bad it negatively affects memory overclock SIGNIFICANTLY, lol.


----------



## Panchovix

motivman said:


> Too bad it negatively affects memory overclock SIGNIFICANTLY, lol.


Really? I'm at the same +1000 that I use every day without issues; haven't tested +1100 MHz like when I do benchmarks though


----------



## schoolofmonkey

So, for me, I can't get my overclock any higher (still a max of 3 GHz on the core), but the power limit isn't being hit and the card is maintaining clocks better, leading to one of my highest results at the same core/vram speeds.
Instead of hitting the cap of 510W it'll hit 550W (slider at 100%).

Oh, I did get a Steam achievement: "Mystery Machine"


----------



## J7SC

...I don't scare easily, but this Galax vbios...

Actually, it is just what the doctor ordered...I haven't pushed clocks or PL 'all the way' yet but more than 660 W at 115% out of 121% is enough . I used OCCT at 115% to generate the max wattskies you see below. Ambient for this was 25 C, and for 'regular' testing, I used Superposition 8K. VRAM definitely picking up some extra warmth, which is what I wanted.

VRAM speed setting is the same as before. The only thing I noticed is that this is a bit of a nervous nelly at idle on clocks and watts compared to stock. Also, I don't think you want to bench this on air for extended periods of time.


----------



## bigfootnz

Krzych04650 said:


> Inno3D X3 OC.


You are pushing past 450W without any problems with this Galax BIOS?

I'm asking this as I also have a 450W card.


----------



## schoolofmonkey

bigfootnz said:


> You are pushing your 450W without any problems with this Galax BIOS?


For me I just flashed back to stock; idk, something felt wrong seeing that the max on the Galax 4090 SG is 510W yet it was doing 550W on my card..
I'm sure it's fine, but for the sake of 109 extra points I don't see the point, for me that is.
Like I said, I don't get any extra out of my GPU core; it's the same max OC I can get with the stock BIOS.
But the voltages are higher, which leads to not hitting the voltage cap, equaling more stable higher clocks, but that ends up as more heat. Probably better for WC cards; I'm still air cooled.


----------



## yzonker

Fixed, 









I scored 29 242 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## energie80

MrB123 said:


> sure but what is the "competitive settings"?


I know it’s not the smartest move with a 4090, but I’m just curious.
That’s how competitive FPS games are played


----------



## Pepillo

jootn2kx said:


> Flashed the Galax 666 bios on my Gainward card and the first thing I noticed is that it gives a bit more framerate per watt.
> For example I can set my power limit to 70% and it will consume about 380W, but I saw higher frame rates in most games and better scores in benchmarks compared with the stock bios at around the same settings/watts and voltages.
> 
> Regardless, at 380W in my test the temperature is 2 degrees more than stock at 380W.
> So that means it's pushing the card more in some way, regardless of the consumption


Gainward Phantom model?

THX


----------



## J7SC

schoolofmonkey said:


> For me I just flashed back to stock; idk, something felt wrong seeing that the max on the Galax 4090 SG is 510W yet it was doing 550W on my card..
> I'm sure it's fine, but for the sake of 109 extra points I don't see the point, for me that is.
> Like I said, I don't get any extra out of my GPU core; it's the same max OC I can get with the stock BIOS.
> But the voltages are higher, which leads to not hitting the voltage cap, equaling more stable higher clocks, but that ends up as more heat. Probably better for WC cards; I'm still air cooled.


That's probably the right thing to do. The 4090's performance is so far out there already that pushing the card for gaming really doesn't have a risk-reward ratio that makes sense. For future reference, getting a card with a dual vbios allows for convenience and some experimentation (and an extra degree of safety). I also prefer custom water-cooling anyhow, if only for the lack of noise. I did some light test-benching and noticed that in Time Spy Extreme, HWInfo reported something like 620W w/ the Galax, and that is before transient spikes, which can elude some monitoring software... For gaming (I game exclusively at 4K), I'll continue to use the stock vbios, which is no slouch anyway: as of this morning, results with the stock bios already gave me plenty...









...that said, I have already broken every single score I had just with basic setup runs. The only setup run w/ the Galax I've subbed so far is below (and not even my best PR... ) as there's always 'next week', another vbios and more drivers... I am sure a lot of high scores will change in a jiffy









I scored 29 298 in Port Royal


AMD Ryzen 9 5950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10




www.3dmark.com




For general 'fyi', per below, the stock Giga vbios still seems ever so slightly faster at the same VRAM speed, but the differences are very minor...


----------



## jootn2kx

Pepillo said:


> ¿Gainward Phantom model?
> 
> THX


Phantom non GS model


----------



## Nizzen

Panchovix said:


> Is there a CPU that doesn't bottlenecks with everything on low on a 4090?


It isn't just the CPU, but the memory too. Memory performance is everything in this game. Even with 8400c34 or 4300c14 you are still bottlenecked, and it still scales with faster memory.
Same with BF 2042


----------



## dk_mic

J7SC said:


> ... The only thing I noticed is that this is a bit of a nervous nelly at idle on clocks and watts compared to stock...


Can you elaborate on what you mean regarding the stock behaviour? I observe the lowest idle GPU Total Power with this Galax BIOS (might be the newest NVIDIA driver though): around 28W w/ dual displays, one at 144Hz.

Another thing:
before you put too much weight into memtest_vulkan bandwidth numbers:
Here is quote from the author:


> Displaying memory bandwidth is more a debug feature. The tool uses a lot of mem bandwidth, but it does NOT try achieve a maximum. The memory access pattern is 'randomly-iteratively-designed' a mix of linear and random accesses that should be best for detecting errors, not for checking max memory bandwidth. So bcrypt or any other crypto-based algo is not used in testing or application logic, maybe it is imported by some of the compiled-in libraries. Tool is targeted as "stability check tool", not as "memory bandwidth estimator".


----------



## Blameless

dk_mic said:


> before you put too much weight into memtest_vulkan bandwidth numbers:
> Here is quote from the author:


In my experience, memtest_vulkan's bandwidth figures can be extremely consistent; if there is an appreciable variance, under otherwise identical testing conditions, it's probably indicative of something. The bandwidth figures can reveal errors that ECC/EDC otherwise correct, differences in effective clock speed, or timing table variations.

It doesn't need to try for maximum bandwidth, it just needs to be consistent.
_That said_, the difference in memory size tested is far more significant than the difference in reported bandwidth in J7SC's test. A 1.1GiB/s variance (a tenth of a percent) in bandwidth when very near 1TiB/s is not meaningful.
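For scale, that variance works out like this (a quick sketch treating 1 TiB/s as 1024 GiB/s):

```python
variance_gib_s = 1.1   # observed swing between runs, in GiB/s
total_gib_s = 1024.0   # ~1 TiB/s expressed in GiB/s

# Fractional swing as a percentage of total bandwidth
pct = variance_gib_s / total_gib_s * 100
print(f"{pct:.2f}%")   # 0.11%, i.e. about a tenth of a percent
```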


----------



## dk_mic

galax 666W bios seems to be approved in leaderboards now


----------



## yzonker

dk_mic said:


> galax 666W bios seems to be approved in leaderboards now


Yea someone from their support staff emailed me in the middle of the night indicating they added the HOF card. Even my previous runs were valid at that point.


----------



## J7SC

dk_mic said:


> Can you elaborate what you mean regarding the stock behaviour? I can observe lowest idle GPU Total Power with this Galax BIOS (might be the newest NVIDIA driver though)
> (around 28W w/ dual displays, one at 144Hz)
> 
> Another thing:
> before you put too much weight into memtest_vulkan bandwidth numbers:
> Here is quote from the author:





Blameless said:


> In my experience, memtest_vulkan's bandwidth figures can be extremely consistent; if there is an appreciable variance, under otherwise identical testing conditions, it's probably indicative of something. The bandwidth figures can reveal errors that ECC/EDC otherwise correct, differences in effective clock speed, or timing table variations.
> 
> It doesn't need to try for maximum bandwidth, it just needs to be consistent.
> 
> _That said_, the difference in memory size tested is far more significant than the difference in reported bandwidth in J7SC's test. A 1.1GiB/s variance (a tenth of a percent) in bandwidth when very near 1TiB/s is not meaningful.


...on 'stock behaviour', I was referring to the much faster-paced voltage and watt changes in a given interval observed in HWInfo's 'current column' during idle when using 'optimal performance' in NVpanel (so downclocking), rather than 'prefer maximum performance'.

As to memtest_vulkan, I had read the notes...
@Blameless already pointed out the major advantages, though I had mentioned that the observed 'differences are very minor' re. bandwidth between the two vbios in my earlier post. While memtest_vulkan is not the only VRAM tool I use, I like its consistency and even the easier-to-share output tables. By now, I have built up a mini database of it. Finally, it stresses VRAM quite heavily judging by GPU-Z readings on memory bus load, board power draw and the VRAM temps (which incidentally are about 1.5C to 2C higher w/ the Galax).

@yzonker ...re. 3DM support staff & time zones, the last time I had direct contact with them, it was with someone in Finland.


----------



## Betroz

Nizzen said:


> It isn't just the CPU, but the memory too. Memory performance is everything in this game. Even with 8400c34 or 4300c14 you are still bottlenecked, and it still scales with faster memory.
> Same with BF 2042


Did you test that at 3440x1440 or 4K? (BF2042 Low settings)
I know my 10900KF with tweaked memory is a bottleneck even at 3840x1600 Low in BF2042, but I wonder how much more CPU and RAM performance I really need to max out my 4090 in this game.

Edit : If a 13900K and DDR5 8400C34 would still hold me back (at 3840x1600 Low), then why would I bother upgrading now... Maybe a Ryzen 7000 3D or Intel next gen would be worth the wait?


----------



## Nizzen

Betroz said:


> Did you test that at 3440x1440 or 4K? (BF2042 Low settings)
> I know my 10900KF with tweaked memory is a bottleneck even at 3840x1600 Low in BF2042, but I wonder how much more CPU and RAM performance I really need to max out my 4090 in this game.


Playing at 3440x1440 only now. Using max settings at that resolution, and I'm GPU-bound most of the time. Around 170-230 fps. Perfect range for G-Sync at 175Hz. Very smooth gameplay.

With low settings, a 13900K @ 5800 all-core and 8400c34 is pushing 250-300++ fps


----------



## Betroz

Nizzen said:


> Playing at 3440x1440 only now. Using max settings at that resolution, and I'm GPU-bound most of the time. Around 170-230 fps. Perfect range for G-Sync at 175Hz. Very smooth gameplay.


Well, G-Sync is off over 175 for you, so maybe no point in getting 200+ fps. I game at a higher resolution, 3840x1600 (160 Hz), so Ultra settings are a bit too much even for a 4090 card.


----------



## dk_mic

J7SC said:


> ...on 'stock behaviour', I was referring to the much faster-paced voltage and watt changes in a given interval observed in HWInfo's 'current column' during idle when using 'optimal performance' in NVpanel (so downclocking), rather than 'prefer maximum performance'.
> 
> As to memtest_vulkan, I had read the notes...
> @Blameless already pointed out the major advantages, though I had mentioned that the observed 'differences are very minor' re. bandwidth between the two vbios in my earlier post. While memtest_vulkan is not the only VRAM tool I use, I like its consistency and even the easier-to-share output tables. By now, I have built up a mini database of it. Finally, it stresses VRAM quite heavily judging by GPU-Z readings on memory bus load, board power draw and the VRAM temps (which incidentally are about 1.5C to 2C higher w/ the Galax).


I am also using optimal performance and a 0.875 V @ 2535 MHz undervolt for gaming, and I think it's quite 'tame' at the desktop, sitting usually at 210 MHz. But I didn't pay much attention to the behaviour before.

And yes, memtest_vulkan is very consistent; I did some tests and neither temp nor core frequency seemed to change the bandwidth. I just thought its method might involve some random seed or so, leading to some bigger variance, but it does not. 

I can also confirm that there are differences in maximum stable memory frequency between the Galax and other bioses. I am erroring even at +1000 with the Galax 
Anyway, it benches considerably better in Port Royal, but I couldn't beat my previous TSE and FSU scores.

Gonna do some testing on how it performs with daily settings when I have installed my waterblock. I game at 3840x1600 @ 140 Hz, and I literally can't tell the difference between 0.875 V @ 2535 MHz and a full 3+ GHz overclock when not looking at RTSS.

I also hope mounting the waterblock will do something for my VRAM, but I think that is just bad silicon.


----------



## phillyman36

Delete please. I was posting that Corsair had their 12VHPWR cable in stock. I grabbed one, but they are now sold out.


----------



## jootn2kx

Confirmed, it indeed works now with the Galax 666 bios!
NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 7 5800X3D,Gigabyte Technology Co., Ltd. B550M S2H (3dmark.com)


----------



## Krzych04650

bigfootnz said:


> You are pushing your 450W without any problems with this Galax BIOS?
> 
> I’m asking this as I’ve also 450W.


I am. Those limits are entirely arbitrary, and the 4090 chips themselves are the same for every AIB, so there are no concerns in that regard. I am not much of an expert on VRMs, but the 620-630W max power draw that you can see in some games in my testing is total board power, not just GPU core power; the actual GPU core power in those situations is only around 510W, so at 1.1V that would be around 465A, and the core VRM is rated for 770A (memory is separate). It is probably not that simple but still  Also, this kind of power draw is a very rare edge case; most games do not come even close to 600W, let alone more.

This is still mostly for fun, as according to my testing it doesn't really make sense to go above 450W in 99% of cases, but seeing this kind of card just maintain full clock without hitting any power limits and seemingly not caring about temperatures is great to see and play with.

Also, this 666W BIOS is not some kind of XOC BIOS or anything like that; it is still a normal BIOS that maintains all the normal functionality and has a range from 150W to 666W, so you can use it however you want, and it is just faster per clock or per watt vs other BIOSes, so there is no point in flashing back to the stock BIOS or whatever. You can just create a profile with 83% PL after you are done playing with it and run at 450W if you want. Or even 65% PL for 350W.
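As a quick sanity check of the I = P / V math above (same numbers as the post; real VRM loading is more complicated than this, so treat it as a sketch):

```python
def core_current_amps(core_power_w: float, vcore_v: float) -> float:
    """Estimate core current from core power and core voltage (I = P / V)."""
    return core_power_w / vcore_v

# ~510 W of GPU core power at 1.1 V
current = core_current_amps(510.0, 1.1)
print(round(current))   # 464 A, close to the ~465 A quoted
print(current < 770)    # True: well inside the 770 A core VRM rating
```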


----------



## energie80

Can you flash these bioses on the FE?


----------



## kryptonfly

Laithan said:


> IMO the quickest way myself and others have found to identify memory stability is with Heaven 4.0 (max settings). It usually shows in the first scene.


Hmm, I'm not so sure. I could test at +1820 mem and +200 core at 1.05v and I didn't see any artifacts. +1850 doesn't pass. But as soon as I launched Resident Evil 2 at 4K max it instantly froze the PC. I could play at +1800 mem but didn't see even 1 fps more. Driver 527.37


----------



## yzonker

energie80 said:


> Can you flash nooses on the fe?


No, unfortunately the FE is out of luck again.


----------



## Slipknot79

Since I moved to water-cooling, I have a problem with VRAM stability. I have read that this is a temp issue, due to temps being too low. My temps are around 36-40°C in-game. I would like to increase them up to 60°C to regain stability. What solutions could help? Leave the thin foil on the VRAM pads? Or half coverage of the VRAM chips with pads? What else could help?


----------



## jootn2kx

kryptonfly said:


> Hmm, I'm not so sure. I could test at +1820 mem and +200 core at 1.05v and I didn't see any artifacts. +1850 doesn't pass. But as soon as I launched Resident Evil 2 at 4K max it instantly froze the PC. I could play at +1800 mem but didn't see even 1 fps more. Driver 527.37
> View attachment 2588118
> View attachment 2588119


Same here +1700 memory without any artifacts with the Galax bios


----------



## yzonker

When I tested the Galax bios the other day, I didn't see any significant change in core/mem OC's. Hopefully I'll have time today to play with it now that 3DMark recognizes it.


----------



## jootn2kx

yzonker said:


> When I tested the Galax bios the other day, I didn't see any significant change in core/mem OC's. Hopefully I'll have time today to play with it now that 3DMark recognizes it.


With this bios it's better to compare the frame rates; strangely, I have lower clock speeds but I'm getting higher framerates in all benchmarks and games.
Also the GPU is running 5 degrees hotter with the Galax bios without showing more usage/watts, so it's definitely boosting something xp


----------



## yzonker

jootn2kx said:


> With this bios you get a way higher "Average clock frequency", which has more impact than the nominal clock speeds.
> It's better to compare the frame rates between the 2 bioses


Well I showed that originally in this thread.


----------



## Panchovix

yzonker said:


> When I tested the Galax bios the other day, I didn't see any significant change in core/mem OC's. Hopefully I'll have time today to play with it now that 3DMark recognizes it.


Maybe cards that were already good didn't get much of a difference? While my chip was barely stable above 2970-2985 MHz at 1.1V, now it breezes at 3030 MHz without issues. Haven't tested more than that actually. My VRAM OC is the same though


----------



## Laithan

Panchovix said:


> Maybe cards that were already good didn't get much of a difference? While my chip was barely stable above 2970-2985 MHz at 1.1V, now it breezes at 3030 MHz without issues. Haven't tested more than that actually. My VRAM OC is the same though


Older architectures like Maxwell could actually push up to around 1.3v on the core but software never showed more than 1.275v... I wonder if history is repeating here: showing a max of 1.1v in software but actually pushing slightly more with this BIOS? It also aligns with the slight temp increases.


----------



## newls1

Can anyone confirm mem performance clock for clock is the same or worse on this 666 bios?


----------



## Krzych04650

Did anyone notice these strange frametime issues in some games in fullscreen? Something like this:









For me it can be resolved completely by either running borderless or reducing the render queue to 1 or 0 with low latency mode, depending on the game, like here









It is not exactly stuttering, but it reduces the 1% and 0.1% lows a lot. For example, in Plague Tale, normal is 130 avg / 115 1% / 112 0.1%, and with this strange issue it is 130 avg / 86 1% / 82 0.1%


----------



## kryptonfly

jootn2kx said:


> Same here +1700 memory without any artifacts with the Galax bios


I'm still on the stock Gigabyte Gaming OC bios for now. My card runs 8-10°C hotter than the usual Gaming OC. I will flash when I have the Bykski WB.


Laithan said:


> Older architectures like Maxwell could actually push up to around 1.3v on the core but software never showed more than 1.275v... I wonder if history is repeating here: showing a max of 1.1v in software but actually pushing slightly more with this BIOS? It also aligns with the slight temp increases.


Maybe my theory is right: there is some voltage increase, but software can't see more than 1.1v... because P(W) = U(V) x I(A), and if it produces more heat, more perf, and more watts (650W+), obviously voltage or current is higher. Der8auer made a volt-mod and GPU-Z couldn't show the true voltage, so maybe it's more than 1.1v with the Galax bios. Probably, I should say.
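To see the shape of that argument: if current were pinned, the extra watts would have to show up as voltage. Solving P = U x I for U (illustrative numbers only, not measurements):

```python
def implied_voltage(power_w: float, current_a: float) -> float:
    """Solve P = U * I for U."""
    return power_w / current_a

# At a reported 1.1 V, 650 W implies roughly 591 A of core current.
current = 650.0 / 1.1

# If that current stayed fixed while board power readings rose to 680 W,
# the real core voltage would have to be higher than what software shows:
print(round(implied_voltage(680.0, current), 3))  # 1.151 V under these assumptions
```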


----------



## Panchovix

Laithan said:


> Older architectures like Maxwell could actually push up to around 1.3v on the core but software never showed more than 1.275v... I wonder if history is repeating here: showing a max of 1.1v in software but actually pushing slightly more with this BIOS? It also aligns with the slight temp increases.


It makes sense tbh, since not even the 4090 Strix at 1.1V did wonders like this HoF VBIOS does at "1.1V". Gonna keep that in mind tho; I'll only test "1.1V" for benches, and for the rest I will be undervolting (this VBIOS also definitely helps get the effective clocks as close as possible to the reported clocks).


----------



## yzonker

kryptonfly said:


> I'm still on the stock Gigabyte Gaming OC bios for now. My card runs 8-10°C hotter than the usual Gaming OC. I will flash when I have the Bykski WB.
> Maybe my theory is right: there is some voltage increase, but software can't see more than 1.1v... because P(W) = U(V) x I(A), and if it produces more heat, more perf, and more watts (650W+), obviously voltage or current is higher. Der8auer made a volt-mod and GPU-Z couldn't show the true voltage, so maybe it's more than 1.1v with the Galax bios. Probably, I should say.


There are other voltages that affect max clock speed and more importantly effective clocks. You can see some of them at least in the EVGA classy tool. 



https://www.overclock.net/attachments/timespy_21112-jpg.2474334/


----------



## Teussi

Have I lost the silicon lottery? My Gaming Trio only does +110 core and +1500 mem. I can run higher core and it passes benchmarks, but +110 is really only stable in ray-tracing games.


----------



## newls1

Teussi said:


> Have I lost the silicon lottery? My Gaming Trio only does +110 core and +1500 mem. I can run higher core and it passes benchmarks, but +110 is really only stable in ray-tracing games.


drop your mem and see if the core increases. I had FC6 CTDing for a few days and found it to be my mem @ +1500. Loaded up the Heaven benchmark with cranked settings and artifacting started. I dropped to +1410 and it seems OK now. I increased my core to 3090MHz now, whereas before 3060 was tops.


----------



## Panchovix

yzonker said:


> There are other voltages that affect max clock speed and more importantly effective clocks. You can see some of them at least in the EVGA classy tool.
> 
> 
> 
> https://www.overclock.net/attachments/timespy_21112-jpg.2474334/


Wondering how much those other voltages affect both the OC and the durability (degradation) of the core. If those voltages are changed on the HoF VBIOS, then they definitely seem to have helped in my case.




newls1 said:


> Can anyone confirm mem performance clock for clock is the same or worse on this 666 bios?


At least on my TUF 4090 with the HoF VBIOS, I have the same VRAM overclock (+1100MHz) and more core overclock (3030MHz vs 2970MHz stable).




Teussi said:


> Have I lost the silicon lottery? My Gaming Trio only does +110 core and +1500 mem. I can run higher core and it passes benchmarks, but +110 is really only stable in ray-tracing games.


+110 on the core doesn't mean much without knowing the stock boost. For example, if your stock boost was 2920MHz, +110 would be 3030MHz, and that would be pretty good at 1.05V. 
+1500 VRAM is average / a bit above average.


----------



## Sheyster

Panchovix said:


> Maybe cards that were already good didn't get much of a difference? While my chip was barely stable above 2970-2985 MHz at 1.1V, now it breezes at 3030 MHz without issues. Haven't tested more than that actually. My VRAM OC is the same though


My VRAM OC is also the same. No question that this BIOS delivers more FPS than the ASUS V2 BIOS at a given clock speed. I will probably stay with it as my daily, at default settings. The only odd behavior I noticed is one of the fans (Fan 1 in GPU-Z) jumping up to ~2700 RPM at some point during a gaming session. Core temps were fine though, 64 max.


----------



## yzonker

Panchovix said:


> Wondering how much those other voltages affect both the OC and the durability (degradation) of the core. If those voltages are changed on the HoF VBIOS, then they definitely seem to have helped in my case.
> 
> 
> 
> At least on my TUF 4090 with the HoF VBIOS, I have the same VRAM overclock (+1100MHz) and more core overclock (3030MHz vs 2970MHz stable).
> 
> 
> 
> +110 on the core doesn't mean much without knowing the stock boost. For example, if your stock boost was 2920MHz, +110 would be 3030MHz, and that would be pretty good at 1.05V.
> +1500 VRAM is average / a bit above average.


I found this over on EVGA forum posted by Sajin. Was trying to remember which was mem voltage mainly.

Nvvdd = Gpu die voltage
Fbvdd = Memory voltage
Msvdd = Secondary gpu die voltage. Has a lot to do with stabilizing the internal clock speed of the chip, aka the frequency you can't see.

I kinda doubt Galax has done anything in their production bios that would hurt the card. Although it raises the question of why none of the other AIBs did this.


----------



## RaMsiTo

galax bios 666w



















I scored 20 408 in Time Spy Extreme


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## yzonker

BTW I did mention to the 3DMark support person I contacted that this new bios was going to boost scores higher. Hoping to help them at least some on sorting out bugged runs vs good runs.


----------



## ttnuagmada

Teussi said:


> Have I lost the silicon lottery? My Gaming Trio only does +110 core and +1500 mem. I can run higher core and it passes benchmarks, but +110 is really only stable in ray-tracing games.


If it makes you feel any better, I can't go above +1200 on vram


----------



## yzonker

Made a couple of runs with different VF curves. First one is just a straight +210/+1750.









I scored 29 397 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





The 2nd one uses a type of curve that I had good success with on the 30 series, but not so much on the 4090. It appears to have much higher clocks, but the effective clock drops with it, so it ends up being about the same on the 4090.

So don't be misled into thinking I found a bunch more core clock with the Galax. I think there is a little more, as the 3090 I did above is about the best I've seen with my card. Mem is very slightly better too. Maybe 30-50.









I scored 29 416 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## Nd4spdvn

ttnuagmada said:


> If it makes you feel any better, I can't go above +1200 on vram


... And mine locks hard at +1000 mem. Rock solid at +968 though. This is since the day I got the card irrespective of its bios, including the 666W Galax, which is great btw on this Suprim X 4090 still on air. Who's gonna beat me (going downwards)?


----------



## Shocchiz

I tried the Galax bios on my Phantom.
Memory oc is the same.
I gained +15mhz on every profile (0.95v, 0.975v, 1.1v).
Average clocks are higher.
Power consumption is about 10W _lower_ (every profile, tested with Requiem).
Temps are about the same (fan profile is really different so it’s difficult to make a direct comparison).
I like this bios but, in the end, game framerates are really the same if you don't overvolt or push the PL to the max, and only a tiny bit higher if you do (not remotely worth the insanely higher power consumption imho).


----------



## Betroz

On my TUF OC 4090, +200 core and +1400 vram at 1.1v seems stable so far in BF2042 at 3840x1600 Ultra settings. Haven't tried higher mem speeds yet. The core sits at 2955 to 2970 MHz in game (BF2042 is the only game I've tried so far). My card is handicapped by PCI-E 3.0 and CPU/RAM speeds anyway. It would be nice to have a CPU/RAM combo that gives me 30-50% more performance.


----------



## Glerox

I finally installed my 4090 yesterday. What an ABSOLUTE BEAST. I don't remember seeing such a gap between 2 generations. I'm currently playing with overclocking this monster. This kind of fun only happens every 2 years haha. I'm waiting for the EKWB water block (gigabyte RTX 4090 GAMING OC) but the stock cooler is doing an awesome job! 

My questions are :
-What is the average OC we see on this card? I've read around +150 on core, +1000 on vram?
-Do you guys force constant voltage or not?

Finally a card that I don't need to shunt mod! I don't reach power limit ever. Amazing.


----------



## J7SC

Laithan said:


> Older architechtures like Maxwell could actually push up to around 1.3v on core but software never showed more than 1.275v... I wonder if history is repeating here showing a max of 1.1v in software but is actually pushing slightly more with this BIOS? Also aligns with slight temp increases.


...those were the good old Maxwell & Co days ...and with some earlier Kepler models, one could just extract the vbios, rename it as a text file and edit away on notepad - DIY all the way ! 

I'll continue to use the stock vbios for gaming (dual bios cards are great !); just have to remember to hold the shift key down when shutting down before switching. For gaming on a 48 OLED (some of it in widescreen mode), I rarely even use_ any_ oc and 1.05v or less on the core - typically, that comes to between 370W and 430W peak load, depending on game.

This Galax is great for benching, though...and I wonder a little bit how they accomplish some of it. Assuming that the 1.1v limit and monitoring results are real, it must be tweaking some sort of secondary voltage & other parameters. I had already mentioned the faster apparent polling at idle. With the stock vBios, my best regular (as opposed to irregular🥴) results in Port Royal were just under 29500 under full benching conditions (including lower ambient temps etc)...when I have more time and feel like another benching session at 18 C instead of 25 C, it's time to see what the Galax bios can do in a full bench-mode setting.

What is a bit odd in the few runs I have done so far with it though is that the relationship between peak MHz and effective MHz has changed - with the Galax, they are much closer together, and while various monitoring software tended to agree on peak core MHz (even with 3DM SysInf) with the stock vbios, now that seems different - the min / effective clocks mostly match, but the max clocks diverge more.


----------



## yzonker

FWIW, here's my runs on the Galax bios,









I scored 29 423 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com













I scored 11 294 in Speed Way


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com













I scored 20 522 in Time Spy Extreme


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





My DDR4 board hurts me here in the CPU score relative to DDR5,

w/o reBar,









I scored 38 551 in Time Spy


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





w/ reBar,









I scored 38 359 in Time Spy


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## Slipknot79

Slipknot79 said:


> Since I have moved to water-cooling, i have the problem with VRAM stability. I have read that this is a temp issues, due to too low temps. My temps are around 36-40°C in-game. I would like to increase them up to 60°C to regain stability. What solutions could help? Let the thin foil on the VRAM pads? Or half coverage of the VRAM chips? What else could help?


OK, I have put the thin foil back on the VRAM pads. Temps went up from 36-40°C to 58°C. +1800MHz seems stable for now; long-term testing is needed.


----------



## yzonker

Slipknot79 said:


> Ok, i have put the thin foil back on the VRAM pads. Temps went up from 36-40°C to 58°C. +1800MHz seem to be stable for now, long term testing needed.


Both sides, or just block side?


----------



## kryptonfly

yzonker said:


> Made a couple of runs with different VF curves. First one is just a straight +210/+1750.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 397 in Port Royal
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 2nd one uses a type of curve that I had good success with for 30 series, but not so much on the 4090. It appears to have much higher clocks, but effective drops with it so ends up being about the same on the 4090.
> 
> So don't be misled in to thinking I found a bunch more core clock with the Galax. I think there is a little more as the 3090 I did above is about the best I've seen with my card. Mem is very slightly better too. Maybe 30-50.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 416 in Port Royal
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> View attachment 2588183


Why is my V/F curve limited to 3000?








The minimum voltage that I could see is 865mV for 2565MHz (~2535MHz effective, depending on temp and load) with the stock bios. Stable in Cyberpunk 2077 & co at 870mV.


----------



## yzonker

kryptonfly said:


> Why my V/F curve is limited to 3000 ?
> View attachment 2588205
> 
> The minimum voltage that I could see is 865mv for 2565mhz (~2535mhz effective, depends of temp and load) with stock bios. Stable Cyberpunk 2077 & co at 870mv.


You have to edit the AB config file. Just search it for 3000.
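If it helps, here's a minimal sketch of that search step in Python. The install path is an assumption for a default install, and I'm deliberately not naming the exact setting (it varies by Afterburner version); this just prints candidate lines containing 3000 so you can edit the right one by hand:

```python
# Hypothetical helper: list every line of MSI Afterburner's config that
# mentions "3000", so the VF curve ceiling can be raised manually.
from pathlib import Path

# Assumed default install location; adjust for your system.
CFG = Path(r"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.cfg")

def find_limit_lines(text: str, needle: str = "3000") -> list[tuple[int, str]]:
    """Return (line_number, line) pairs whose line contains the needle."""
    return [(i, line.strip())
            for i, line in enumerate(text.splitlines(), 1)
            if needle in line]

if __name__ == "__main__":
    if CFG.exists():
        for lineno, line in find_limit_lines(CFG.read_text(errors="ignore")):
            print(f"{CFG.name}:{lineno}: {line}")
    else:
        print(f"Config not found at {CFG}")
```

Back up the file first and make the edit with Afterburner closed, since it can rewrite the config on exit.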


----------



## yzonker

Anyone gotten an artifacted run out of the Galax bios? Every time I moved the mem clock up I just got crashes. Never saw a single artifact.


----------



## Nd4spdvn

yzonker said:


> Anyone gotten an artifacted run out of the Galax bios? Everytime I moved the mem clock up I just got crashes. Never saw a single artifact.


Same here with the Galax bios: no artifacts, just freezing when I pushed the mem higher than what I know it can do.


----------



## chibi

Nizzen said:


> *PG27AQDM*


This would be my end game monitor, fingers crossed for real gsync.


----------



## Slipknot79

yzonker said:


> Both sides, or just block side?



One side: VRAM chip -> foil -> pad -> block.
Removed air bubbles between foil and pad.

Speed Way runs stable now. Next test will be Warhammer Darktide, as that game is also a good VRAM OC stability test.
But I can already say that I got VRAM stability back; things have improved.


----------



## mirkendargen

Slipknot79 said:


> One side: VRAM-chip -> foil -> pad -> block
> Removed air bubbles between foil and pad.
> 
> Speed way runs stable now. Next test will be Warhammer Darktide, as this game is also a good VRAM oc stability test.
> But i can already say, that i got VRAM stability back, things became better.


And people all thought I was crazy for suggesting keeping the backing on one side of the thermal pads. I would have done it on the block side though, since that's the flatter surface, and let the pads contour around the VRAM chips.


----------



## KingEngineRevUp

newls1 said:


> drop your mem and see if core increases. I have FC6 CTD for a few days and found it to be my mem @ +1500. Loaded up Heaven Benchmark with cranked settings and artifacting started. I dropped to 1410 and seems ok now. I increased my core to 3090Mhz now where as before 3060 was tops.


If you're going to have to choose between memory or core, better memory is the way to go unless your core can do like +200-300.


----------



## Glerox

Update on first day of overclocking with Gigabyte RTX 4090 Gaming OC with stock cooler/stock bios (OC) paired with 7700x :

My best result :

+210 core
+1650 mem
+133% PL
+100% voltage
Force constant voltage

28001 in Port Royal

Did I get a good card?


----------



## dr/owned

J7SC said:


> ...I don't scare easily, but this Galax vbios...
> 
> Actually, it is just what the doctor ordered...I haven't pushed clocks or PL 'all the way' yet but more than 660 W at 115% out of 121% is enough . I used OCCT at 115% to generate the max wattskies you see below. Ambient for this was 25 C, and for 'regular' testing, I used Superposition 8K. VRAM definitely picking up some extra warmth, which is what I wanted.


I just want the 1000W-er so I can finish her off:










For power monitoring I just watch my whole-house meter, since it has freakishly accurate current clamps on the mains. I was trying to find the most hardcore Furmark settings, and this one gave a 1400W -> 2218W jump:
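For anyone converting a wall-meter delta like that into card-side numbers, here's a rough sketch; the 90% PSU efficiency is purely my assumption, so check your PSU's efficiency curve at high load:

```python
# Estimate DC-side draw from an AC wall-power delta, assuming a fixed PSU
# efficiency (assumed 90% here; real efficiency varies with load).
def dc_load_from_wall(delta_watts: float, psu_efficiency: float = 0.90) -> float:
    """AC delta at the meter scaled down to the DC side of the PSU."""
    return delta_watts * psu_efficiency

if __name__ == "__main__":
    wall_delta = 2218 - 1400  # W at the mains, Furmark on vs. off
    print(f"~{dc_load_from_wall(wall_delta):.0f} W of extra DC load")
```

On those assumptions, an 818W wall jump is roughly 730-740W of DC load, split between the GPU and whatever extra CPU/fan work Furmark causes.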


----------



## Baasha

Guys,

I started getting a black screen error on one of my 4090 FEs (Z690 rig) yesterday evening. While playing a game (CoD Vanguard), the screens went black, the fans spun up (to 100%?), and I had to hard-reset to get back into Windows.

Anyway, got spooked and checked the 16-pin connection - no problems there; nothing melted/damaged etc.

I got my Cable-Mod 12VHPWR cable earlier in the week and installed it - made sure I pushed it in all the way and things seemed to work fine until yesterday.

Am on the latest driver 527.56, installed Nvidia Firmware for the GPU, installed latest BIOS for mobo - Asus Z690 Extreme (BIOS v. 2204), latest Intel ME etc.

Can't figure out what's causing it. Ran some Time Spy Ultra - ran fine. Ran Port Royal - ran fine. The 2nd time running Port Royal caused the black screen issue.

Played GTAV, was able to play for ~10 mins before the black screen. Happens in other games as well.

Is it the CableMod cable? Or is it the GPU itself?

I'm really concerned and would appreciate any help with this.

Thanks in advance.


----------



## Frosted racquet

@Baasha OCed or stock? Air cooled or watercooled?


----------



## J7SC

...in the old days, this would have been nice benching weather, but now it is probably too cold for 4090's VRAM . Besides, Grouse Mtn. is only 10 min from here, so time for a quick trip to refuel.


----------



## Baasha

Frosted racquet said:


> @Baasha OCed or stock? Air cooled or watercooled?


OC'd, power slider at 133%. +200 on core and +1500 on Mem. Air-cooled (i.e. 4090 FE)

Will try with it stock to test it.. only happened when gaming (OC'd) and 3DMark Port Royal (OC'd).


----------



## yzonker

Baasha said:


> OC'd, power slider at 133%. +200 on core and +1500 on Mem. Air-cooled (i.e. 4090 FE)
> 
> Will try with it stock to test it.. only happened when gaming (OC'd) and 3DMark Port Royal (OC'd).


Try lowering the memory.


----------



## Faltzer

I wonder if anyone here knows what the difference is between the Gigabyte Waterforce 4090 rev 1.0 and rev 1.1.
On the specs page the newer rev (1.1) has a 1000W PSU requirement, while the older rev 1.0 requires 850 watts.

My Waterforce 4090 rev 1.0 scores 27600 at max with around +190 core / +1350 mem, and I have managed to pull 600 watts in Furmark. But I've come to the realization that ~27.5k in Port Royal isn't that great a score. I have a 12900KS at 5.3 P-cores / 5.2 E-cores, and 64GB of 6200 RAM in two sticks. I'm beginning to think it's a good card, but not for overclocking. (I might open a dedicated thread for this question.)


----------



## bmagnien

Nd4spdvn said:


> Same here with the Galax bios, no artifacts just freezing when pushed the mem higher than what I know it can do.


This is an artifacted run of mine: +1700 mem on the Galax bios. +1690 gave me no artifacts and a highest 'legit' run of 29,305.








I scored 29 737 in Port Royal


AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## yzonker

bmagnien said:


> this is an artifacted run of mine: +1700 mem on Galax bios. +1690 gave me no artifacts and highest 'legit' run of 29,305.
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 737 in Port Royal
> 
> 
> AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


Well I thought it was probably too good to be true (that the bios wouldn't artifact).


----------



## chibi

What's the go to 4090 to get right now that's not super exotic and hard to get? Is the Galax HOF still the halo card that's very scarce and we have to go with one of the normal vendors such as ASUS/MSI/Zotac?


----------



## newls1

chibi said:


> What's the go to 4090 to get right now that's not super exotic and hard to get? Is the Galax HOF still the halo card that's very scarce and we have to go with one of the normal vendors such as ASUS/MSI/Zotac?


Giga Gaming OC... solid card to choose, and availability seems good at times.


----------



## newls1

KingEngineRevUp said:


> If you're going to have to choose between memory or core, better memory is the way to go unless your core can do like +200-300.


My memory gives up the ghost at anything faster than +1425, so I leave it at +1410. Core is good for +220 (3090MHz @ 1.1v). I'm happy; it's all I need, as it's already stupid fast for anything I need it to do.


----------



## Baasha

yzonker said:


> Try lowering the memory.


Well, I think I figured it out - the metallic cover on the Cable-Mod cable (that covers the GPU-side connection) was somehow preventing the connection from being 100% in - I removed that cover and plugged in the cable and have been playing all evening - CoD, GTAV etc etc. and not a single crash - with the same OC settings etc.

So PSA to those who have the Cable-Mod cables - make sure your cable is 100% plugged in - remove the metallic (or plastic if you didn't get that option) cover to really push it in there!

Previously, I couldn't even get 1 complete round in CoD before it would shut off - I just played for 2+ hours without pause and there were no hiccups at all!

I will keep 'testing' but so far so good.

EDIT: Welp, that was short-lived!  

It crashed on desktop under no load!

I tried playing CoD MWII and it crashed within 5 minutes. Damn.. this is not good.

I think I'll try using the stock adapter/cables and see if that works. Any other ideas!?!


----------



## th3illusiveman

Baasha said:


> Well, I think I figured it out - the metallic cover on the Cable-Mod cable (that covers the GPU-side connection) was somehow preventing the connection from being 100% in - I removed that cover and plugged in the cable and have been playing all evening - CoD, GTAV etc etc. and not a single crash - with the same OC settings etc.
> 
> So PSA to those who have the Cable-Mod cables - make sure your cable is 100% plugged in - remove the metallic (or plastic if you didn't get that option) cover to really push it in there!
> 
> Previously, I couldn't even get 1 complete round in CoD before it would shut off - I just played for 2+ hours without pause and there were no hiccups at all!
> 
> I will keep 'testing' but so far so good.
> 
> EDIT: Welp, that was shortlived!
> 
> It crashed on desktop under no load!
> 
> I tried playing CoD MWII and it crashed within 5 minutes. Damn.. this is not good.
> 
> I think I'll try using the stock adapter/cables and see if that works. Any other ideas!?!


I had no crashes on my FE with the default adapter and the CableMod cable. Good call on removing the shroud though; I had to do the same on my cable because it stopped the connections from seating 100%. I don't know why they didn't troubleshoot that before sale.

Regarding the crashes, it's always a good idea to isolate the part you suspect (the GPU in your case), so run everything else stock (CPU, RAM at XMP defaults, etc.) just to make sure it's the GPU playing up.


----------



## Laithan

MSI Suprim Liquid X 4090 with the MSI 600W BIOS vs Galax 666W BIOS @ identical settings









Result







www.3dmark.com


----------



## J7SC

Laithan said:


> MSI Suprim Liquid X 4090 with the MSI 600W BIOS vs Galax 666W BIOS @ identical settings
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Result
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> View attachment 2588247


...nice and useful comparison. How's your temp delta on and between core and VRAM when comparing the two vbios, i.e. with GPU-Z and/or HWiNFO?

Also, after today's torrent of 'interesting' results at HoF, I think we'd better start learning how to bench with ECC on. I wonder if there are any special steps to take compared to benching with regular VRAM settings.

---

EDIT - some fascinating vids on a 4090 Strix repair attempt, including from Krisfix.de !


----------



## Youngtimer

Unreal content from Krisfix.de


----------



## Frosted racquet

Baasha said:


> I think I'll try using the stock adapter/cables and see if that works. Any other ideas!?!


It's not the cables. Lower your VRAM OC.


----------



## newls1

Frosted racquet said:


> It's not the cables. Lower your VRAM OC.


Agreed... Man, games on my +1450 to +1500 mem OC were CTDing and I kept thinking it was my GPU core OC. Nope... I think it's temp related though, and this might be one of those times where my WB is working against me. It seems I'm fine if I get mem temps over ~50c, but below that, +1410 is where she's happy. Mostly my mem temp is in the 25-30c range though, so maybe when summer hits I can increase my OC (never thought I'd ever say that; it's completely backwards!)


----------



## elbramso

Laithan said:


> MSI Suprim Liquid X 4090 with the MSI 600W BIOS vs Galax 666W BIOS @ identical settings
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Result
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> View attachment 2588247


What's the difference between these BIOS versions other than the power target? My card is never power starved, but this cold-mem bug drives me crazy. It's pretty much like the colder my water gets, the fewer points I score...
Is this some kind of weird overcurrent protection for the memory?


----------



## wirx

Galax BIOS rules!
Gigabyte Waterforce 4090 with the 666W Galax BIOS, room temp 24°C, stock AIO, fans at max.








I scored 29 356 in Port Royal


AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com












I scored 11 198 in Speed Way


AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com












I scored 18 093 in Time Spy Extreme


AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





I previously used the Asus 600W bios and made runs at the same MHz settings.








Result







www.3dmark.com




28 537 vs 29 174
I tried everything with the Asus bios but can't go higher in PR (3DMark gives an error and shuts down), yet with the Galax I get ~650 points (+2 fps) more.









With the Galax I was able to add more MHz, resulting in a 22123 graphics score (I still use the "old" AMD 5900X).
Here is a screenshot with GPU-Z running; max board power draw was 650W.

















I scored 18 060 in Time Spy Extreme


AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





I also tried the latest Nvidia driver 31.0.15.2756 and compared it with the previous 31.0.15.2737.
With the newer 31.0.15.2756 driver, the max stable GPU MHz was ~50MHz lower than with the older driver.
There were no visible artifacts in any test.
Btw, the PSU is a Corsair RM850X 850W and I use the stock cables that came with the Gigabyte card.

There are quite a few guys above me in the 3DMark top who don't use waterblocks or liquid nitrogen, so the Gigabyte Waterforce was a good choice. The card is really quiet in games and temps stay at 50-60c, with fans under 1000rpm 95% of the time.


----------



## Faltzer

wirx said:


> Galax BIOS rules!
> Gigabyte Waterforce 4090 witx 666W Galax BIOS, room temp 24, stock AIO, fans max
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 356 in Port Royal
> 
> 
> AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 198 in Speed Way
> 
> 
> AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 18 093 in Time Spy Extreme
> 
> 
> AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> I used before Asus 600W bios and made runs with same Mhz settings.
> 
> 
> 
> 
> 
> 
> 
> 
> Result
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 28 537 vs 29 174
> I tried everything with Asus but can´t go higher in PR, 3dmark gives error and shuts down, but with Galax I get ~650 points (+2fps) more.
> View attachment 2588287
> 
> 
> With Galax I was able to add more Mhz, resulting 22123 graphic score(I still use "old" AMD 5900x)
> Here is screenshot with GPU-Z working, max board power draw was 650W
> 
> View attachment 2588288
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 18 060 in Time Spy Extreme
> 
> 
> AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> I also tried latest nvidia driver 31.0.15.2756 and compared previos 31.0.15.2737.
> With newer 31.0.15.2756 driver max stable GPU MHz was ~50MHz lower than older driver.
> There were no visible artifacts with any test.
> btw PSU is Corsair RM850X 850W and I use stock cables, what come with Gigabyte card.
> 
> There are quite a few guys in 3Dmarks top before me who don´t use waterblocks or liquid nitrogen, Gigabyte Waterforce was good choise. Card is realy quiet in games and temps stays 50-60c, fans less than 1000rpm 95% of times.


Impressive! Can you check whether your card is rev 1.0 or rev 1.1?


----------



## wirx

The card is rev 1.0; also, there is no voltage curve, just 100%.
Rebar is enabled.


----------



## LtMatt

Krzych04650 said:


> Did anyone notice those strange framtime issues in some games in fullscreen? Something like this:
> View attachment 2588142
> 
> 
> For me it can be resolved completely by either running borderless or reducing render que to 1 or 0 with low latency mode depending on the game, like here
> View attachment 2588143
> 
> 
> It is not exactly stuttering, but it reduces 1% and 0.1% lows a lot. For example in Plague Tale here normal is 130 avg/115 1%/112 0.1% and this strange issue is 130 avg/86 1%/82 0.1%


I'm actually getting this in some games too. Can you share a screenshot of how you fixed it ?


----------



## simonabamber

Baasha said:


> Well, I think I figured it out - the metallic cover on the Cable-Mod cable (that covers the GPU-side connection) was somehow preventing the connection from being 100% in - I removed that cover and plugged in the cable and have been playing all evening - CoD, GTAV etc etc. and not a single crash - with the same OC settings etc.
> 
> So PSA to those who have the Cable-Mod cables - make sure your cable is 100% plugged in - remove the metallic (or plastic if you didn't get that option) cover to really push it in there!
> 
> Previously, I couldn't even get 1 complete round in CoD before it would shut off - I just played for 2+ hours without pause and there were no hiccups at all!
> 
> I will keep 'testing' but so far so good.
> 
> EDIT: Welp, that was shortlived!
> 
> It crashed on desktop under no load!
> 
> I tried playing CoD MWII and it crashed within 5 minutes. Damn.. this is not good.
> 
> I think I'll try using the stock adapter/cables and see if that works. Any other ideas!?!


I had the same issue with my CableMod cable: random crashes. CableMod have RMA'd the cable and the new one is with FedEx, due for delivery on Wednesday. If it turns up then, it'll be 8 days from reporting the issue to having a new cable.


----------



## elbramso

wirx said:


> Galax BIOS rules!
> Gigabyte Waterforce 4090 witx 666W Galax BIOS, room temp 24, stock AIO, fans max
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 356 in Port Royal
> 
> 
> AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 198 in Speed Way
> 
> 
> AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 18 093 in Time Spy Extreme
> 
> 
> AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> I used before Asus 600W bios and made runs with same Mhz settings.
> 
> 
> 
> 
> 
> 
> 
> 
> Result
> 
> 
> 
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 28 537 vs 29 174
> I tried everything with Asus but can´t go higher in PR, 3dmark gives error and shuts down, but with Galax I get ~650 points (+2fps) more.
> View attachment 2588287
> 
> 
> With Galax I was able to add more Mhz, resulting 22123 graphic score(I still use "old" AMD 5900x)
> Here is screenshot with GPU-Z working, max board power draw was 650W
> 
> View attachment 2588288
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 18 060 in Time Spy Extreme
> 
> 
> AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> I also tried latest nvidia driver 31.0.15.2756 and compared previos 31.0.15.2737.
> With newer 31.0.15.2756 driver max stable GPU MHz was ~50MHz lower than older driver.
> There were no visible artifacts with any test.
> btw PSU is Corsair RM850X 850W and I use stock cables, what come with Gigabyte card.
> 
> There are quite a few guys in 3Dmarks top before me who don´t use waterblocks or liquid nitrogen, Gigabyte Waterforce was good choise. Card is realy quiet in games and temps stays 50-60c, fans less than 1000rpm 95% of times.


Is it the *95.02.18.C0.66*? I'm still wondering what the difference could be. OK, I guess this BIOS makes more sense for AIO or air-cooled cards, as you won't be able to push 666W with a watercooled card anyway...


----------



## alasdairvfr

Glerox said:


> Update on first day of overclocking with Gigabyte RTX 4090 Gaming OC with stock cooler/stock bios (OC) paired with 7700x :
> 
> My best result :
> 
> +210 core
> +1650 mem
> +133% PL
> +100% voltage
> Force constant voltage
> 
> 28001 in Port Royal
> 
> Did I get a good card?


I'd say pretty good, yeah. Did you force rebar on? Thats 3-400 pts right there.

What's your actual core at +210?

Its pretty much on par with my card, I hit 28.3k on PR but I'm on a 5950x and needed to do some CPU/RAM tweaking (vanilla PBO I was at 27.5k) to get there.


----------



## wirx

Yes, it's the 95.02.18.C0.66 BIOS - GALAX RTX 4090 VBIOS.
Attached is the GPU-Z log from that 22067 run.
GPU temperature and delta show quite a big difference, but this is an AIO card drawing over 200W more than the default 450W.


----------



## alasdairvfr

Baasha said:


> Well, I think I figured it out - the metallic cover on the Cable-Mod cable (that covers the GPU-side connection) was somehow preventing the connection from being 100% in - I removed that cover and plugged in the cable and have been playing all evening - CoD, GTAV etc etc. and not a single crash - with the same OC settings etc.
> 
> So PSA to those who have the Cable-Mod cables - make sure your cable is 100% plugged in - remove the metallic (or plastic if you didn't get that option) cover to really push it in there!
> 
> Previously, I couldn't even get 1 complete round in CoD before it would shut off - I just played for 2+ hours without pause and there were no hiccups at all!
> 
> I will keep 'testing' but so far so good.
> 
> EDIT: Welp, that was shortlived!
> 
> It crashed on desktop under no load!
> 
> I tried playing CoD MWII and it crashed within 5 minutes. Damn.. this is not good.
> 
> I think I'll try using the stock adapter/cables and see if that works. Any other ideas!?!


My money would be on VRAM losing stability due to temperature. Not sure where you are, but it's starting to get cold in Canada, and even air-cooled cards in colder ambient conditions are losing memory stability. Crashing early on, or around idle, suggests to me the memory hasn't had a chance to warm up when a load hits it at a higher frequency. Try running at a lower clock, or even at stock, and track your temperatures if you plan to trigger the crash.

There's a chance your crash is generating an event in Event Viewer that might help diagnose it, on the off chance another system component is the culprit. Also, a new GPU that draws several hundred watts could push a tired power supply over the edge.


----------



## yzonker

elbramso said:


> What's the difference between these BIOS versions other than the power target? My card is never power starving but this cold mem bug drives me crazy. It's pretty much like the colder my water gets the less points I score...
> Is this some kind of wierd overcurrent protection for the memory?


The Galax bios runs at a higher effective clock than any of the other AIB bios and performs quite a bit better because of it.

I've found, as everyone else has, that mem OC falls off as mem temps go down. Using my chiller, I've also found there is a sudden performance loss around 10-12C water temp; it's right at 300 pts in Port Royal. So I always make runs in the 12-15C water temp range. The Galax bios unfortunately does not fix this issue.

I'm also using a heater on the backplate to keep the mem warm, but that has no effect on the loss tied to water temp, which kicks in at 10-12C regardless of mem temp. The heater does allow me to stay close to my max mem OC from the original air cooler (+1750 water vs +1800 air).


----------



## elbramso

yzonker said:


> The Galax bios runs at a higher effective clock than any of the other AIB bios and performs quite a bit better because of this.
> 
> I've found as everyone else that mem OC falls off as mem temps go down. Using my chiller, I've also found that there is a sudden performance loss around 10-12C water temp. It's right at 300 pts in Port Royal. So I always make runs in the 12-15C range water temp. The Galax bios unfortunately does not fix this issue.
> 
> I'm also using a heater on the backplate to keep the mem warm, but that has no effect on the water temp that causes the big performance loss. It stays in the 10-12C range irregardless to the mem temp. But the heater does allow me to stay close to my max mem OC with the original air cooler (+1750 water vs +1800 air).


Well I guess I have to test it against the 600w MSI BIOS on my Suprim


----------



## Betroz

Is there a free Nvidia tech demo or something that I can use to test OC and cooler performance? Yes I know about Heaven 4.0, but that is old. Do I have to buy 3DMark... Not really a benchmark guy anymore.


----------



## Faltzer

Betroz said:


> Is there a free Nvidia tech demo or something that I can use to test OC and cooler performance? Yes I know about Heaven 4.0, but that is old. Do I have to buy 3DMark... Not really a benchmark guy anymore.


Search Google for a 3DMark key; you can find one for a fraction of the original Steam price.


----------



## elbramso

Ok, the Galax BIOS does in fact push my score higher even at lower clocks...
It seems like this BIOS is heavily optimized for benchmarks


----------



## Faltzer

I can confirm alongside other users here that the Galax bios is indeed better performing. 
BUT! 
Is it not dangerous in any way to run higher than the specced wattage? My 4090 Waterforce ran at almost 651 watts, while the 12VHPWR cable is only rated for 600 watts. The stock vbios didn't run above 530, and the original update decreased it to under 500.

I am afraid to run FurMark, as that will probably push it over 700 watts

Couldn't this cause potential card damage?
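For a rough sense of the margin being discussed: the 600 W rating over six 12 V pins works out to about 8.3 A per pin against the connector's roughly 9.5 A per-contact spec, so ~650 W eats into that margin without quite exceeding it. That assumes the current actually splits evenly, which a worn or poorly seated connector doesn't guarantee. Back-of-envelope only:

```python
def per_pin_current(power_w, volts=12.0, pins=6):
    """Idealized per-pin current for a 12VHPWR connector,
    assuming the load splits evenly across its six 12 V pins."""
    return power_w / volts / pins

# 600 W rating -> ~8.33 A/pin; 650 W -> ~9.03 A/pin vs a ~9.5 A contact spec
for w in (600, 650, 700):
    print(w, round(per_pin_current(w), 2))
```

The even-split assumption is the big caveat; real-world connector failures are usually one or two pins carrying far more than their share.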


----------



## Nizzen

Faltzer said:


> I can confirm alongside other users here that the Galax bios is indeed better performing.
> BUT!
> Is it not dangerous in any way to run higher than the specced wattage? My 4090 Waterforce ran at almost 651 watts, while the 12VHPWR cable is only rated for 600 watts. The stock vbios didn't run above 530, and the original update decreased it to under 500.
> 
> I am afraid to run FurMark, as that will probably push it over 700 watts
> 
> Couldn't this cause potential card damage?


We ran 650w on 2x 8-pin in the old days with the 780 Ti Classified and an EVBot... One 8-pin is "rated" for 150w 

A long extension cable isn't the best; that's why I'm using the original Seasonic cable connected directly to the PSU. That cable is heavier gauge than the Nvidia one, and it looks like quality too


----------



## Krzych04650

LtMatt said:


> I'm actually getting this in some games too. Can you share a screenshot of how you fixed it ?


Either setting to borderless (doesn't fix all games, but most) or setting Low Latency Mode in NVIDIA Control Panel to On or Ultra depending on the game. FPS limiter works as well. It seems to happen only under DX11, mostly in fullscreen mode, and with full GPU saturation with unlocked framerate. Also the issue is not replicable every time for some reason. I did try to tweak some things like disabling HAGS, disconnecting second monitor, quiting all background apps and etc but nothing works.


----------



## SilenMar

It's more like the new driver 527.56 fixes tons of bugs and improves OC stability. 
I get lower power draw after the driver update, but the OC range increased by 15MHz. 
The Galax BIOS probably doesn't have much to do with it.


----------



## dk_mic

EKWB support comment on why they don't supply a dual slot bracket with their MSI 4090 waterblocks:



> Sadly, the custom-made dual-slot bracket will not be included.
> 
> 
> The 40 series FE included it since the original bracket has a gigantic hole that leaves the case weirdly open if you re-install the card with a GPU block.
> The 40 series Strix/TUF included it to stop the card from falling apart when the cooler is removed, it has to be attached to the block itself because it is not secured to the PCB at all.
> 
> Neither of these two scenarios applies to the Master, Trio, or Zotac; consequently, they did not include a single-slot bracket.


----------



## katates

Hello guys, I have the Zotac 4090 AMP Extreme AIRO. Everything is great so far, except the fans start to spin at 48 Celsius and stop once the card drops back to 40, which causes an endless, annoying loop. I tried to override the fan curve, but no luck; the custom fan curve won't let me set 0%.

Did anyone encounter this problem? I want 0 RPM until 55 Celsius.

Or I have something else in mind. I don't have much experience, but since the card has dual BIOS, I'm assuming using NVFlash with some other GPU's BIOS won't be a problem. If I can't find a solution I will try a Gigabyte 4090 vBIOS, since I've read you can use 0 RPM on its custom fan curve, plus the 600w PL. But I don't know the risks. I'm assuming the LEDs won't work? Maybe the ports?


----------



## gamerMwM

What kind of temps are you guys seeing in Quake RTX?

I'm getting 55 degrees with 21c ambient, pulling a pretty stable 520 watt load. Before I turned up my fans I actually saw 60c.

Have a Corsair waterblock, all stock pads/paste, with an Nvidia FE 4090 and 4 rads.

Coolant temp is 31c, so about a 10 degree delta T.

Nothing else in the loop but the GPU right now.

Sent from my SM-G975U using Tapatalk
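Those numbers pencil out to a fairly typical block: with the figures above, the core-to-coolant thermal resistance is roughly (55 − 31) / 520 ≈ 0.046 C/W, and the 10C coolant-to-ambient delta is what the four rads are working against. A quick sanity-check helper, using the usual lumped-resistance approximation (numbers from the post above):

```python
def thermal_resistance(hot_c, cold_c, power_w):
    """Lumped thermal resistance in degrees C per watt."""
    return (hot_c - cold_c) / power_w

core_to_coolant = thermal_resistance(55, 31, 520)     # die + paste + block
coolant_to_ambient = thermal_resistance(31, 21, 520)  # rads + fans
print(round(core_to_coolant, 3), round(coolant_to_ambient, 3))
```

Useful for before/after comparisons when repadding or repasting: if core-to-coolant resistance drops, the mount improved regardless of what ambient was doing that day.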


----------



## Baasha

Frosted racquet said:


> It's not the cables. Lower your VRAM OC.


It crashed on the desktop when there was NO OC. It also crashed when I was just browsing the web (Chrome). Most of the time, though, it was during gaming. I don't think it's the VRAM tbh... I tested it with the Vulkan tool and it passed without errors at +1600MHz.



simonabamber said:


> I had same issue with my Cablemod cable, crashes randomly. Cablemod have RMA the cable and new one is with Fedex due for delivery on Wednesday, if it turns up on Wednesday it'll be 8 days from telling them about the issue to having a new cable.


Definitely the culprit for me as well - going to RMA it today. Plugged in the original adapter and cables and gamed for ~2.5 hours last night, OC'ing the GPU with +100mV on the voltage, +1600MHz on the VRAM, and +230 on the core, and it was rock solid at 3045MHz core clock the whole time - never skipped a beat. I am so glad it's NOT the GPU. Definitely seems to be the CableMod cable - wondering if the length of the cable has anything to do with it(?). I got the 1000mm version with the metallic shroud. The crashes occurred even with the shroud removed, so it most definitely is something about the cable itself - the connection seemed solid too (I checked multiple times).


----------



## BigMack70

Baasha said:


> It crashed on the desktop when there was NO OC. It also crashed when just browsing the web (Chrome). Most of the times, however, was during gaming. I don't think it's the VRAM tbh... I tested it with the Vulkan tool and it passed without errors at +1600Mhz.
> 
> 
> 
> Definitely the culprit for me as well - going to RMA it today. Plugged in the original adapter and cables and gamed last night for ~ 2.5 hours OC'ing the GPU with +100mV on the voltage, +1600Mhz on the VRAM, and +230 on the Core and was rock solid at 3045Mhz core clock the whole time - never skipped a beat. I am so glad it's NOT the GPU. Definitely seems to be the CableMod cable - wondering if the length of the cable has to do anything with it(?). I got the 1000mm version with the metallic shroud. The crashes occurred even when I removed the shroud so it most definitely is something to do with the cable itself - the connection seemed solid too (I checked multiple times).


The 12VHPWR connector is just a disaster on this gen of cards. My original adapter turned out to be what was causing all my crashes, and a CableMod cable is what fixed it for me. I spent almost a month trying to troubleshoot the PC because the adapter connection seemed solid, but nope...

I got the standard 27cm cable - CableMod Basics C-Series 12VHPWR PCI-e Cable for Corsair – CableMod Global Store


----------



## Krzych04650

Waterblock showed up nearly 2 weeks ahead of ETA


----------



## J7SC

Krzych04650 said:


> Waterblock showed up nearly 2 weeks ahead of ETA
> View attachment 2588345
> 
> View attachment 2588344
> View attachment 2588346
> 
> View attachment 2588343


...nice. Always a bit of a surprise once you strip those giant air coolers off a 4090; you're left with a much smaller PCB, and waterblocks don't need the overhang. FYI, maybe put soft thermal pads on either side of the 12VHPWR area on the PCB.


----------



## Krzych04650

Wow, the memory stability issues on water are definitely a thing. I just got an immediate system reset the second I launched the first game  Lowering by 100 seems fine for now, and that was with a 20C memory temp that only went up to 30 in-game, so that's a bit on the extreme end of low memory temp, but still.


----------



## Nizzen

Asus strix 4090 white incoming! 

90YV0ID2-M0NA00


----------



## J7SC

Krzych04650 said:


> Wow the memory stability issues on water are definitely a thing. I just got an immediate system reset the second I launched the first game  Lowering by 100 seems to be fine for now, and that was with 20C memory temp that only went up to 30 in the game, so that is a bit on the extreme end of low memory temp, but still.


...I went through something similar; depending on the app, stable memory (incl. ECC) can be higher by +80 to +120 on the MSI AB slider once it gets past 43 C or so. Also, I am beginning to think there are 'memory holes' even when warmed up properly.


----------



## Krzych04650

Looks like lower temps shave off around 30W of power draw and increase performance (OC settings are the same for both, except the watercooled one has +1250 on memory instead of +1350)
















Port Royal also went up from 28 880 to 29 069 despite lowering the memory overclock by 100.


----------



## MikeS3000

On a funny note, I think my wife is ready to kill me. I ran the OCCT GPU test on my 4090 at about 600w and also ran CB23 on my 13900k because I was curious what power I was drawing from the wall (my UPS has a display readout). So I hit 1100 W and... bam, the PC and all the lights in my office shut off. I tripped the breaker, and my UPS started beeping like crazy from over-current protection since it couldn't run my system at 1100w (the UPS is only rated for 780w). Apparently I need more amperage on that circuit in my house!


----------



## mirkendargen

MikeS3000 said:


> On a funny note I think my wife is ready to kill me. I ran OCCT GPU test on my 4090 at about 600w and also ran CB23 on my 13900k because I was curious what power I was drawing from the wall (my UPS has a display readout). So I hit 1100 W and .....Bam the PC and all of the lights in my office shut off. I blew the breaker switch and my UPS starts beeping like crazy because of an over current protection as it could not run my system at 1100w (UPS only rated for 780w). Need more amperage on that circuit in my house apparently!


Time to swap that breaker out if it's tripping on less than 10A! Unless you have a bunch of other stuff running on the circuit...
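The wall-power math, for reference: 1100 W at an assumed 120 V is about 9.2 A, and the usual NEC 80% continuous-load rule of thumb leaves only 12 A of sustained headroom on a 15 A circuit, so a benching PC plus lights and the UPS's own overhead can plausibly get there. Rough numbers only:

```python
def circuit_headroom(load_w, volts=120.0, breaker_a=15.0, continuous=0.8):
    """Amps drawn and remaining headroom under the 80% continuous-load rule."""
    amps = load_w / volts
    return amps, breaker_a * continuous - amps

amps, spare = circuit_headroom(1100)
print(round(amps, 2), round(spare, 2))  # ~9.17 A drawn, ~2.83 A to spare
```

A breaker tripping well below its rating, with little else on the circuit, is worth having an electrician look at either way.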


----------



## Wolverine2349

I am on a mission to find an RTX 4090 with as little coil whine as possible.

I have tried 4 different RTX 4090s, including 2 Gigabyte Gaming OCs, a Gigabyte Aorus Master, and a PNY XLR8.

Unfortunately, all had too much of a buzz-like noise that won't be drowned out by the game's audio in quiet areas of the game. In areas where the sound is louder it would mostly, if not entirely, be drowned out, but I'm looking for less whine than that.

I had a Gigabyte RTX 3090 Ti Gaming OC and it had almost no coil whine at all in any situation; it was so hushed and faint it was very hard to detect even if you looked for it, short of putting your ear right next to the card.

Sadly, that's not the case for the 4 RTX 4090s I have tried.

3 of them I sold, and I'm now left with a Gaming OC again, with the same issues.

Is there any setup, or any card, with no or extremely little coil whine??

Or is it just a reality that all 4090s have it more than just a little and there's no way around it??

It's hard to believe they couldn't make one without it, as Guru3D stated they had a hard time detecting it on many of the 3090 Tis they tested, and 3090 Ti power draw is very similar to factory RTX 4090 power draw, around 450 to 460 watts.

And yeah, I did try the power limit, and it does almost nothing unless I put it below 60%, which is an unacceptable gimping of performance. If I could almost eliminate the buzzing/whine at 80 to 85% power usage, great. But having to go below 60%, or even as low as 50%, is not good.

The strange thing is the whine is actually bad in high-end AAA games even at only 80 to 120 FPS. In the FurMark benchmark at 1080p with 8x MSAA, FPS is like 200 to 300 and there's basically no buzz or whine. Of course, FPS goes into the 700s with MSAA off in FurMark, and it starts to make a different kind of whine, which is okay since FPS never goes that high in real-world usage.

I have seen many reports saying Zotac and Gigabyte supposedly have less chance of it and are better. Though the Gigabytes I had all have too much.

And I have heard Asus is the worst, and MSI also has lots of complaints.

Though I have heard some reports of people insisting they have none even with their ear right by an MSI Suprim X. I wonder if the newer batch of cards from MSI is better. I know the Suprim X has the best VRM of all the cards; the Gaming X Trio has 8 fewer power stages, and only 50-amp ones at that.


----------



## WayWayUp

hmmm, no coil whine on my founders card but i wonder if that will change once I install the water block


----------



## Frosted racquet

Coil whine is a result of a specific PSU-to-GPU combination. You can't compare GPU coil whine between systems running different PSUs


----------



## 8472

Wolverine2349 said:


> I am on a mission to find an RTX 4090 with as little coil whine as possible.
>
> I have tried 4 different RTX 4090s, including 2 Gigabyte Gaming OCs, a Gigabyte Aorus Master, and a PNY XLR8.
>
> Unfortunately, all had too much of a buzz-like noise that won't be drowned out by the game's audio in quiet areas of the game. In areas where the sound is louder it would mostly, if not entirely, be drowned out, but I'm looking for less whine than that.
>
> I had a Gigabyte RTX 3090 Ti Gaming OC and it had almost no coil whine at all in any situation; it was so hushed and faint it was very hard to detect even if you looked for it, short of putting your ear right next to the card.
>
> Sadly, that's not the case for the 4 RTX 4090s I have tried.
>
> 3 of them I sold, and I'm now left with a Gaming OC again, with the same issues.
>
> Is there any setup, or any card, with no or extremely little coil whine??
>
> Or is it just a reality that all 4090s have it more than just a little and there's no way around it??
>
> It's hard to believe they couldn't make one without it, as Guru3D stated they had a hard time detecting it on many of the 3090 Tis they tested, and 3090 Ti power draw is very similar to factory RTX 4090 power draw, around 450 to 460 watts.
>
> And yeah, I did try the power limit, and it does almost nothing unless I put it below 60%, which is an unacceptable gimping of performance. If I could almost eliminate the buzzing/whine at 80 to 85% power usage, great. But having to go below 60%, or even as low as 50%, is not good.
>
> The strange thing is the whine is actually bad in high-end AAA games even at only 80 to 120 FPS. In the FurMark benchmark at 1080p with 8x MSAA, FPS is like 200 to 300 and there's basically no buzz or whine. Of course, FPS goes into the 700s with MSAA off in FurMark, and it starts to make a different kind of whine, which is okay since FPS never goes that high in real-world usage.
>
> I have seen many reports saying Zotac and Gigabyte supposedly have less chance of it and are better. Though the Gigabytes I had all have too much.
>
> And I have heard Asus is the worst, and MSI also has lots of complaints.
>
> Though I have heard some reports of people insisting they have none even with their ear right by an MSI Suprim X. I wonder if the newer batch of cards from MSI is better. I know the Suprim X has the best VRM of all the cards; the Gaming X Trio has 8 fewer power stages, and only 50-amp ones at that.


I'm really surprised that you've experienced it on multiple gaming OCs. Most owners swear that theirs doesn't have any. 

I honestly think that the reason that some people don't have it is because they either play games that don't make it a worst case scenario like cyberpunk does or they have headphones/fans that cover up the noise. 

It seems insane that it really is luck of the draw, and it's really frustrating having to constantly hunt down cards that might take forever to come back in stock. 

I personally don't buy the explanation that the PSU is to blame, there have been plenty of people that tried several power supplies from a variety of companies with the same results. 

I'm going to give Asus one more try and try to get a hold of a strix. If that whines I'll try the Suprim X.


----------



## Wolverine2349

Frosted racquet said:


> Coil whine is a result of a specific PSU-> GPU combination. You cannot compare GPU coil whine between systems running a different PSU



Do you think the motherboard would also play a role?

I have an MSI Z690 Unify-X and a Corsair HX1200 power supply. I have the 12VHPWR adapter for the Corsair, but it has no inline filtering caps, unlike the rest of the cables, so I wonder if I'm losing some of the excellent ripple suppression they provide.

I wonder if swapping to an Asus ROG Thor Platinum II, which has a 12VHPWR connector and excellent ripple suppression without filtering caps (Seasonic cables have none and still test with great ripple suppression anyway), would help. I think it is a rebranded Seasonic Prime Platinum PSU.


----------



## Frosted racquet

Wolverine2349 said:


> Do you think motherboard would also play a role?


Don't think so. Although the MoBo can also have coil whine depending on CPU load.


----------



## Wolverine2349

8472 said:


> I'm really surprised that you've experienced it on multiple gaming OCs. Most owners swear that theirs doesn't have any.
> 
> I honestly think that the reason that some people don't have it is because they either play games that don't make it a worst case scenario like cyberpunk does or they have headphones/fans that cover up the noise.
> 
> It seems insane that it really is luck of the draw, and it's really frustrating having to constantly hunt down cards that might take forever to come back in stock.
> 
> I personally don't buy the explanation that the PSU is to blame, there have been plenty of people that tried several power supplies from a variety of companies with the same results.
> 
> I'm going to give Asus one more try and try to get a hold of a strix. If that whines I'll try the Suprim X.



Are you trying to find a card with no coil whine? Do you have an Asus 4090 Strix with bad coil whine?

I have heard the MSI cards have it too, though there have been some people who swear theirs has none.

I have not heard nearly as many swear their Asus card has no whine, and I've heard reports of it being worse with Asus, even though MSI also has lots of complaints this gen.


----------



## mirkendargen

8472 said:


> I'm really surprised that you've experienced it on multiple gaming OCs. Most owners swear that theirs doesn't have any.
> 
> I honestly think that the reason that some people don't have it is because they either play games that don't make it a worst case scenario like cyberpunk does or they have headphones/fans that cover up the noise.
> 
> It seems insane that it really is luck of the draw, and it's really frustrating having to constantly hunt down cards that might take forever to come back in stock.
> 
> I personally don't buy the explanation that the PSU is to blame, there have been plenty of people that tried several power supplies from a variety of companies with the same results.
> 
> I'm going to give Asus one more try and try to get a hold of a strix. If that whines I'll try the Suprim X.


I think "coil whine" needs some caveats too. I don't think a GPU exists that I couldn't hear at high load....but I also have everything watercooled with pumps/rad/fans in another room, and truly silent super slow fans in my case. So yeah, I can hear literally anything. I can hear coil whine from my *motherboard.*

I absolutely *could not* hear coil whine on my GPU when the stock cooler was still on it. I think people with stock coolers, AIO's, rads/fans in their cases, etc. have a much higher floor for being able to hear coil whine, and may say their card has none. It's gonna depend on how loud your computer is in general.

Now I will say that any coil whine that can be heard over the stock cooler qualifies as bad coil whine, and you can definitely do better. Beyond that though....it depends.


----------



## Wolverine2349

Frosted racquet said:


> Don't think so. Although the MoBo can also have coil whine depending on CPU load.



Interesting you say that, because I noticed on my mobo that when running the Superposition benchmark at 1440p Custom Extreme in DirectX, if my CPU was at 8% load or lower, the video card had very little to almost no buzz.

But in a game, when CPU load exceeded 10%, there was whine. So I wondered: is it the motherboard making the card whine when trying to load both the CPU and the card at the same time?? Though if I stress only my CPU with Prime95 or Linpack Xtreme, there's no whine from the motherboard that I can hear.

So I wondered if maybe the dual load from the mobo caused the whine. Though probably not, because if I switched to OpenGL in the Superposition benchmark, CPU load was still under 10% and the GPU coil whine came back. It was all GPU, just volatile as to what triggers it, and sadly it happens with an annoying buzz in high-end games like GTA V and RDR2, even in RDR2 with settings maxed and 8x MSAA, which only allows 80 to 100 FPS at 1440p since MSAA is beyond taxing even for a 4090.


----------



## Wolverine2349

mirkendargen said:


> I think "coil whine" needs some caveats too. I don't think a GPU exists that I couldn't hear at high load....but I also have everything watercooled with pumps/rad/fans in another room, and truly silent super slow fans in my case. So yeah, I can hear literally anything. I can hear coil whine from my *motherboard.*
> 
> I absolutely *could not* hear coil whine on my GPU when the stock cooler was still on it. I think people with stock coolers, AIO's, rads/fans in their cases, etc. have a much higher floor for being able to hear coil whine, and may say their card has none. It's gonna depend on how loud your computer is in general.
> 
> Now I will say that any coil whine that can be heard over the stock cooler qualifies as bad coil whine, and you can definitely do better. Beyond that though....it depends.



Yeah, I would say any coil whine heard over the stock cooler is terrible. Though you have to consider that means the stock cooler running its fans at low, smooth RPM with minimal noise; if coil whine can be heard over that, that's bad. The fans shouldn't need to be loud to mask the coil whine, or the coil whine is way too much.


----------



## J7SC

mirkendargen said:


> I think "coil whine" needs some caveats too. I don't think a GPU exists that I couldn't hear at high load....but I also have everything watercooled with pumps/rad/fans in another room, and truly silent super slow fans in my case. So yeah, I can hear literally anything. I can hear coil whine from my *motherboard.*
> 
> I absolutely *could not* hear coil whine on my GPU when the stock cooler was still on it. I think people with stock coolers, AIO's, rads/fans in their cases, etc. have a much higher floor for being able to hear coil whine, and may say their card has none. It's gonna depend on how loud your computer is in general.
> 
> Now I will say that any coil whine that can be heard over the stock cooler qualifies as bad coil whine, and you can definitely do better. Beyond that though....it depends.


...no coil whine on my Giga-G-OC either, whether originally with the stock air cooler, or now with the water-block. As the system is whisper quiet now, I would definitely pick it up. I learned all about coil whine when I had a MSI 290 Lightning doing a lot of it 🥴.

As was already pointed out, other factors such as PSU and cabling can also be a culprit of / contribute to coil whine, depending on the combination of GPU, PSU etc.


----------



## Wolverine2349

J7SC said:


> ...no coil whine on my Giga-G-OC either, whether originally with the stock air cooler, or now with the water-block. As the system is whisper quiet now, I would definitely pick it up. I learned all about coil whine when I had a MSI 290 Lightning doing a lot of it 🥴.
> 
> As was already pointed out, other factors such as PSU and cabling can also be a culprit of / contribute to coil whine, depending on the combination of GPU, PSU etc.



What kind of cabling could contribute to coil whine, and what about the PSU? Would inline caps in the cables, or the motherboard, play a role?

Is a PSU with better native ripple suppression and no inline caps, like the Seasonic Prime-based units, maybe better than a Corsair HX1200, since they need no inline cable caps to get just as good ripple suppression?


----------



## 8472

Wolverine2349 said:


> Are you trying to find a card with no coil whine? Do you have an Asus 4090 Strix with bad coil whine?
> 
> I have heard the MSI cards have it too, though there have been some people who swear theirs has none.
> 
> I have not heard nearly as many swear their Asus card has no whine, and I've heard reports of it being worse with Asus, even though MSI also has lots of complaints this gen.


Yes, I'm trying to find a 4090 that has no noticeable coil whine at normal frame rates (60-120 FPS) and that isn't from a lower-tier company like Zotac, PNY, etc.

I've tried two Asus TUFs with horrible coil whine. The Strix seems to have fewer reports of it from owners compared to the TUF, though that could be influenced by the fact that more people have the TUF than the Strix. 

JayzTwoCents said his Strix had it, but der8auer said his Strix 4090 didn't, even though his 4080 Strix does. 

For MSI, it seems hit or miss as well. Some people have horrible coil whine and some swear they don't have any.


----------



## yzonker

Thought I was a little soft in SW yesterday. It likes mem so much that I can score higher at ambient, but with the 2x100w heaters on the backplate.









I scored 11 336 in Speed Way


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





Made a few Superposition 4k runs too. Might do better with the chiller, but probably not much since the mem is so much higher. Depends on what it likes the most. It likes some Galax though. 










And I re-ran with LOD +3 for the heck of it, although it's kinda stupid since it turns the benchmark into a blurry mess. But I suspect some of the runs in their database are run this way.


----------



## kryptonfly

I will soon have a Bykski block for my Gigabyte Gaming OC. What is the right pad thickness? And is there a way to keep the warranty sticker on the GPU screw intact, or maybe find the same sticker elsewhere?


----------



## J7SC

Wolverine2349 said:


> What kind of cabling could contribute to coil whine and PSU. Would inline caps in cables or the motherboard play a role.
> 
> Is a PSU with better ripple suppression with no inline caps like the SeaSonic Prime based maybe better than a Corsair HX1200 as they need no inline cable caps to get just as good of ripple suppression?


...there are too many different permutations and combinations of GPU, PSU, and cabling to consider, but that said, I am running a newish Seasonic Prime Platinum PX-1300 with the Giga-G-OC (no coil whine). Adding a CableMod 12VHPWR recently did not change that, either.


----------



## mirkendargen

kryptonfly said:


> I soon will have a Bykski block for my Gigabyte gaming OC, what is the perfect size for pads ? And is there a way to keep untouched the warranty sticker on the GPU screw or maybe find the same sticker elsewhere ?


Arctic TP-3 1.5mm worked great for me.

I didn't bother trying to save the sticker, but being extremely careful with a hairdryer and some tweezers might do the job.


----------



## bmagnien

yzonker said:


> Thought I was a little soft in SW yesterday. It likes mem so much that I can score higher at ambient, but with the 2x100w heaters on the backplate.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 336 in Speed Way
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Made a few Superposition 4k runs too. Might do better with the chiller, but probably not much since the mem is so much higher. Depends on what it likes the most. It likes some Galax though.
> 
> View attachment 2588380
> 
> 
> And I re-ran with LOD +3 for the heck of it, although kinda stupid since it turns the benchmark in to a blurry mess. But I suspect some of the runs in their database are run this way.
> 
> View attachment 2588381


Go for 40k 4K!


----------



## yzonker

bmagnien said:


> Go for 40k 4K!


Yea I considered running a bit more since I was so close but I'd already run SW a bunch of times. I'll run it again next time I have the loop cooled down with the chiller.


----------



## Wolverine2349

J7SC said:


> ...there are too many different permutations and combinations of GPU,PSU and cabling to consider, but that said, I am running a newish Seasonic Prime Platinum PX 1300 with the Giga-G-OC (no coil whine). Adding a CableMod 12VHPWR recently did not impact that, either.



Do you think using the included 12VHPWR adapter rather than a CableMod 12VHPWR cable direct to the PSU would make coil whine more likely, since it adds more complexity compared to a cleaner, more direct connection to the PSU?

And you state no coil whine. Can you put your ear close to it and hear no whine at all when the fans are at low speed and the GPU is under load in a high-end AAA game like RDR2, Cyberpunk, or GTA V?

What motherboard, CPU, and case do you have, and are your case fans quiet when the system is under full load?


----------



## mirkendargen

Wolverine2349 said:


> Do you think having the included extension cable 12vhpwr adapter rather than a cable mod 12vhpwr cable direct to PSU would make it more likely to have coil whine as it adds more complexity where it is a cleaner more direct connection to the PSU?
> 
> And you state no coil whine. Can you put your ear close to it and not hear any whine when the fans are at low speed and GPU under load in a high end AAA game like RDR2 or Cyber Punk or GTAV
> 
> What kind of motherboard and CPU and case do you have and are your case fans quiet when system under full load?


I have a hard time believing someone with a watercooled card and all their fans off couldn't hear *some* kind of buzz/noise under load. It's just a fact of life with those kinds of electrical loads. The question is just whether it's a faint buzz/hum or a loud clicky screech.


----------



## J7SC

Wolverine2349 said:


> Do you think having the included extension cable 12vhpwr adapter rather than a cable mod 12vhpwr cable direct to PSU would make it more likely to have coil whine as it adds more complexity where it is a cleaner more direct connection to the PSU?
> 
> And you state no coil whine. Can you put your ear close to it and not hear any whine when the fans are at low speed and GPU under load in a high end AAA game like RDR2 or Cyber Punk or GTAV
> 
> What kind of motherboard and CPU and case do you have and are your case fans quiet when system under full load?


...I didn't have coil whine with either the included 4-into-1 dongle or the single 12VHPWR CableMod cable. The case is the TT Core P8 below (lower right is the 4090 setup, which includes a 5950X on an Asus X570 Dark Hero). The case has plenty of venting on the sides and neither system has coil whine, so no stereo pitch (thankfully). I often run heavy loads and high fps (it showed 660W+ the other day).










@Nizzen ...looking forward to seeing your white Asus 4090 Strix on the white Z790 Apex. If all else fails, you can go for the white Galax HoF OCL; heater on the back optional ☺ ...the Galax XOC vbios that has been making the rounds lately may very well have only half of what the vbios for this one has, according to rumour.


----------



## Wolverine2349

mirkendargen said:


> I have a hard time believing someone with a watercooled card and all their fans off couldn't hear *some* kind of buzz/noise under load. It's just a fact of life with those kinds of electrical loads. The question is just whether it's a faint buzz/hum or a loud clicky screech.



Are you saying it is a fact that anything with that kind of electrical load will whine??

One thing is I always thought electronics were silent, that it was motors and fans that made noise, and that electrical whine only came from insanely high draw like in industrial settings or the power box of a large building.

I mean, a toaster makes almost no noise while running, and the coils generating that heat draw nearly 1000 watts from the AC outlet. I remember almost 13 years ago, in early 2010, plugging one into a UPS to check, and it read around 900 watts.


----------



## yzonker

My TUF is audible at high framerates, but I never hear it while gaming (using desktop speakers at moderate levels), other than maybe in a menu with no sound. It is noticeable benchmarking (with the side panel off) if I don't have my fans turned up. I would say it's slightly louder than my 3080ti FTW3 (which is very quiet) but quieter than my Zotac 3090. My machine sits on my desk 2-3 feet away. All cards are blocked with a similar loop to @J7SC. Fans don't normally run over 1000 RPM.

This is so subjective though. It's tough to really quantify without actual measurements. I suspect it's hardly loud enough to be detectable by a sound meter (maybe a really good one could pick it up).

Even though I'm getting a bit older, I can still hear higher pitch sounds very clearly FWIW.


----------



## Wolverine2349

J7SC said:


> ...I didn't have coil whine with either the included 4-into1 dongle, or the 12VHPWR CableMod single cable. The case is TT Core P8 below (lower right is the 4090 setup that includes 5950X on Asus X570 DarkHero). The case has plenty of venting on the sides and neither system has coil whine, so no stereo pitch (thankfully).
> View attachment 2588411
> 
> 
> 
> @Nizzen ...looking forward to see your 'white' Asus 4090 Strix on the white Z790 Apex. If all else fails, you can go for the white Galax HoF OCL; heater on the back optional ☺ ...the Galax XOC vbios that has been making the rounds lately may very well have half of what the vbios for this one has, according to rumour.



Thanks for the info.

Interesting: another poster, Chispy over at the TechPowerUp forums, says they have an Asus Z790 Apex with a 13700K and an MSI Suprim X 4090 with zero whine, even with their ear right up to it. They have an EVGA P2 1200 watt PSU.

Common theme in both: high-end Asus motherboards, yours being the X570 Dark Hero and the other a Z790 Apex.

I have an MSI Z690 Unify X. Maybe Asus motherboards are better quality and mitigate electrical noise and thus video card whine/buzz better than other high end motherboards?

Though go figure: their motherboards are the best at mitigating it, but their actual video cards are the worst at emitting it?


----------



## yzonker

Well I'll sit back and watch people swap mobos to try to fix coil whine. Lol.


----------



## J7SC

Wolverine2349 said:


> Thanks for the info.
> 
> Interesting another poster Chispy over at Tech Power Up forums says they have an Asus Z790 Apex with 13700K and an MSI Suprim X 4090 with 0 whine even their ear right up to it. They have an eVGA P2 1200 watt PSU.
> 
> Common theme in both: high-end Asus motherboards, yours being the X570 Dark Hero and the other a Z790 Apex.
> 
> I have an MSI Z690 Unify X. Maybe Asus motherboards are better quality and mitigate electrical noise and thus video card whine/buzz better than other high end motherboards?
> 
> Though go figure: their motherboards are the best at mitigating it, but their actual video cards are the worst at emitting it?


...I never tried any of these fixes since I don't have the coil whine issue, but some folks swear by these kinds of things.


----------



## Laithan

MikeS3000 said:


> On a funny note I think my wife is ready to kill me. I ran OCCT GPU test on my 4090 at about 600w and also ran CB23 on my 13900k because I was curious what power I was drawing from the wall (my UPS has a display readout). So I hit 1100 W and .....Bam the PC and all of the lights in my office shut off. I blew the breaker switch and my UPS starts beeping like crazy because of an over current protection as it could not run my system at 1100w (UPS only rated for 780w). Need more amperage on that circuit in my house apparently!


I realize that sometimes these things are not possible, but here is what I do, along with some advice.

A battery backup on a modern high-powered PC is just a bad idea unless you have the money to size it properly. The ones that can properly handle 1500+ watts are priced in the *thousands*... Don't be fooled by a "1500VA" APC UPS; that size is affordable but is actually only rated for around 865 watts. Anything higher is just big bucks. Our PCs aren't portable, so they don't need a battery. I use a Tripp Lite ISOBAR 6 Ultra for each circuit with PC gear. It is a very good quality surge protector and all you need, since the PSU does the rest.

You should treat your PC as you would a space heater or an air conditioner. Never use extension cords to power your main tower, and never use any other device on that circuit at the same time. My main tower is always on its own dedicated circuit. In the USA, this gives me 120V @ 15A, a maximum of 1800W. My power supply is 1600W, so even if I were to somehow draw 1600W (or spike higher), I should still be within spec of the power source. Get a dedicated circuit for peace of mind, and it will be there for you forever no matter what you throw at it.

Your *2nd* circuit is for everything else: monitors, peripherals, USB hubs, printers, etc. You'll likely never have an issue even if something else is on that circuit. If the distance is too far, you can use a heavy-duty 16AWG extension cord. Those devices never draw much, but I still use a heavier-gauge wire for the extension because, simplified, it's the cable's length that determines the need for a heavier gauge, not so much the load.
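The circuit math above can be sketched quickly. One caveat worth adding: the US NEC also derates *continuous* loads (3+ hours) to 80% of the breaker rating, so the usable headroom on a 15A circuit is a bit less than the 1800W trip limit. The function below is just that arithmetic; the example wattages are illustrative, not measurements.

```python
# Rough branch-circuit headroom check for the dedicated-circuit advice above.
# Assumes a US 120 V / 15 A circuit and the NEC 80% continuous-load derating.

def circuit_headroom_w(volts: float, breaker_amps: float,
                       continuous: bool = True) -> float:
    """Return usable watts on a branch circuit.

    The breaker trips around volts * amps, but loads running for hours
    ("continuous" in NEC terms) should stay under 80% of that.
    """
    limit = volts * breaker_amps
    return limit * 0.8 if continuous else limit

usable = circuit_headroom_w(120, 15)                    # 1440.0 W sustained
peak = circuit_headroom_w(120, 15, continuous=False)    # 1800.0 W trip limit

# A ~600 W GPU plus a ~300 W CPU/system fits comfortably on a dedicated
# 15 A circuit; a 1600 W PSU running flat out would exceed the sustained
# rating even though it sits under the 1800 W trip limit.
print(usable, peak)
```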




Wolverine2349 said:


> I am on a mission to find an RTX 4090 with as little to no coil whine as possible
> 
> I have tried 4 different RTX 4090 including 2 Gigabyte Gaming OCs and a Gigabyte Aorus Master and a PNY XLR8
> 
> Unfortunately all had too much a buzz like noise that will not be drowned out by the sound of game in a quiet area of the game. Though in areas of the game where sound is louder it would mostly if not all be drowned out, though looking for less whine than that.


One word: Headphones 

FWIW: My MSI Suprim Liquid X driven by my Corsair AX1600i doesn't produce any audible coil whine - not that I went looking for it either but I'm tweaking it on a test bench literally 2 feet away and so far haven't heard any. I always though it was just luck of the draw (silicon lottery) not so much board design but I'm not an expert as to how variable conditions affect it. Perhaps the PSU used and number of phases does contribute IDK...


----------



## Wolverine2349

Laithan said:


> I reailze that sometimes these things are not possible but here is what I do with some advice.
> 
> A battery backup on a modern high-powered PC is just a bad idea unless you have the money to size it properly. The ones that can properly handle 1500+ watts are priced in the *thousands*... Don't be fooled by a "1500VA" APC UPS; that size is affordable but is actually only rated for around 865 watts. Anything higher is just big bucks. Our PCs aren't portable, so they don't need a battery. I use a Tripp Lite ISOBAR 6 Ultra for each circuit with PC gear. It is a very good quality surge protector and all you need, since the PSU does the rest.
> 
> You should treat your PC as you would a space heater or an air conditioner. Never use extension cords to power your main tower.. never use any other device on that circuit at the same time. My main tower is always on its own dedicated circuit. In the USA, this gives me 120V @ 15A which is a maximum limit of 1800W. My power supply is 1600W so even if I were to somehow draw 1600W (or spike higher) I should still be within spec of the power source. Get a dedicated circuit and have peace of mind and then it is there for you forever no matter what you throw at it.
> 
> Your *2nd* circuit is for everything else: monitors, peripherals, USB hubs, printers, etc. You'll likely never have an issue even if something else is on that circuit. If the distance is too far, you can use a heavy-duty 16AWG extension cord. Those devices never draw much, but I still use a heavier-gauge wire for the extension because, simplified, it's the cable's length that determines the need for a heavier gauge, not so much the load.
> 
> 
> 
> 
> One word: Headphones
> 
> FWIW: My MSI Suprim Liquid X driven by my Corsair AX1600i doesn't produce any audible coil whine - not that I went looking for it either but I'm tweaking it on a test bench literally 2 feet away and so far haven't heard any. I always though it was just luck of the draw (silicon lottery) not so much board design but I'm not an expert as to how variable conditions affect it. Perhaps the PSU used and number of phases does contribute IDK...



What motherboard is your Suprim X RTX 4090 in?


----------



## Laithan

Wolverine2349 said:


> What motherboard is your Suprim X RTX 4090 in?


ASUS Z790 Rampage Extreme


----------



## Wolverine2349

Laithan said:


> ASUS Z790 Rampage Extreme



Another Asus board. I really wonder if it is true that Asus motherboards mitigate GPU coil whine better. The common theme so far: those with no coil whine on their RTX 4090s are all using some type of Asus motherboard, at least among the ones I asked (3 so far: 2 here and one on TechPowerUp who stated their motherboard, and it is an Asus).

Just a coincidence or is there something to it?


----------



## mirkendargen

Wolverine2349 said:


> Are you saying it is a fact that anything with that kind of electrical load will whine??
> 
> One thing is I always thought electronics were silent, that it was motors and fans that made noise, and that electrical whine only came from insanely high draw like in industrial settings or the power box of a large building.
> 
> I mean, a toaster makes almost no noise while running, and the coils generating that heat draw nearly 1000 watts from the AC outlet. I remember almost 13 years ago, in early 2010, plugging one into a UPS to check, and it read around 900 watts.


Not necessarily a full-on whine; it's normally a very faint static-y buzz. You can hear it from PSUs, power bricks, UPSes, etc. If you put your ear to a laptop power brick you'll hear what sounds like a mouse inside it, but it won't be audible without your ear to it unless it's badly malfunctioning (I did have a laptop dock power brick that made a headache-inducing ultrasonic whine any time it was plugged in). Transformers outside, like you say, are the same thing. A toaster doesn't have any kind of voltage conversion, so there's nothing to whine; it's just straight resistive coils. They can sometimes vibrate too as they heat up, but that's a different kind of sound.


----------



## J7SC

mirkendargen said:


> (...)
> If you put your ear to a laptop power brick you'll hear what sounds like a mouse inside it


You know, in all my years, it has never occurred to me once to put my ear to a laptop power brick to hear the sounds of a mouse inside it. I wonder what else I missed out on...


----------



## mirkendargen

J7SC said:


> You know, in all my years, it has never occurred to me once to put my ear to a laptop power brick to hear the sounds of a mouse inside it. I wonder what else I missed out on...


The things you find when there's an annoying sound that's too high pitch to tell where it's coming from in your office and you're crawling around on the floor...


----------



## Patay Roland

Is it safe to flash Galax BIOS on Inno3d x3 oc?


----------



## J7SC

mirkendargen said:


> The things you find when there's an annoying sound that's too high pitch to tell where it's coming from in your office and you're crawling around on the floor...


There can be a lot of annoying high-pitch noises in an office (not naming names...)...I'll either go with @Laithan 's suggestion of headphones, or my trusty old tinfoil hat.


----------



## Baasha

BigMack70 said:


> The 12VHPWR connector is just a disaster on this gen of cards. My original adapter turned out to be what was causing all my crashes, and a CableMod cable is what fixed it for me. I spent almost a month trying to troubleshoot the PC because the adapter connection seemed solid, but nope...
> 
> I got the standard 27cm cable - CableMod Basics C-Series 12VHPWR PCI-e Cable for Corsair – CableMod Global Store


Interesting... yea for me it was the CableMod cable - it's definitely irritating that Nvidia dropped the ball on the connector/adapter on this new GPU. I'm just glad it wasn't the GPU itself.


----------



## 8472

der8auer used an Asus MB when he had the loud coil whine on his Strix 4080. There are also a couple of folks on YouTube who have coil whine with their cards paired with an Asus MB.


----------



## Wolverine2349

8472 said:


> Derbauer used an Asus MB when he had the loud coil whine on his strix 4080. There are also a couple of folks on YouTube that have coil whine with their cards paired with an Asus MB.



Yeah, true, though it seems Asus cards are just bad, period. Anyone with a non-Asus RTX 4090 with modest or worse coil whine on a high-end Asus motherboard?

I wonder what determines this lottery? All the components are manufactured the same, and it's inductors, not silicon, that can cause the coil whine, right? Is it an inductor lottery? Do all inductors have different tolerances for frequency vibrations?


----------



## KingEngineRevUp

newls1 said:


> Agreed.... MAn Games on my +1450- +1500 mem OC were CTD and I kept thinking it was my GPU Core OC, NOPE... I think its temp related tho and this might be one of the times where my WB is working against me. Seems I can be just fine if I get mem temps over 50c-ish but below that temp, seems +1410 is where shes happy. Mostly my mem temp is in the 25-30c range tho, so maybe when summer hits, I can increase my OC (never thought id ever say that and is complete backwards!)


Same story here. Even with +1400, there's a possibility you might crash when your system is idling and your memory drops down to ambient temperatures.

My memory dropped to 22C just now and it green-screened again, and I'm at +1350. I had to drop from +1500.


----------



## KingEngineRevUp

It's starting to feel like we need to write our own boost algorithm for memory clock versus temperature.

Something like: apply a given memory overclock only once the memory reaches a certain temperature, and back the overclock off if it drops below a threshold of our choosing.

Is this possible? Can we write a DLL or something to be run by MSI Afterburner?
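The control logic itself is simple enough to sketch. The hard part is the plumbing, not the algorithm: standard NVML/nvidia-smi doesn't expose GDDR6X junction temperature or memory clock offsets, so `read_mem_temp_c()` and `apply_mem_offset()` would be hypothetical hooks into whatever tool can actually do those things. The hysteresis band below (and all the numbers) are illustrative assumptions:

```python
# Sketch of a temperature-gated memory overclock with hysteresis.
# The two thresholds are deliberately separated so the offset doesn't
# flap when the temperature hovers near a single cutoff.

ON_TEMP_C = 40.0     # switch to the big offset above this temp
OFF_TEMP_C = 30.0    # fall back to the safe offset below this temp
HIGH_OFFSET = 1400   # MHz offset stable only when the memory is warm
SAFE_OFFSET = 1000   # MHz offset stable even at ambient/idle temps

def next_offset(temp_c: float, current_offset: int) -> int:
    """Return the offset to apply given the current memory temperature.

    Above ON_TEMP_C: use the high offset. Below OFF_TEMP_C: use the
    safe offset. In the dead band between them: hold the current value.
    """
    if temp_c >= ON_TEMP_C:
        return HIGH_OFFSET
    if temp_c <= OFF_TEMP_C:
        return SAFE_OFFSET
    return current_offset
```

A real daemon would poll `read_mem_temp_c()` every few seconds and call `apply_mem_offset(next_offset(...))` only when the value changes.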


----------



## elbramso

yzonker said:


> Thought I was a little soft in SW yesterday. It likes mem so much that I can score higher at ambient, but with the 2x100w heaters on the backplate.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 336 in Speed Way
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Made a few Superposition 4k runs too. Might do better with the chiller, but probably not much since the mem is so much higher. Depends on what it likes the most. It likes some Galax though.
> 
> View attachment 2588380
> 
> 
> And I re-ran with LOD +3 for the heck of it, although kinda stupid since it turns the benchmark in to a blurry mess. But I suspect some of the runs in their database are run this way.
> 
> View attachment 2588381


What's the mem offset in that Speed Way run?
It's SOOO crazy^^ Yesterday I was able to finish a Speed Way run at 3180 MHz start to finish with +1600 on the mem, but I still have fewer points...
I still would like to know why the Galax BIOS is so much better in benchmarks.
And Superposition does not favor mem and warm temps? Maybe I should go for that.


----------



## Betroz

Nizzen said:


> Long extention cable isn't the best, that's why I'm using the orginal Seasonic cable that is connected directly to the psu. That cable is more heavy than nvidia cable too. It looks quality too


Where did you buy it from? Last time I checked it wasn't for sale here in Norway where we live.


----------



## foresttree1

Did anyone notice any major score bump upgrading from driver v31.0.15.2698 to v31.0.15.2756? I saw a significant increase in all 3DMark scores: Port Royal went from 28000 to 28700 and Speed Way from 11000 to 11250. I did do a clean OS install between the drivers, but all clock settings were the same for both driver runs. I am just trying to see if it was the driver or something else that caused the increase in scores.


----------



## J7SC

foresttree1 said:


> Did anyone notice any major score bump from upgrading from driver v31.0.15.2698 to driver v31.0.15.2756 . I saw significant increase in all 3DMark scores. Port royal went from 28000 to 28700 and speedway went from 11000 to 11250. I did do a clean install OS install between the drivers. All clock setting was the same in both driver run. I am just trying to see if it was the driver or something else that cause the increase in scores


...yup, it bumped scores not only for 4090s but also 3090s etc. in Port Royal and Speedway and perhaps other apps. Also, you might want to try to bump your VRAM speed by one or two bins with the new driver as it appears to allow for a bit more memory speed.


----------



## newls1

J7SC said:


> ...yup, it bumped scores not only for 4090s but also 3090s etc. in Port Royal and Speedway and perhaps other apps. Also, you might want to try to bump your VRAM speed by one or two bins with the new driver as it appears to allow for a bit more memory speed.


Hmmm, I had no idea about this driver's performance. Will do this tonight. Thank you for the info.


----------



## foresttree1

J7SC said:


> ...yup, it bumped scores not only for 4090s but also 3090s etc. in Port Royal and Speedway and perhaps other apps. Also, you might want to try to bump your VRAM speed by one or two bins with the new driver as it appears to allow for a bit more memory speed.


Cool. I am just happy my Port Royal score seems more in line with what it should be. I was struggling to get past 28000, and now it increased by 700 points. I will test out the RAM bins. I wonder if there is any core clock improvement too.


----------



## Nizzen

Betroz said:


> Where did you buy it from? Last time I checked it wasn't for sale here in Norway where we live.


Got it for free from Seasonic.


----------



## newls1

foresttree1 said:


> Cool. I am just happy my port royal score seems more inline with what it should be. I was struggling to go past 28000. Now it increased by 750 point. Will test out the ram bins. I wonder if there is any core clock improvement too.


I had no idea a driver could affect RAM OC... that's interesting. Can you report back with your findings please?


----------



## yzonker

J7SC said:


> ...yup, it bumped scores not only for 4090s but also 3090s etc. in Port Royal and Speedway and perhaps other apps. Also, you might want to try to bump your VRAM speed by one or two bins with the new driver as it appears to allow for a bit more memory speed.


Huh, I didn't see that in SW. The biggest difference is the mem, although core clocks ended up about the same (indicated clock anyway) despite the higher temp. So maybe there is something different. I'm certain I ran a lower offset too, which is even more odd.









www.3dmark.com


----------



## yzonker

elbramso said:


> What's that mem offset in this Speedway run?
> It's SOOO crazy^^ Yesterday I was able to finish a Speedway run with 3180mhz start to finish and 1600+ on the mem but still I've less points...
> I still would like to know why the Galax BIOS is so much better in benchmarks.
> And Superposition does not favor mem and warm temps? Maybe I should go for this


It was +1825. You can figure it out from the 3DMark page: (1541 - 1313) * 8 = 1824 (it was set to +1825, though).

Never saw any artifacts in SW if anyone is wondering. Going much at all above +1825 just resulted in a crash. 

I did get one artifacted Superposition run though. I just stopped it before it finished.

I'm not sure about Superposition though in regards to core vs mem. I only ran it once before on the 4090, back when I first got it. Probably prefers core given how much it gained with the Galax bios.
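The back-of-the-envelope offset math above, as a one-liner. The factor of 8 relating the Afterburner offset domain to the clock 3DMark reports is taken from the post itself, not from any NVIDIA documentation, so treat it as a rule of thumb for this card/BIOS combination:

```python
# Recover the Afterburner memory offset from two 3DMark-reported clocks,
# per the (reported_oc - reported_stock) * 8 rule of thumb quoted above.

def afterburner_offset(reported_stock_mhz: int, reported_oc_mhz: int) -> int:
    return (reported_oc_mhz - reported_stock_mhz) * 8

print(afterburner_offset(1313, 1541))  # 1824, i.e. the +1825 setting, rounded
```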


----------



## foresttree1

newls1 said:


> I had no idea a driver could effect ram OC.. thats interesting. Can you report back with your findings please


Sure. I will do some more testing when I get back.


----------



## shiokarai

Wolverine2349 said:


> I am on a mission to find an RTX 4090 with as little to no coil whine as possible
> 
> I have tried 4 different RTX 4090 including 2 Gigabyte Gaming OCs and a Gigabyte Aorus Master and a PNY XLR8
> 
> Unfortunately all had too much a buzz like noise that will not be drowned out by the sound of game in a quiet area of the game. Though in areas of the game where sound is louder it would mostly if not all be drowned out, though looking for less whine than that.
> 
> I had a GeForce RTX Gaming OC 3090 Ti and it had almost no coil whine at all in any situation and it was so hush and faint and very hard to detect even if you looked for it without putting your ear right next to the card
> 
> Sadly not the case for the 4 RTX 4090s I have tried.
> 
> 3 of the I sold and now left with a Gaming OC again with the same issues.
> 
> Is there any setup or any ones with no or extremely little coil whine??
> 
> Or is it just a reality that all 4090s have it more than just a little and no way around it??
> 
> Its hard to believe they could not have any without them as many 3090 Tis tested by Guru3D stated they had a hard time detecting it on those cards and the 3090 Ti power draw is very similar to factory RTX 4090 power draw around 450 to 460 watts,
> 
> And yeah I did try power limit and it does almost nothing unless I put it below 60% which is unacceptable gimping of performance. If I could almost eliminate the buzzing/whine at 80 to 85% power usage, great. But having to go below 60% or even as low as 50% is not good.
> 
> Strange thing is the whine is actually bad in high-end AAA games even at only 80 to 120 FPS. In the Furmark benchmark at 1080p with 8x MSAA, FPS is like 200 to 300 and there's basically no buzz or whine. Of course FPS goes into the 700s with MSAA off in Furmark, and then it starts making a different kind of whine, which is OK since FPS never goes that high in real-world usage.
> 
> I have seen many reports that say Zotac, Gigabyte supposedly have less chance of it and are better. Though the Gigabytes I had all have too much.
> 
> And have heard Asus is worst and MSI also has lots of complaints.
> 
> Though I have heard some reports of people insisting they have none even with their ear right by an MSI Suprim X. I wonder is the newer batch of cards from MSI better. I know Suprim X has the best VRM of all cards. The Gaming X Trio has 8 less power stages and only 50 amps at that.


My Zotac AMP Extreme AIRO has virtually no coil whine, it's basically silent.


----------



## elbramso

yzonker said:


> It was +1825. You can figure it out from the 3DMark page. (1541-1313)*8 = 1824 (it was set to +1825 though)
> 
> Never saw any artifacts in SW if anyone is wondering. Going much at all above +1825 just resulted in a crash.
> 
> I did get one artifacted Superposition run though. I just stopped it before it finished.
> 
> I'm not sure about Superposition though in regards to core vs mem. I only ran it once before on the 4090, back when I first got it. Probably prefers core given how much it gained with the Galax bios.


thanks that was helpful 

I'm gonna be in the top10 Speedway HOF by the end of today 4 sure^^ I'm still on the old driver oO


----------



## Glottis

Wolverine2349 said:


> Yeah true though it seems Asus cards are just bad period. Any with non Asus RTX 4090 cards with modest or worse coil whine on high end Asus motherboards?
> 
> I wonder what determines all this lottery?? I mean all the components are manufactured the same and it is inductors not silicon that can cause the coil whine?? Is it the inductor lottery? Do all inductors have different tolerances for frequency vibrations?


Most likely explanation is that AIBs have different suppliers for components. One card has crappy power delivery components that still "meet the specs" while another has good ones. This practice is widespread across the industry; it happens in consoles, phones, TVs, etc.

Shifting blame onto the PSU, the mobo, the power company, "coil whine gets better over time", etc. is all red herrings dreamt up by companies so customers run out of the return period while desperately trying to fix the issue themselves.


----------



## simonabamber

I have an EK 4090 block (no ABP) and my die-to-hotspot delta is about 13 degrees, with memory around the same temp as the hotspot, give or take. Is it worth repasting/repadding? I used the stock EK pads and paste, but I'm not sure if I'm leaving some thermal performance on the table?


----------



## shiokarai

simonabamber said:


> I have a EK 4090 block (no ABP) and my Die-to Hotspot delta is about 13 degrees, with memory being around same temp as the hotspot give or take - is it worth repasting/repadding? I used the stock EK pads and paste but I'm not sure if I'm leaving some thermal performance on the table?


Is it worth the extra 15-30 MHz?


----------



## MrTOOSHORT

simonabamber said:


> I have a EK 4090 block (no ABP) and my Die-to Hotspot delta is about 13 degrees, with memory being around same temp as the hotspot give or take - is it worth repasting/repadding? I used the stock EK pads and paste but I'm not sure if I'm leaving some thermal performance on the table?


I've read that up to a 15°C GPU-to-hotspot delta is fine. I'd just leave it.


----------



## Edge0fsanity

Glottis said:


> Most likely explanation is AIBs have different suppliers for components. One card has crappy power delivery components that still "meet the specs" while other has good ones. This practice is widely used across the industry, it happens in consoles, phones, TVs etc.
> 
> Shifting blame on PSU, Mobo, power companies, "coil whine gets better over time", etc. are all red herrings dreamt up by companies so customers run out of return period while desperately trying to fix the issue themselves.


Yep, exactly this.

My Asus TUF has bad coil whine; I can hear it over the fans at full speed. It was tested on an Aorus X570 Xtreme mobo with an EVGA 1200W P2 PSU and on an Asus Z790 Maximus Extreme with a Super Flower 1600W Platinum PSU. No change. It's going into an open-frame case with a waterblock and enough rad and pumps for a silent build at any load. The coil whine will drive me nuts, so I'll have to play GPU roulette once stock normalizes and sell this card. I think if AIBs see enough returns over this issue, they'll make an effort to fix it so it's less common.


----------



## MrTOOSHORT

4 PCs in the house, no coil whine on any GPU. For as long as I can remember, no coil whine on my GPUs except one: a Titan X that had no coil whine stock. I installed an EK block and nickel backplate, and all of a sudden, coil whine. I had to add plastic washers when reinstalling the backplate, and then no more coil whine.

I believe it's a dirty power or grounding issue in a lot of cases, depending on where people live.


----------



## Nd4spdvn

MrTOOSHORT said:


> I've read up to 15'C delta gpu to hotspot is fine. I'd just leave it.


I noticed that this GPU-to-hotspot delta actually varies with how much of the silicon is loaded. A fully loaded die, with RT, tensor, raster, and the memory controller all active, will show a higher delta, which can indeed be up to 15C, whereas a lighter load using just the raster units usually stays below 10C. This is easy to see with Unigine Heaven 4.0, where the core is 100% loaded on the raster side but the delta is like 4-6C, versus Cyberpunk 2077, where more of the silicon is stressed with RT and DLSS and the delta is higher, like 8-11C, at least in my particular 4090 sample (Suprim X, air).


----------



## elbramso

Well guys, I couldn't do it... I'm 18 points short of the Top 10 in Speed Way.
I did not try cold water + a heat gun on the mems yet, but if that's what this bench season requires, then... well^^
And before someone asks, I did not upload my scores yet.


----------



## MrTOOSHORT

elbramso said:


> Well guys, I couldn't do it... I'm 18 points short to the Top10 in Speedway.
> I did not try cold water + heat gun on the mems yet - but if this is necessary this bench-season than... well^^
> And before someone asks, I did not upload my scores yet


11th is a proud spot to be in anyway. Good job.

My card is very average. It's a lot tougher for me to get good scores in 3DMark.


----------



## WayWayUp

delete


----------



## Edge0fsanity

MrTOOSHORT said:


> 4 PCs in the house, no coil whine on any gpu. Since I can remember, no coil whine on my gpus except one. Titan X card stock had no coil whine. I installed an EK block and nickel back plate. All of a sudden coil whine. I had to put plastic washers when reinstalling the backplate, no more coil whine.
> 
> I believe it's a dirty power or grounding issue in a lot of cases where people live.


It's not. I had a 3090 KPE, a couple of 3090 Strix, and three 3090 FTW3s last gen that were run in this same system. Zero coil whine. I've had other cards with various levels of coil whine over the years, the worst of which was a launch 2080 Ti FE: the loudest coil whine I've ever heard. Returned it for another, and zero issues. I really don't know where you guys come up with this dirty-ground or other system-issue crap. Totally false and unfounded.


----------



## Wolverine2349

shiokarai said:


> My Zotac AMP Extreme AIRO has virtually no coil whine, it's basically silent.



Which motherboard, PSU, and CPU are you using? And how are the fans on the Zotac AMP Extreme AIRO? Are they OK, or does the motor make a bad noise when they spin, like some have said?


----------



## Wolverine2349

Glottis said:


> Most likely explanation is AIBs have different suppliers for components. One card has crappy power delivery components that still "meet the specs" while other has good ones. This practice is widely used across the industry, it happens in consoles, phones, TVs etc.
> 
> Shifting blame on PSU, Mobo, power companies, "coil whine gets better over time", etc. are all red herrings dreamt up by companies so customers run out of return period while desperately trying to fix the issue themselves.



Yeah, though how come Asus Strix cards have such bad whine if they have very high-end power delivery components? Or are the power delivery parts high end but other areas cheap, hence the whine?

And do you think the manufacturer of a given card can change components without us knowing, parts that provide the same power delivery but vibrate more and thus whine, because they can get them cheaper?


----------



## MrTOOSHORT

Wolverine2349 said:


> Yeah though how come Asus Strix cards have such bad whine if they have componenets that are very high end for power delivery? Or are they for power delviery but cheap in other areas thus whine?
> 
> And do you think the same manufacturers for same card can change components without us knowing that provide same power delivery, but have more vibrations and thus whine as they get them cheaper.


Also, throw in the 2080 Ti FE, which, as *Edge0fsanity* mentions, had the worst coil whine of his past cards even though that card was overbuilt.

You went through four 4090s, right? Logically, the problem is somewhere else.


----------



## Krzych04650

8472 said:


> I'm really surprised that you've experienced it on multiple gaming OCs. Most owners swear that theirs doesn't have any.
> 
> I honestly think that the reason that some people don't have it is because they either play games that don't make it a worst case scenario like cyberpunk does or they have headphones/fans that cover up the noise.
> 
> It seems insane that it really is luck of the draw, and it's really frustrating having to constantly hunt down cards that might take forever to come back in stock.
> 
> I personally don't buy the explanation that the PSU is to blame, there have been plenty of people that tried several power supplies from a variety of companies with the same results.
> 
> I'm going to give Asus one more try and try to get a hold of a strix. If that whines I'll try the Suprim X.


The number of variables here is huge. First of all, sensitivity to this kind of low-pitched whine varies a lot from person to person, which already complicates things. Then, just like with displays, the surrounding environment can have a big impact on how things are perceived. Different kinds of games and loads also cause different kinds of coil whine. This makes the situation so complex that drawing any kind of reliable conclusion is impossible.

My Inno3D has the least whine I've ever seen, and I was very impressed by it, but that doesn't hold once you start pushing the card; it gets a lot worse than stock. And from what I am seeing, this is mostly down to voltage, not necessarily power. In games that do scale up to 600W you can hear a massive difference when switching from the 600W to the 450W and then the 350W profile: 600W has big whine, 450W light whine, and 350W no whine. But there are games with unusually low power draw where even a full 1100mV locked OC draws only around 350W, and the whine is just as bad as at 600W. What that power-limit switching in power-hungry games really does is change the voltage: 600W runs at 1100mV, 450W around 1000mV, and 350W around 900mV, and that is what makes the difference. You still need a card that has low base whine at 450W/1000mV, you cannot paint **** as we say here, but it is possible to avoid whine this way; you'd just need to forgo the 8-10% performance gain from overclocking and run a 900mV undervolt with a memory OC, which would generally match stock FE performance.
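The voltage dependence described above matches the standard first-order model for dynamic power, P ∝ V² · f. A quick sketch using the 600W-at-1100mV figure from this post (the scaling law is the usual CMOS approximation, not a measured 4090 characteristic):

```python
# First-order dynamic power model: P scales with V^2 * f.
# Reference point (600 W at 1100 mV) is taken from the post above.

def scaled_power(p_ref: float, v_ref: float, v_new: float,
                 f_ratio: float = 1.0) -> float:
    """Estimate power after a voltage change (optionally a clock ratio too)."""
    return p_ref * (v_new / v_ref) ** 2 * f_ratio

p_1000mv = scaled_power(600.0, 1.100, 1.000)  # ~496 W from voltage alone
p_900mv = scaled_power(600.0, 1.100, 0.900)   # ~402 W from voltage alone
print(f"1000 mV: ~{p_1000mv:.0f} W, 900 mV: ~{p_900mv:.0f} W")
```

With clocks also dropping at the lower limits, this lands roughly in line with the ~450W and ~350W profile behavior described above, which is why the voltage step, not the power slider itself, drives the whine.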

Another effective way is to run external watercooling, which lets you use an almost fully enclosed case; placed at a decent distance, say 2 meters away, facing away and not at ear level, that can make a massive difference by itself. It is always surprising that this needs to be said, but having your case on the desk 30cm away from you, with the GPU right at ear level so it can whine straight into your ear all day, is a terrible arrangement. Don't know how you have it set up, but most people seem to have it this way.

Or, if you don't care much about the aesthetics of the setup, just move it to a different room; that solves all the problems at once, the only drawback being that you won't see it in your room.


----------



## Wolverine2349

MrTOOSHORT said:


> Also throw in the 2080 Ti FE, which *Edge0fsanity* mentions had the worst coil whine of his past cards, even though that card was overbuilt.
> 
> You went through four 4090s, right? Logically, the problem is somewhere else.



Yeah, must be. Likely the mobo. My power supply is top notch and powered a 12th-gen system on an Asus TUF DDR4 Z690 with no issues and almost no coil whine on the 3090 Ti.

And to make matters worse, even setting a power limit does not seem to help. When it did help, I had to gimp the card's performance a lot by going to 60% or lower, and even that seemed short-lived: the buzzing in RDR2 on the Giga Gaming OC 4090 came right back, and fast, even with the power slider at 45%. Yikes. It's not a severe buzz that can be heard across the room, but with an ear about a foot from the card it is harsh. I could understand a minor squeal or whine, or a fainter buzz with an ear close to the card, but a solid, annoying buzz that is still audible 3 to 5 feet from the case, even with the case closed, is just bad.

So, as you replied in my other thread, I'm looking at switching to an Asus mobo, even though this MSI Z690 Unify-X with the right BIOS worked with full stability at XMP settings for DDR5 out of the box. Hopefully the issues with my other Asus boards were that they were 4-DIMM instead of 2-DIMM boards (even though I only ever use 2 sticks either way), and not a general Asus DDR5 XMP problem?


----------



## Sheyster

Baasha said:


> Interesting... yea for me it was the CableMod cable - it's definitely irritating that Nvidia dropped the ball on the connector/adapter on this new GPU. I'm just glad it wasn't the GPU itself.


I have the Cablemod 4x8-pin standard cable with the plastic shroud (not the cheaper basic one they recently introduced). I also removed the shroud as I felt it interfered with the connector being fully inserted. You mentioned your cable was the custom 1000mm (1 meter) style. That sounds like it might be too long and I can't imagine many folks have one of those. Hopefully your next cable works. If not, definitely get a shorter one.


----------



## Sheyster

KingEngineRevUp said:


> It's starting to feel like we need to write our own boost algorithm for the memory versus temperature
> 
> Like our system only applies a certain memory overclock if it reaches a certain temperature and if it goes below a temperature of our choosing then it can turn the overclocks off.
> 
> Is this possible? Can we write a DLL file or something to be run by MSI Afterburner?


Run Vulkan memory test for a minute, then CTRL-C out of it. That will get the mem temps up a bit. Quickest way I can think of to warm up VRAM. One mouse click to run it, CTRL-C to get out, and done!
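For the temperature-gated memory OC idea above, you don't strictly need a DLL: a small watchdog script that polls GPU temperature and switches Afterburner profiles gets most of the way there. A sketch with hysteresis so the profile doesn't flap at the threshold. Note nvidia-smi doesn't expose VRAM temperature on GeForce cards, so core temp is used as a proxy; the `--query-gpu` flag is real, but the Afterburner install path and `-ProfileN` switch usage below are assumptions about a typical setup, so adjust for your system:

```python
# Sketch: apply a memory-OC Afterburner profile only once the card is warm,
# and revert to stock when it cools, with hysteresis to avoid flapping.
import subprocess
import time

APPLY_ABOVE_C = 50    # enable the mem-OC profile above this temperature
RELEASE_BELOW_C = 40  # revert to the stock profile below this temperature

def next_state(temp_c: float, oc_active: bool) -> bool:
    """Hysteresis decision: should the OC profile be active after this sample?"""
    if not oc_active and temp_c >= APPLY_ABOVE_C:
        return True
    if oc_active and temp_c <= RELEASE_BELOW_C:
        return False
    return oc_active

def read_gpu_temp_c() -> float:
    # Core temp as a proxy; consumer cards don't report VRAM temp here.
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=temperature.gpu",
         "--format=csv,noheader,nounits"])
    return float(out.split()[0])

def switch_profile(oc: bool) -> None:
    # Hypothetical path; Afterburner accepts -Profile1..-Profile5 switches.
    exe = r"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe"
    subprocess.run([exe, "-Profile2" if oc else "-Profile1"], check=False)

def watchdog(poll_seconds: int = 10) -> None:
    active = False
    while True:
        wanted = next_state(read_gpu_temp_c(), active)
        if wanted != active:
            switch_profile(wanted)
            active = wanted
        time.sleep(poll_seconds)
```

The two-threshold hysteresis is the important part: with a single threshold the card would bounce between profiles every time the temperature dithered around it.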


----------



## Krzych04650

Around 18C delta to water on Alphacool ES block after 20 minutes of Port Royal, seems pretty normal. Nice block quality wise as well. Looks a bit silly once installed like all of those short blocks but what to do...


----------



## yzonker

Krzych04650 said:


> Around 18C delta to water on Alphacool ES block after 20 minutes of Port Royal, seems pretty normal. Nice block quality wise as well. Looks a bit silly once installed like all of those short blocks but what to do...
> View attachment 2588503


18C at 500w is pretty good. My Bykski block is 18-20C in PR I think at full OC with the Galax bios. But it's still around 500w. My card doesn't pull a lot of power. It was only 470w or so on the Strix bios.


----------



## Krzych04650

yzonker said:


> 18C at 500w is pretty good. My Bykski block is 18-20C in PR I think at full OC with the Galax bios. But it's still around 500w. My card doesn't pull a lot of power. It was only 470w or so on the Strix bios.


Yea, the average power over those 20 minutes was 484W; Port Royal doesn't draw that much. Also, lower temperatures decrease power draw by quite a bit: it shaved off up to 30W at the high end for me. I could push some games up to 620W before and now cannot go above 590W.

590W pushes the delta to around 22C.









Still great though, this basically isn't any worse than with my 2080 Ti and it was 380W max, around 350W in Port Royal. 

This is also 30C+ cooler than air, hot spot is 40C cooler. I only need +90 core offset for the same effect as +150 on air.
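Those deltas reduce to a simple block thermal resistance, delta-T divided by power, which makes comparisons across different power draws fair. A back-of-envelope check using the numbers from these posts (nothing card-specific):

```python
# Block/loop thermal resistance: R = dT / P (degrees C per watt).
# 18 C delta at 484 W average, numbers from the posts above.
r_block = 18.0 / 484.0            # ~0.037 C/W

# Predicted delta at 590 W with the same block:
predicted_590w = r_block * 590.0  # ~21.9 C, matching the ~22 C observed
print(f"R = {r_block:.4f} C/W, predicted dT at 590 W = {predicted_590w:.1f} C")
```

The same math puts yzonker's 18-20C at roughly 500W in the 0.036-0.040 C/W range, so the Alphacool and Bykski blocks are performing in the same ballpark.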


----------



## Sheyster

yzonker said:


> My card doesn't pull a lot of power. It was only 470w or so on the Strix bios.


My GB-G-OC pulled 498w in WZ2 with the Strix BIOS and 501W with the Satan/Galax BIOS.  This is at stock settings on the video card with 4K120, G-Sync+V-Sync and frame capped to 117! Granted, all game settings are on Ultra. It's a flawless experience though, no frame dips whatsoever.


----------



## J7SC

Sheyster said:


> My GB-G-OC pulled 498w in WZ2 with the Strix BIOS and 501W with the Satan/Galax BIOS.  This is at stock settings on the video card with 4K120, G-Sync+V-Sync and frame capped to 117! Granted, all game settings are on Ultra. It's a flawless experience though, no frame dips whatsoever.


I have been switching vbios for comparative runs (love that dual vbios), and one of the keys with the not-evil Galax isn't so much higher peak clocks as less downclocking, so higher average clocks, even at the same temps. In some more memory-sensitive benches where core clocks matter less, the stock Giga-G-OC can still compete quite well; in others, it's Galax all the way.


----------



## shiokarai

Wolverine2349 said:


> Which motherboard and PSU and CPU are you using? And how are the fans on the Zotac AMP AIRO Extreme? Are they ok or does the motor make a bad noise when they spin like some have said?


Corsair AX1500i + Asus board. There are no issues with the fans, no weird noises, etc. Also silent with the Bykski waterblock.


----------



## mirkendargen

I have two cards that were tied for worst coil whine. I had a 5900 Ultra that made an audible click every time you scrolled the mouse wheel on something with a white background (yes, really; it was apparently a design flaw with a lot of them). Then a 970 from... EVGA, I think it was? That one just plain squealed. I think that was also a known thing, and people were returning them to Newegg en masse for different brands.


----------



## ttnuagmada

simonabamber said:


> I have a EK 4090 block (no ABP) and my Die-to Hotspot delta is about 13 degrees, with memory being around same temp as the hotspot give or take - is it worth repasting/repadding? I used the stock EK pads and paste but I'm not sure if I'm leaving some thermal performance on the table?


I have the EK block on an FE. I'm getting about a 9-10C hotspot delta, and memory more or less tracks the core temp when the card is capped at 80% power with a +1100 mem overclock. I did use Thermaltake pads though.


----------



## Wolverine2349

mirkendargen said:


> I have two cards that were tied for worst coil whine. I had a 5900 Ultra that made an audible click sound every time you scrolled the mouse wheel on something with a white background (yes really, it was apparently a design flaw with a lot of them). Then a 970 from....eVGA I think it was? That just plain squealed. I think that was also a known thing and people were returning them to Newegg en masse for different brands.



Was that the FX 5900 Ultra from back in 2003? I remember reading about that card and how the Radeon 9700 Pro and 9800 Pro were the better buys back then, as I was about to graduate high school. After that, NVIDIA was almost always on top, or they traded blows, with maybe a rare outlier here or there where ATI/AMD was better.


----------



## mirkendargen

Wolverine2349 said:


> Was that FX5900 Ultra from back in 2003? I remember reading about that card and how Radeon was the better buy 9700 Pro and 9800 Pro then as I was about to graduate high school. Then NVIDIA almost always on top since or they traded blows with maybe a rare outlier here or there where ATI/AMD was better after that time.


Yes indeed, worst GPU I've ever owned lol. But you could straight flash the BIOS to a 5950U!


----------



## jootn2kx

Small additional test on the Galax 666W BIOS:
Even on my lowest gaming profile, with the power limit at 70% and +165 on the core, I'm still getting around a 29K score in 3DMark, which is impressive.


----------



## raad11

Ok, I may have stumbled onto a sort of temporary solution for the stuttering problems in Overwatch 2. Someone on the Bnet forums recommended it: running Razer Cortex, but not the Game Boost thing (which seems like a cool idea, except it crashes my Windows). Rather, the 'Optimize' feature, which optimizes games' graphical settings. You select OW2, hit 'Optimize', and it should change the profile/config files. For me it suggested all Ultra/Epic settings (I had it at that anyway, except AA and Shadows).

Well, doing it results in the stutter going away. Lasted 4 days before I had to do it again.

Interestingly, it lists an option called "Shader Quality" or something, except there isn't one; it's just the graphics preset from the dropdown (i.e., Low, Medium, High, Ultra, Epic). And when you select Ultra/Epic or whatever, the in-game option only seems to get switched to 'High' (I used to have it set to 'Ultra' myself). I didn't think this did or meant anything, because I had all the individual options set to their max. And I didn't notice it reload shaders or anything either. I dunno, just trying to make sense of why this works. Or why it only works temporarily.

Clearly this must be some sort of bug with that particular game, right?

Another thing I noticed and wanted to ask you guys about:

So on default GPU settings, with the OW2 profile set to max settings, the GPU Power and GPU Core Load graphs are nice and flat, with very little fluctuation. But when I load my OC settings (Voltage/Power sliders all the way to the right, plus a curve generated by the auto feature in MSI Afterburner), core speed goes from about 2735 to about 2910, Core Load goes down and fluctuates more, and power usage is less "flat" but overall higher than before (though Power Limit only maxes at 106%).

This sort of makes sense I suppose, the game might have been limited by the stock curve before and now it uses only as much as it needs, so there's more fluctuations in core load and power usage, and the OC curve plus raised power limit makes it use more power.

Or is the curve I'm using all wrong and should I just not use a curve but enter a frequency manually and lock it? I want to only enable the OC profile when running a game.

I use MSI Afterburner but I also have EVGA Precision installed. Card is a MSI RTX 4090 Gaming Trio 24G.


----------



## Krzych04650

Did anyone try running Galax 666W BIOS with ECC enabled? I cannot complete any of the 3D mark stress tests even on stock settings. It got me quite concerned since it would suggest that VRAM is bad so I flashed back to Inno3D just to see if it has this issue as well and it doesn't, everything is fine on this BIOS.


----------



## bmagnien

A few more benches with the latest drivers + Galax. I can't adequately compete in 4K Superposition anymore due to being CPU-limited with the 5800X3D, which shows what a beast the 4090 is in this configuration. Although I did score 39,212 (7th place) despite my GPU utilization dropping to 78%.  8K results were good enough for 12th place overall and 4th place among ambient temps:
















Upped my scores a bit in SpeedWay and PR as well:

















Now to test it all out in real-world gaming with the release of the Witcher 3 Enhanced Edition tonight!


----------



## Shoggoth

Does anyone here have a Gainward 4090 Phantom GS and want to share their impressions? I've got a line on a second-hand card and wouldn't mind some input before I bite.


----------



## J7SC

Krzych04650 said:


> Did anyone try running Galax 666W BIOS with ECC enabled? I cannot complete any of the 3D mark stress tests even on stock settings. It got me quite concerned since it would suggest that VRAM is bad so I flashed back to Inno3D just to see if it has this issue as well and it doesn't, everything is fine on this BIOS.


...Yes, I ran the Galax vbios with ECC on in 3DM Port Royal and Speedway. So far only with a mid-level core and VRAM/ECC OC, but it worked fine. Speedway, however, not only loses relatively more score than Port Royal with ECC on, it is also much more finicky. Once you've crashed or failed to complete a Speedway run, even reducing VRAM speed to stock is unlikely to work on a subsequent run (badly written code in SW?). Anyway, after a reboot you'll be able to run it again, even with an ECC OC (just how high is the question...).

...In general, I've noticed that the Galax doesn't peak quite as high on core as my stock vbios (the 3210/3195 below), but I can now do runs at 3165 MHz and even 3180 MHz with the Galax, and the Galax does not downclock nearly as much as the stock vbios; often the peak and effective clocks are either the same or only one bin apart. The Galax really does behave like a full-fledged XOC vbios (probably a more aggressive PLL). As I mentioned before, for gaming it is probably better to stick with a milder vbios.




----------



## KingEngineRevUp

simonabamber said:


> I have a EK 4090 block (no ABP) and my Die-to Hotspot delta is about 13 degrees, with memory being around same temp as the hotspot give or take - is it worth repasting/repadding? I used the stock EK pads and paste but I'm not sure if I'm leaving some thermal performance on the table?


Your delta should be 7-10C. More than likely the thermal pads were hard and you didn't actually torque the screws down to the point where they bottomed out. In the future, use a blow dryer to warm the pads up so they're softer, then mount the block.


----------



## yzonker

Why are you guys testing ECC (unless you're doing HWBot, I guess)? I'll be surprised if 3DMark implements/enforces it for the 4090. This seems like a situation where the horse has already left the barn; the only way to fix it would be to delete the entire 4090 portion of the database.

Seems more likely something will get implemented for 50 series (if ever).


----------



## Laithan

Secret sauce?


----------



## gamerMwM

yzonker said:


> 18C at 500w is pretty good. My Bykski block is 18-20C in PR I think at full OC with the Galax bios. But it's still around 500w. My card doesn't pull a lot of power. It was only 470w or so on the Strix bios.


My FE 4090 with a Corsair water block is getting around a 22° delta (coolant temp to GPU core temp) with +225 core / +1100 VRAM on a looped Port Royal custom run (no settings changed, just looped). 10° delta core to hotspot.

Not as cool as some of you guys with Bykski blocks. Pads and GPU paste all stock, though I did warm up the pads before tightening things down. 21°C ambient and several rads.



----------



## raad11

Ok holy ****. It was more than that. It was electrical noise or interference of some kind. Hell, it still is, I'm not sure what is even happening.

So I have a USB Sound Blaster sound card and an internal Sound Blaster Z sound card. The internal one was known to give some noise and the USB one had better surround. I've been using the USB one for years. Literally since 2016-2017.

First clue was that when I muted the sound, the stuttering all but disappeared.

But then running Razer Cortex to optimize the game's settings profile also made the stuttering all but disappear. _But not always_. So I went back to the sound thing, and I knew I had ground interference issues because I have my laptop right next to my desktop. Two actually, my work laptop and my personal laptop, and just a few days ago I had wired everything into a cheap mixer off eBay. It actually helped reduce that ground interference noise which was coming out of like everything at that point.

So I unplug the USB Sound Blaster and the GPU Core Load curve is FLAT af. Like a straight line at 85-90% (well, relatively). I was like whaaaaaaaaaaaaaaaaaaat.

Then I turned on my OC profile via Afterburner and put on Ultimate Performance power mode. Bam, stuttering is back! I was like ***. So I went back to Balanced power mode, turned on OC profile and it worked. I also tried enabling High Performance mode and that seemed to be okay too. I will have to experiment with these repeatedly to be sure since the problem was of a tricky nature to reproduce in the first place.

I'm using internal SBZ sound card for audio right now. The issue with this card is I hear interference noise from the GPU, like a coil whine coming out of my headphones, but it fades into the background. And come to think of it, the USB Sound Blaster was also giving this lately.

What I cannot understand is why this just started one day (the Nov 29th update) and is only noticeable in one game (Overwatch 2). And why using Razer Cortex to "reset" the game's profile/preset/config for graphical details also somehow temporarily fixes it. But now that I suspect some kind of interference is involved, I guess it's feasible that any random thing can set it off, and perhaps the Razer Cortex profile just made the game use the GPU in a way that stopped the interference from "taking hold" as often.

Like I said, this doesn't happen in other games. It may be related to OW being a very high framerate game (600fps) which is the sort of thing to get the GPU's coil whine going in general. My previous card was an EVGA 3080 Ti FTW3 Ultra which had plenty of it, but never any sort of stuttering.

And lastly, something makes the CPU more vulnerable to getting involved with this. Thus the power profile thing. Well, first thing I'm gonna do next is try to plug some **** into other outlets to see if I can reduce any interference.

The motherboard is a ROG Strix Z690-A Gaming WiFi D4 btw. CPU is 13900K.


----------



## mirkendargen

raad11 said:


> Ok holy ****. It was more than that. It was electrical noise or interference of some kind. Hell, it still is, I'm not sure what is even happening.
> 
> So I have a USB Sound Blaster sound card and an internal Sound Blaster Z sound card. The internal one was known to give some noise and the USB one had better surround. I've been using the USB one for years. Literally since 2016-2017.
> 
> First clue was that when I muted the sound, the stuttering all but disappeared.
> 
> But then running Razer Cortex to optimize the game's settings profile also made the stuttering all but disappear. _But not always_. So I went back to the sound thing, and I knew I had ground interference issues because I have my laptop right next to my desktop. Two actually, my work laptop and my personal laptop, and just a few days ago I had wired everything into a cheap mixer off eBay. It actually helped reduce that ground interference noise which was coming out of like everything at that point.
> 
> So I unplug the USB Sound Blaster and the GPU Core Load curve is FLAT af. Like a straight line at 85-90% (well, relatively). I was like whaaaaaaaaaaaaaaaaaaat.
> 
> Then I turned on my OC profile via Afterburner and put on Ultimate Performance power mode. Bam, stuttering is back! I was like ***. So I went back to Balanced power mode, turned on OC profile and it worked. I also tried enabling High Performance mode and that seemed to be okay too. I will have to experiment with these repeatedly to be sure since the problem was of a tricky nature to reproduce in the first place.
> 
> I'm using internal SBZ sound card for audio right now. The issue with this card is I hear interference noise from the GPU, like a coil whine coming out of my headphones, but it fades into the background. And come to think of it, the USB Sound Blaster was also giving this lately.
> 
> What I cannot understand is why this just started on one day (Nov 29th update) and is noticeable in one game (Overwatch 2). And why using Razer Cortex to "reset" the game profile/preset/config for graphical details also somehow temporarily fixes it. But now that I suspect some kind of interference may be involved, I guess it's feasible just any random thing can set it off and perhaps using the Razer Cortex profile just had it use the GPU in a way that stopped the interference from "taking hold" more often.
> 
> Like I said, this doesn't happen in other games. It may be related to OW being a very high framerate game (600fps) which is the sort of thing to get the GPU's coil whine going in general. My previous card was an EVGA 3080 Ti FTW3 Ultra which had plenty of it, but never any sort of stuttering.
> 
> And lastly, something makes the CPU more vulnerable to getting involved with this. Thus the power profile thing. Well, first thing I'm gonna do next is try to plug some **** into other outlets to see if I can reduce any interference.
> 
> The motherboard is a ROG Strix Z690-A Gaming WiFi D4 btw. CPU is 13900K.


Check your DPC latency while playing the game with sound and see if it's spiking too, and what specifically is spiking.


----------



## BigMack70

Today is a good day... Witcher 3 maxed out with ray tracing at 4k 110-120fps... I bought two Titan X cards back in the day to play this game at 4k 60 when it came out, and it's awesome to now be able to play it at 4k120 with all the next gen features!

Some slight stuttering with DLSS frame generation on, but nothing too distracting. I assume it's the DLSS 3 V-Sync bug that they can hopefully fix.


----------



## mirkendargen

BigMack70 said:


> Today is a good day... Witcher 3 maxed out with ray tracing at 4k 110-120fps... I bought two Titan X cards back in the day to play this game at 4k 60 when it came out, and it's awesome to now be able to play it at 4k120 with all the next gen features!
> 
> Some slight stuttering with DLSS frame generation on but nothing too distracting. Assume it's the DLSS 3 vsync bug that hopefully they can fix.


I'm pretty sure the stuttering is shader compilation. It's real bad but clears over time.


----------



## jediblr

BigMack70 said:


> Today is a good day... Witcher 3 maxed out with ray tracing at 4k 110-120fps


tell me how you can see fps in this game right now? AB not working


----------



## BigMack70

mirkendargen said:


> I'm pretty sure the stuttering is shader compilation. It's real bad but clears over time.


I dunno... I can just spin the camera in circles while standing still in some scenes and get periodic stuttering. I don't notice it too much in normal gameplay - I like 4k120 with slight stuttering better than 4k80 with less/no stutter


----------



## BigMack70

jediblr said:


> tell me how you can see fps in this game right now? AB not working


The AB overlay isn't working but the readout is working at desktop - I just alt-tab and look at what it's doing


----------



## Talon2016

Nvidia performance overlay works great in Witcher 3. First time I've used it. Though you have to have GFE installed and overlay turned on.

Alt+R to get overlay with FPS.


----------



## Sheyster

jediblr said:


> tell me how you can see fps in this game right now? AB not working





BigMack70 said:


> The AB overlay isn't working but the readout is working at desktop - I just alt-tab and look at what it's doing





Talon2016 said:


> Nvidia performance overlay works great in Witcher 3. First time I've used it. Though you have to have GFE installed and overlay turned on.
> 
> Alt+R to get overlay with FPS.


Apparently this is a fix for the AB overlay, from the MSI AB / RTSS development news thread on forums.guru3d.com.


----------



## bmagnien

Sheyster said:


> Apparently this is a fix for the AB overlay, from the MSI AB / RTSS development news thread on forums.guru3d.com.


Yah, that works to get the AB OSD to display. But turning on ray tracing in the game literally crashes it for me 9/10 times, and when it doesn't, there are awful stutters, especially when coming out of the pause screen or inventory menus. Tons of complaints on Reddit from similarly highly specced systems. It's not shader comp; they use their own REDengine, this lasts way longer than normal shader comp, and it never improves. Hopefully a fix is inbound soon.


----------



## SilenMar

Pretty funny that anybody with a GALAX BIOS clocked at 3,000MHz core and +1600MHz VRAM can get a 29,000+ bugged run in Port Royal, higher than GALAX's actual 3,200MHz LN2 XOC.
It seems VRAM OC is the new king.


----------



## KingEngineRevUp

gamerMwM said:


> My FE 4090 with Corsiar water block is getting around 22° Delta (coolant temp to GPU Core temp) with +225 Core +1100 VRAM on a Port Royal loop custom run (no settings changed, just looped). 10° delta Core to Hotspot.
> 
> Not as cool as some of you guys with Bykski blocks. Pads and GPU paste all stock, though I did warm up the pads before tightening things down. 21°C ambient and several rads.
> 


It would be nice to know power draw also.


----------



## th3illusiveman

I'm getting ~70fps with DLSS at 4K max in the new Witcher.


----------



## jootn2kx

th3illusiveman said:


> im getting ~70fps w/ dlss at 4k max in the new witcher.


 Have you turned on frame generation + dlss?


----------



## Krzych04650

Yea, Witcher 3 next-gen is an absolute mess. You could already see it in some preview videos, where people would get constant artifacts, shadow flickering, pop-in, etc., but in person it is even worse. The game runs and looks as if you forced a random SLI profile on it. Plus it crashes a lot when changing settings. Running just Ultra without enabling any of the new features also loses 15-20% performance versus the original. The Ultra+ settings are a joke; LODs and foliage/grass distances, for example, are far worse than modded. Not impressed at all.

EDIT

Correction on the foliage/grass LODs: grass is pushed further, 2.5 instead of the original 1.5, which is about the maximum before you get issues with stutters and displacement, but foliage is left at the default 1.8, so you still get those ugly impostor trees at fairly close distances. It is tweakable in the .ini.

There is still a lot of pop-in on the bushes and shadows in general, so that will need modding, but the game does generally look very nice; it just needs manual configuration of settings in the .ini and some LOD mods. Or at least I hope that is all it will need; some of those things may not be fixable.
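For anyone wanting to try the .ini tweak discussed above: the distance scaling lives in user.settings (under Documents\The Witcher 3). The section and key names below are the ones commonly used in the modding scene; treat the exact names and values as assumptions, and back the file up first:

```ini
; user.settings draw-distance tweaks (assumed key names; back up first)
[Foliage]
GrassDistanceScale=2.5     ; next-gen default, up from the original 1.5
FoliageDistanceScale=2.4   ; default ~1.8; raise to push out impostor trees
```

Raising FoliageDistanceScale is what addresses the close-range impostor trees; going much past the mid-2s reportedly brings the stutter/displacement issues mentioned above.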


----------



## keikei

Krzych04650 said:


> Yea Witcher 3 next-gen is an absolute mess. You could already see that on some preview videos where people would get constant artifacts with shadows flickering, pop-in and etc, but in person it is even a lot worse. The game runs and looks as if you forced random SLI profile on it. Plus it crashes a lot when changing settings. Running just Ultra without enabling any new things also has 15-20% less performance than original. Ultra+ settings are a joke, LODs and foliage/grass distances for example are way worse than modded. Not impressed at all.


While the intro looked decent, I could already see some issues. I'mma give it some time for additional patches before I play it. Apparently there's a GOG launcher with the game now, even on the Steam version. I'm more concerned with the state of the game atm. Dangit. Guess I'll be playing Death Stranding in the meantime.


----------



## Krzych04650

keikei said:


> While the intro looked decent, I could see some issues already. I'mma give it some time for some additional patches to play it. Apparently, there's a gog launcher now with the game even with a steam version. I'm more concerned with the state of the game atm. Dangit. Guess I'll be playing Death Stranding in the meantime.


Yea, these days I'm not touching anything until at least 6 months to 2 years after release. I only ran this because it is an update to an existing game that I already know and own, but still, the way it looks and runs right now is really concerning. This is more than just launch-day broken, although the standards for that keep moving.


----------



## th3illusiveman

jootn2kx said:


> Have you turned on frame generation + dlss?


No, my screen maxes out at 4K 120Hz, and anything over that tears horribly. I don't think you can cap FPS with Frame Gen, so it would go over the refresh rate and tear.


----------



## jootn2kx

th3illusiveman said:


> No, my screen is max 4k120hz anything over screen tears horribly. I don't think you can cap Frame Gen FPS so it would be over and tear.


No tearing issues here, weird, but yeah, there are definitely some stability issues; I had a couple of crashes and stutters when changing settings.
Now it's running fine for me with DLDSR at 1.75x and frame generation + DLSS Quality.
Maybe you can try capping your FPS in the Nvidia Control Panel?


----------



## SilenMar

Krzych04650 said:


> Yea Witcher 3 next-gen is an absolute mess. You could already see that on some preview videos where people would get constant artifacts with shadows flickering, pop-in and etc, but in person it is even a lot worse. The game runs and looks as if you forced random SLI profile on it. Plus it crashes a lot when changing settings. Running just Ultra without enabling any new things also has 15-20% less performance than original. Ultra+ settings are a joke, LODs and foliage/grass distances for example are way worse than modded. Not impressed at all.


If a game flickers or crashes, there is a high chance your GPU is not stable. 

Setting aside the new look, which I don't care about at all, I have seen the same pattern with newer games: 4K, max settings, less than 60 fps on whatever is the newest, most powerful GPU available.


----------



## SilenMar

The Witcher 3 looks a lot more dashing after the update. They just need to fix the texture pop.

Comparison screenshots: 4.00 (next-gen update) vs. 1.32 (original).


----------



## KingEngineRevUp

I'll tell you guys one thing I'm happy about with the Witcher 3. The addition of HDR. The original didn't have HDR support. The colors really make the game look much better. 

That being said, the game is pretty damn buggy. The tutorial menus won't close at some points, and then Geralt's hair flies off. Wish I had grabbed a screenshot of that. 










----------



## keikei

Considering that we're near the holiday season, I don't expect any patches till 2023. I could be wrong.


----------



## SilenMar

The old version doesn't fix the texture pop either. 
The game is meant to be played with DLSS 3.0 on current-gen GPUs. Maybe next-gen GPUs can run native 4K 60 fps. 
The HDR is not good. It's like HDR 200, while Windows Auto HDR on the old DX11 version easily looks like HDR 1000.


----------



## Krzych04650

Krzych04650 said:


> Did anyone try running Galax 666W BIOS with ECC enabled? I cannot complete any of the 3D mark stress tests even on stock settings. It got me quite concerned since it would suggest that VRAM is bad so I flashed back to Inno3D just to see if it has this issue as well and it doesn't, everything is fine on this BIOS.





J7SC said:


> ...Yes, I ran the Galax vbios with EEC in 3DM Port Royal and Speedway. So far only with a mid-level core and VRAM / EEC oc, but it worked fine. Speedway however is not only losing relatively more score than Port Royal w/EEC on, it is also much more finicky. Once you crashed / didn't complete the Speedway run, even reducing VRAM speed to stock is unlikely to work in a subsequent run (badly written code for SW?). Anyway, after a reboot, you'll be able to run it again, even with EEC oc (just how high is the question...).
> 
> ...In general, noticed that the Galax doesn't peak on core quite as high as my stock vbios (the 3210 / 3195 below), but I can now do runs at 3165 MHz and even 3180 MHz with the Galax - and the Galax does not downclock nearly as much as the stock vbios...often peak and effective are either the same or only one bin apart. The Galax really does behave like a full-fledged XOC vbios (probably more aggressive PLL). As I mentioned before, for gaming, it is probably better to stick with a milder vbios.


Turns out it was unstable DDR4. I was too lazy to connect the RAM waterblock when installing the GPU block since this whole setup is temporary anyway, and since I had this DDR4 squeezed to the very last bit, it became unstable. Not sure why ECC on the GPU would amplify that, as it doesn't really crash without ECC, or why the Inno3D BIOS did not have this issue at all (maybe different timings, part of the reason the Galax BIOS is better per clock?), but anyway, problem solved.


----------



## pat182

BigMack70 said:


> Today is a good day... Witcher 3 maxed out with ray tracing at 4k 110-120fps... I bought two Titan X cards back in the day to play this game at 4k 60 when it came out, and it's awesome to now be able to play it at 4k120 with all the next gen features!
> 
> Some slight stuttering with DLSS frame generation on but nothing too distracting. Assume it's the DLSS 3 vsync bug that hopefully they can fix.


stutter is cpu bound, my 13900kf at 5.7 shows 1 or 2 cores perma maxed out


----------



## jediblr

Krzych04650 said:


> Plus it crashes a lot when changing settings


LOL, I was changing settings for 3 hours in a row in-game while testing FPS, 0 (zero!) crashes, and PQ is SUPERB.


----------



## pat182

SilenMar said:


> The old version doesn't fix the texture pop either.
> The game is meant to be played with DLSS 3.0 with the current gen GPU. Maybe next-gen GPUs can run native 4K 60fps.
> The HDR is not good. It's like HDR 200 while Windows Auto HDR on the old DX11 is easy HDR 1000.


I have the PG32UQX, and applying the NVIDIA filter with 10% exposure fixes this problem; I'm getting 1500 nits peak with the sun.


----------



## pat182

Also, DLSS 3 seems to hard-cap the FPS at 120 even on a 144 Hz monitor. I don't think I saw 144 fps even with DLSS at Performance; I think from Quality to Performance there's no FPS gain with DLSS 3 since it's either CPU or GPU bound.


----------



## jootn2kx

pat182 said:


> also dlss3 seems to hard cap the fps at 120 even on a 144hz monitor, i dont think i saw 144fps even if i put dlss at performance, i think from quality to performance theres no fps gain with dlss3 since its either cpu or gpu bound


Yes, what I do is scale the resolution up with DLDSR; I mostly play @ 5160x2160 with 2.25x scaling. 
It improves the image quality and uses the full capability of the card with DLSS + frame generation.


----------



## AvengedRobix

Wolverine2349 said:


> Yeah though how come Asus Strix cards have such bad whine if they have componenets that are very high end for power delivery? Or are they for power delviery but cheap in other areas thus whine?
> 
> And do you think the same manufacturers for same card can change components without us knowing that provide same power delivery, but have more vibrations and thus whine as they get them cheaper.


Same: Zotac AMP Extreme and no coil whine.. 12900K, Aorus Master Z690, and be quiet! Dark Power 11 1200W PSU.. good temps. For the first time in years I'm evaluating a card at stock without mounting a waterblock.. but hey, for some people it's a low-tier card 🤣


----------



## lawson67

AvengedRobix said:


> Same, zotac amp Extreme and no coil whine.. 12900k, aorus master z690, and bequiet dark11 1200 psu.. good temp. I evaluate for the First Time After years and years ti stai at stock and not mount a waterblock.. bit Hey, for some people Is a low tier card 🤣


Same here, zero coil whine on my Zotac AMP Extreme, Asus B550-E motherboard, Ryzen 5800X3D, Corsair HX1000 PSU. Oh, and the fans sound like fans, no electrical whine from them either.


----------



## BigMack70

pat182 said:


> stutter is cpu bound, my 13900kf at 5.7 shows 1 or 2 cores perma maxed out


I don't have any cores permanently maxed out on my 13700k. I'm pretty sure it's just a DLSS 3 bug/feature. Stuttering is more common and noticeable on my 4k120 Hz TV than on my 5120x1440 240 Hz monitor; I think the majority of it for me is related to DLSS 3 interacting with vsync.

I'm surprised that so many people are having issues / complaints with the game. It looks WAY better than the original and 4k 120 performance maxed out is just fine by me. There's pop-in, but it's a lot less than the original game, which had ridiculous amounts of pop-in even when you fully modded it; it wasn't possible to get LODs extended out to where you want them, particularly in towns/cities.



pat182 said:


> also dlss3 seems to hard cap the fps at 120 even on a 144hz monitor,


I can get 160fps at 5120x1440 when looking up at the sky with everything maxed and using DLSS 3. Doesn't seem like there's a hard cap.


----------



## raad11

mirkendargen said:


> Check your DPC latency while playing the game with sound and see if it's spiking too, and what specifically is spiking.


How do I check this?

I can say that there was definitely a lag added to mouse input independent of GPU issues. Not sure how to describe it, but I could feel it the minute I started a game session that would start stuttering a few seconds later. It felt slower and laggier, and I realized I was experiencing this previously without knowing what it was, and that also explained the times where the mouse felt extra responsive. I thought it was in my head. Though partially, it technically is I suppose (in terms of how you subconsciously react to input lag).

I would have to assume CPU was affected too. The difference in the GPU power/core graph before/after the USB sound device was removed was stark. Even what I thought was no stutter before was actually still not as flat a curve as it should be.

Which also explains why there was no problem in CoD. That graph was flat the whole time, both power and core load, albeit at lower power levels than in OW2.

So I'm not sure if it's electrical noise/interference or just something particular to that device which was causing issues. However, there is no ground loop noise with my internal sound card as there was with the USB sound.


----------



## raad11

mirkendargen said:


> I'm pretty sure the stuttering is shader compilation. It's real bad but clears over time.


The actual stutter wasn't due to that, because the main compilation (such as after you install new GPU drivers) only pushes your CPU to full load for a few seconds (250+ watts on a 13900K). But something with the settings related to the shaders did reset the interference on subsequent starts.

And when the stutter started it wouldn't stop, even after multiple hours, or even if I quit and came back the next day. I had to change something (like using Cortex to reset/optimize the game profile) to get it to stop.


----------



## alasdairvfr

Witcher 3 framerate with no Frame Generation and DLSS on Quality is pretty bad, in the 40-50 fps range in this city. Turning RT off nearly doubles the framerate to 80-100 ish. I don't really think RT is worth the frame hit. Before the update I used to run this city, which is pretty tough on a system, at my monitor's 144 Hz at 4K.

Turning RT off makes a huge difference, but the Steam FPS overlay is broken.
CPU utilization seems to be the cause of the poor performance; the GPU sits in the 60% utilization range. All of the remastering they did taxes the CPU to the point that nobody will have the game running well without dialing in settings. I've never seen an RT "tax" this heavy on a 4090 before.


----------



## Malworthy

Hey y'all. Looking to swap the bios on my AX Gaming Renegade 4090 X3W, but I want to make sure I use a good/compatible bios. Tired of the hard lock at 450w. If I'm not mistaken, this is the EXACT same card as the Inno3D X3 but with a paint job. Any recommendations? I want to make sure to get some input before I do this. Any advice is appreciated!


----------



## jootn2kx

alasdairvfr said:


> Witcher 3 framerate with no Frame Generation and DLSS on Quality is pretty bad in the 40-50 FPS range in this city:
> Never saw an RT "tax" this heavy on a 4090 before.


Well, remember Callisto Protocol?


----------



## Wolverine2349

pat182 said:


> stutter is cpu bound, my 13900kf at 5.7 shows 1 or 2 cores perma maxed out



Witcher 3 came out in 2015 and is 7 years old. Why would such an excellent CPU, with P-cores at 5.7 GHz, bottleneck any video card at only 120 FPS in a game released in 2015?


----------



## jediblr

alasdairvfr said:


> I don't really think RT is worth the frame hit


find an ophthalmologist, you're blind.


alasdairvfr said:


> Witcher 3 framerate with no Frame Generation and DLSS on Quality is pretty bad in the 40-50 FPS range in this city


yes, 4090 sux, take 7900xtx and look at fps🤣


----------



## BigMack70

alasdairvfr said:


> All of the remastering they did taxes the CPU to the point nobody will have the game running well without dialing in settings. Never saw an RT "tax" this heavy on a 4090 before.


It runs pretty well if you use frame generation. 4k 120 most of the time with 95-99% GPU usage. Only place I've found where performance is really limited by CPU is around Hierarch Square in Novigrad where there's tons and tons of NPCs on screen. GPU usage dips down to 80% ish there. 

Outside of some slight occasional stuttering, my eyes can't really tell the difference in visual quality in this game between frame generation on and frame generation off.



Wolverine2349 said:


> With Witcher 3 which came out in 2015 and is 7 years old. Why would such a an excellent CPU P core at 5.7GHz bottleneck any video card or game released in 2015 at only 120 FPS.


Witcher 3 has always had areas (like Novigrad) with high CPU requirements, and RT increases CPU load. Hopefully they can optimize all of this some, but it's not that surprising.


----------



## J7SC

Krzych04650 said:


> ...() I had this DRR4 squeezed to the very last bit, it became unstable. Not sure why ECC for the GPU would amplify that as it doesn't really crash without ECC, or why Inno3D BIOS did not have this issue at all (maybe different timings, part of the reason why Galax BIOS is better per clock?), but anyways, problem solved.


...glad you solved it. It may be that with r_BAR on, any issue with system RAM could get amplified by ECC, but not sure. 

Quick general question: Which is better - Win 10 Pro or Win 11 Pro with the RTX 4090 (leaving out need for latest CPU gen etc) ?


----------



## mirkendargen

raad11 said:


> How do I check this?
> 
> I can say that there was definitely a lag added to mouse input independent of GPU issues. Not sure how to describe it, but I could feel it the minute I started a game session that would start stuttering a few seconds later. It felt slower and laggier, and I realized I was experiencing this previously without knowing what it was, and that also explained the times where the mouse felt extra responsive. I thought it was in my head. Though partially, it technically is I suppose (in terms of how you subconsciously react to input lag).
> 
> I would have to assume CPU was affected too. The difference in the GPU power/core graph before/after the USB sound device was removed was stark. Even what I thought was no stutter before was actually still not as flat a curve as it should be.
> 
> Which also explains why there was no problem in CoD. That graph was flat the whole time, both power and core load, albeit at lower power levels than in OW2.
> 
> So I'm not sure if it's electrical noise/interference or just something particular to that device which was causing issues. However, there is no ground loop noise with my internal sound card as there was with the USB sound.








Resplendence Software - LatencyMon: suitability checker for real-time audio and other tasks (www.resplendence.com)

Use this program.


----------



## mirkendargen

J7SC said:


> ...glad you solved it. It may be that with r_BAR on, any issue with system RAM could get amplified by ECC, but not sure.
> 
> Quick general question: Which is better - Win 10 Pro or Win 11 Pro with the RTX 4090 (leaving out need for latest CPU gen etc) ?


If you're playing games, Windows 11 for Auto HDR, hands down no questions asked. If you're trying to get the hottest benchmarking scores, I'm not as sure.


----------



## Glottis

jediblr said:


> find an ophthalmologist, you're blind.


You are a bit insecure about your super expensive graphics card purchase, aren't you? He didn't say RT is bad, only that it isn't worth the excessive performance hit. And I tend to agree, often gotta flip between screenshots to notice darker corner somewhere or sharper reflection elsewhere. But games aren't paintings. Framerate is more important as it improves fluidity and responsiveness of the game. If he was able to play with RT at 60+ FPS I'm sure he wouldn't be thinking about compromises, but as it is now, you HAVE to compromise, even with a 4090.


----------



## jootn2kx

Glottis said:


> You are a bit insecure about your super expensive graphics card purchase, aren't you? He didn't say RT is bad, only that it isn't worth the excessive performance hit. And I tend to agree, often gotta flip between screenshots to notice darker corner somewhere or sharper reflection elsewhere. But games aren't paintings. Framerate is more important as it improves fluidity and responsiveness of the game. If he was able to play with RT at 60+ FPS I'm sure he wouldn't be thinking about compromises, but as it is now, you HAVE to compromise, even with a 4090.


Depends on what you're looking for; personally, I will always choose the graphical experience over framerate. 
So running every ray tracing setting maxed out and being capped @ 60-70 fps is fine for me if I'm seeing the GPU 100% utilized.
A story-based FPS game doesn't need 200 fps IMO; sure, it's nice if it's there, but it's not required.


----------



## J7SC

mirkendargen said:


> If you're playing games, Windows 11 for Auto HDR, hands down no questions asked. If you're trying to get the hottest benchmarking scores, I'm not as sure.


Thanks! I had actually updated this system to Win 11 Pro already when running the 3090 Strix, but even back then I reverted to Win 10 Pro after a month or so for a variety of reasons (including even more big-data vacuuming by Windows, bizarre extra-click requirements on copy/paste, and, and, and...). Ironically, there were also issues with my Xbox account when using different machines (not concurrently) which I don't have when both are using Win 10 Pro.

So I don't know whether the 4090 in particular 'likes' Win 11 Pro better, be it for gaming or benching. When I switched back from Win 11 Pro to Win 10 Pro, it created an issue with updating 3DMark 'SystemInfo'... now that I want to update that to version 5.55, I can't and instead end up in an endless loop (created a support ticket).


----------



## Krzych04650

J7SC said:


> ...glad you solved it. It may be that with r_BAR on, any issue with system RAM could get amplified by ECC, but not sure.
> 
> Quick general question: Which is better - Win 10 Pro or Win 11 Pro with the RTX 4090 (leaving out need for latest CPU gen etc) ?


It turned out that the CPU IMC was dying, hence all the problems over the last two days. It died for good today. 

As for Win 10 vs Win 11, I don't have any data on that, but I can say that while I was obviously very skeptical about upgrading to 11, the transition was very smooth and I don't see any drawbacks thus far. Realistically there is no difference 99% of the time, but Win 10 will be increasingly neglected and will start missing features and developing issues over time.


----------



## coelacanth

mirkendargen said:


> If you're playing games, Windows 11 for Auto HDR, hands down no questions asked. If you're trying to get the hottest benchmarking scores, I'm not as sure.


Windows 10 has HDR as well. Not sure if it's different than what Windows 11 does other than having to manually toggle it.


----------



## mirkendargen

coelacanth said:


> Windows 10 has HDR as well. Not sure if it's different than what Windows 11 does other than having to manually toggle it.


Yes, they both support HDR, but that isn't what Auto HDR is. Auto HDR expands SDR games to the HDR colorspace, and does quite a good job of it (better than some games' in-engine HDR, looking at you, Cyberpunk...).









Auto HDR Preview for PC Available Today (devblogs.microsoft.com): "Today we're excited to bring you a preview of Auto HDR for your PC gaming experience and we're looking for your help to test it out. When enabled on your HDR capable gaming PC, you will automatically get awesome HDR visuals on an additional 1000+ DirectX 11 and DirectX 12 games!"


----------



## alasdairvfr

jootn2kx said:


> Well remember callisto protocol?


Never played it but well aware that it's a mess



jediblr said:


> find an ophthalmologist, you're blind.
> 
> yes, 4090 sux, take 7900xtx and look at fps🤣


I can see just fine, just no longer satisfied to play 40-50 fps. Why don't you buy the AMD and let us know how it goes?



BigMack70 said:


> It runs pretty well if you use frame generation. 4k 120 most of the time with 95-99% GPU usage. Only place I've found where performance is really limited by CPU is around Hierarch Square in Novigrad where there's tons and tons of NPCs on screen. GPU usage dips down to 80% ish there.
> 
> Outside of some slight occasional stuttering, my eyes can't really tell the difference in visual quality in this game between frame generation on and frame generation off.


I'll give it a go with frame generation turned on. For me it was the shock of the remaster running so poorly, being egregiously CPU bound, with RT just taking the piss. I tested the same area and was at my monitor's limit before the update, so I kind of assumed that DLSS Quality would offset most of the performance hit from the improved visuals and RT. I was quite wrong!

On the 3000 series, RT absolutely tanked framerates; in pretty much every (other) game I've tried on the 4090, the performance hit has been pretty minimal.


----------



## coelacanth

mirkendargen said:


> Yes they both support HDR, that isn't what Auto HDR is. Auto HDR expands SDR games to HDR colorspace, and does a quite good job of it (better than some games in-engine HDR, looking at you Cyberpunk...)


Ah that's interesting. Very cool.


----------



## BigMack70

alasdairvfr said:


> I'll give it a go with the frame generation turned on. For me it was the shock of the remaster running so poorly, being egregiously CPU bound and RT just taking the piss. I tested the same area and was at my monitor's limit before the update so I kinda assumed that DLSS Quality would offset most performance hit from improved visuals and RT. I was quite wrong!


I've been very impressed with DLSS 3 frame generation when I've been able to use it and stay at or below my screen's refresh rate (A Plague Tale: Requiem and now Witcher 3). It's an absolute disaster of stuttering trash if it pushes framerates above the display's refresh rate, but that's the only big problem I've found with it. I really have to go looking for visual artifacts to notice them, and I play games with a controller and not a mouse/keyboard so I don't notice the minor latency penalty. I'm looking forward to its inclusion in Cyberpunk.

The RT hit in Witcher 3 feels about like the hit in Cyberpunk or Control - it feels like that standard 40% ish performance penalty we've seen in heavy RT games before.


----------



## mirkendargen

BigMack70 said:


> I've been very impressed with DLSS 3 frame generation when I've been able to use it and stay at or below my screen's refresh rate (A Plague Tale: Requiem and now Witcher 3). It's an absolute disaster of stuttering trash if it pushes framerates above the display's refresh rate, but that's the only big problem I've found with it. I really have to go looking for visual artifacts to notice them, and I play games with a controller and not a mouse/keyboard so I don't notice the minor latency penalty. I'm looking forward to its inclusion in Cyberpunk.
> 
> The RT hit in Witcher 3 feels about like the hit in Cyberpunk or Control - it feels like that standard 40% ish performance penalty we've seen in heavy RT games before.


In NVCP, do you have G-Sync on, V-Sync on, FPS limit 118 fps (or whatever is right for your display)? Because frame generation FPS limiting works completely fine for me: I see a locked 118 fps and my GPU utilization drop, with no stutters in Darktide and Plague Tale. Witcher 3 stutters, but it also stutters with frame generation off, so I'm really not sure why people connect the two.
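For reference, a cap a few fps below refresh keeps frame-generated output inside the G-Sync range so V-Sync never actually engages. One commonly cited heuristic (reportedly the same formula NVIDIA Reflex uses for its automatic limiter, not an official NVIDIA recommendation) is refresh − refresh²/3600; it lands slightly under the 118 used above, and anything in that neighborhood works:

```python
def gsync_fps_cap(refresh_hz: float) -> int:
    """Heuristic FPS cap a few frames below refresh, so frame-generated
    frames stay inside the VRR window instead of hitting V-Sync."""
    return int(refresh_hz - refresh_hz * refresh_hz / 3600)

for hz in (120, 144, 240):
    print(f"{hz} Hz -> cap at {gsync_fps_cap(hz)} fps")
# 120 Hz -> 116, 144 Hz -> 138, 240 Hz -> 224
```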


----------



## BigMack70

mirkendargen said:


> In NVCP, do you have G-sync on, V-sync on, FPS limit 118fps (or whatvever is right for your display)? Because frame generation FPS limiting works completely fine for me, I see a locked 118 FPS and my GPU utilization drop, no stutters in Darktide and Plague Tale. Witcher 3 stutters, but it also stutters with frame generation off so I'm really not sure why people connect the two.


I have not tried 117 fps limiting for my display. I do have v-sync forced on in NVCP. I'll try the FPS limit next time I'm running on my 120 Hz screen...


----------



## ttnuagmada

J7SC said:


> Thanks ! I actually had updated this system on Win 11 Pro already when running the 3090 Strix, but even back then, I reverted back to Win 10 Pro after a month or so for a variety of reasons (including even more big-data-vacuuming by Windows, bizarre extra-click requirements on copy/paste and, and, and...). Ironically, there were also issues with my XBox account when using different machines (not concurrently) which I don't have when both are using Win 10 Pro.
> 
> So I don't know if the 4090 in particular 'likes' Win 11 Pro better, be it for gaming or benching. When I switched back from Win 11 Pro to 11 Pro, it created an issue with updating 3DM 'SystemInfo'...now that I want to update that to version 5.55, I can't and instead end up in an endless loop (created a support ticket).


There's a plugin for Win 11 called "ExplorerPatcher" which fixes most of the complaints I had about the Win 11 interface (and even some stuff I didn't like about the Win 10 interface), just FYI.


----------



## mirkendargen

ttnuagmada said:


> Theres a plugin for win 11 called "explorerpatcher" which fixes most of the complaints I had about the win11 interface (and even some stuff i didnt like about the win10 interface) just fyi.


There's a reg key change you can make that reverts to the old right click context menu which is all I needed. I don't remember it off the top of my head but you can google it.


----------



## SilenMar

pat182 said:


> i have the pg32uqx and putting the nvidia filter with 10% exposure fix this problem, getting 1500 peak nit with the sun


10% Exposure alone in the Filter settings only gets you sub-1,000 nits. You need to raise the Highlight setting to get over 1,000 nits.
Or you can just turn up the Contrast in the HDR monitor menu to do something similar.
This patch was rushed. These options should be in the game menu.


----------



## J7SC

ttnuagmada said:


> Theres a plugin for win 11 called "explorerpatcher" which fixes most of the complaints I had about the win11 interface (and even some stuff i didnt like about the win10 interface) just fyi.





mirkendargen said:


> There's a reg key change you can make that reverts to the old right click context menu which is all I needed. I don't remember it off the top of my head but you can google it.


Thanks folks. Win 11 Pro can be annoying with its 'extra' (read: unwanted) features and even more big-data vacuuming, ads and such; sooner or later, I will have to go back to Win 11 Pro... just wondering, for those who have run a 4090 in both OS versions, what they found. I do run Win 11 Pro anyway on the very machine I'm typing this on, but that one is running a 6900XT.


----------



## ttnuagmada

mirkendargen said:


> There's a reg key change you can make that reverts to the old right click context menu which is all I needed. I don't remember it off the top of my head but you can google it.


Just check out explorerpatcher, you won't regret it.


----------



## Sheyster

mirkendargen said:


> There's a reg key change you can make that reverts to the old right click context menu which is all I needed. I don't remember it off the top of my head but you can google it.


Thanks, I was actually going to live with the new Windows 11 way, but not now!

Info:

Microsoft Support Link


----------



## Sheyster

ttnuagmada said:


> Just check out explorerpatcher, you won't regret it.


I prefer a reg key edit over a plug-in, that's just me though.


----------



## Panchovix

I just returned to W10 after being on W11; in Stable Diffusion (image AI) I got about a 5% improvement lol.
I had some issues with my headphones on W11 that made Windows Store games crash, really weird.
Also, on W11 my 5800X was toasty; on W10 it runs cooler with the same performance. Is there a difference in the CPU scheduler between W10 and W11?


----------



## BigMack70

mirkendargen said:


> In NVCP, do you have G-sync on, V-sync on, FPS limit 118fps (or whatvever is right for your display)? Because frame generation FPS limiting works completely fine for me, I see a locked 118 FPS and my GPU utilization drop, no stutters in Darktide and Plague Tale. Witcher 3 stutters, but it also stutters with frame generation off so I'm really not sure why people connect the two.


117 fps cap in NVCP solved the stuttering 

Running almost flawlessly maxed out 4k 117fps with DLSS frame generation on.


----------



## yzonker

BigMack70 said:


> 117 fps cap in NVCP solved the stuttering
> 
> Running almost flawlessly maxed out 4k 117fps with DLSS frame generation on.


Yea Witcher runs really well for me too if that's what you're referring to. Same, DLSS Quality and frame gen. Mostly locked at 117 fps. No stutters.


----------



## Laithan

Wolverine2349 said:


> With Witcher 3 which came out in 2015 and is 7 years old. Why would such a an excellent CPU P core at 5.7GHz bottleneck any video card or game released in 2015 at only 120 FPS.


Next-gen free update was just released.




J7SC said:


> Thanks folks. Win 11 Pro can be annoying with its 'extra (> unwanted') features and even more big-data vacuuming, ads and such; sooner or later, I will have to go back to Win 11 Pro...just wondering for those who have run a 4090 in both OS versions what they found. I do run Win 11 Pro anyways on the very machine I type this on, but that one is running a 6900XT.


Supposedly Win 11 has better CPU scheduling for 12th/13th gen and possibly Zen 4 as well. I have a dual-boot W10/W11 setup on my new rig, and still having W10 has already come in handy because my AVerMedia Live Bolt 4K refuses to work on W11. I am just getting started, so I expect more oddities. 

All W10/W11 installs get this run on them before I even allow them on my network, and Pi-hole takes care of the DNS aspect.


----------



## kryptonfly

J7SC said:


> Thanks ! I actually had updated this system on Win 11 Pro already when running the 3090 Strix, but even back then, I reverted back to Win 10 Pro after a month or so for a variety of reasons (including even more big-data-vacuuming by Windows, bizarre extra-click requirements on copy/paste and, and, and...). Ironically, there were also issues with my XBox account when using different machines (not concurrently) which I don't have when both are using Win 10 Pro.
> 
> So I don't know if the 4090 in particular 'likes' Win 11 Pro better, be it for gaming or benching. When I switched back from Win 11 Pro to Win 10 Pro, it created an issue with updating 3DM 'SystemInfo'...now that I want to update that to version 5.55, I can't and instead end up in an endless loop (created a support ticket).


Win 11 handles the 7-Zip benchmark really well, especially decompression: ~240k vs ~224k going from Win 11 to Win 10. That was with my good Win 10 image updated to Win 11, which solved the poor memory performance. Win 11 is clearly better for the latest CPU gens (12th/13th-gen Intel, Thread Director...), and it should be the same for new GPUs.

Finally I was able to install the Bykski block on the Giga G-OC; I will post the results later. I can say right now that I'm not voltage limited anymore: no VRel, just Idle or Power limit, even in Time Spy Extreme, Port Royal and Superposition with the stock BIOS at 1.05 V. It's weird, isn't it?


----------



## inedenimadam

yzonker said:


> Yea Witcher runs really well for me too if that's what you're referring to. Same, DLSS Quality and frame gen. Mostly locked at 117 fps. No stutters.


DLSS 2.0 is as good as native to my eyes, but frame generation gives me tearing and artifacts. I've been playing the update without the frame gen, and getting 70-90 fps, which is good enough for a single player story game.


----------



## motivman

Just wanted to make a quick post showing Bykski waterblock performance in Port Royal vs. Time Spy Extreme on my Gigabyte Gaming OC with the Galax 666W BIOS. Each test was looped around 35 times... I found the results interesting.

Port Royal:
Waterblock Delta T: 22C @ 572W











Timespy Extreme:
Waterblock Delta T: 27.7C @ 632W
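Those two runs imply slightly different effective thermal resistances for the block. Dividing delta-T by power is a quick way to compare runs at different power draws (my own sanity check, not part of the original post):

```python
# Effective thermal resistance (delta-T per watt) for the two reported runs.
# Numbers are taken from the post above: (delta-T in C, power in W).
runs = {
    "Port Royal": (22.0, 572),
    "Time Spy Extreme": (27.7, 632),
}

for name, (dt, watts) in runs.items():
    # Port Royal: 0.038 C/W, Time Spy Extreme: 0.044 C/W
    print(f"{name}: {dt / watts:.3f} C/W")
```

The TSE run showing a higher C/W suggests the two workloads heat the die differently (hotspot concentration), not necessarily a change in the mount.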


----------



## borant

motivman said:


> Just wanted to show a Quick post to show Bykski waterblock performance in Port Royal vs Time Spy Extreme on my Gigabyte Gaming OC with Galax 666W bios. Each test was looped around 35 times... I found the result interesting.
> 
> Port Royal:
> Waterblock Delta T: 22C @ 572W
> 
> View attachment 2588777
> 
> 
> 
> Timespy Extreme:
> Waterblock Delta T: 27.7C @ 632W
> 
> View attachment 2588778


Can you do a FurMark run for 10+ minutes at ~600W? Gaming and 3DMark tests can have an inconsistent load, but FurMark keeps the GPU under a constant load.
Also, it may take up to 20 minutes for temperatures to stabilize.


----------



## th3illusiveman

BigMack70 said:


> 117 fps cap in NVCP solved the stuttering
> 
> Running almost flawlessly maxed out 4k 117fps with DLSS frame generation on.


I tried those settings in NVCP but my screen still shows the 119 fps cap, even though I had a 116 fps cap set with DLSS 3 - did you do anything different from what mirkendargen said?


----------



## dr/owned

Shoved about 1000W through the card to test the connector (CableMod 3-connector adapter). It's not bad (you can tell on the backplate where I put thermal pad strips to help cool the VRM):
















I was able to peak it at about 1250W, but I think the VRM was just shutting down, since there's only 900A total to give on the core. That was at about 1.175V. About a 35C delta on core temp (Alphacool block).


----------



## Betroz

Some say that capping the framerate in Rivatuner is better than NVCP. What do you guys think about that? Any new latency testing done on this?


----------



## Nizzen

Betroz said:


> Some say that capping the framerate in Rivatuner is better than NVCP. What do you guys think about that? Any new latency testing done on this?


Capping fps isn't great for my OCD  Need to see those high frames 
Healing teammates with HIGH fps feels good 🤓 🥳


----------



## Silent Scone

Nizzen said:


> Capping fps isn't great for my OCD  Need to see those high frames
> Healing teammates with HIGH fps feels good 🤓 🥳


Headphones needed for the choke noise if pushing 240Hz+ panels 😄


----------



## Betroz

Nizzen said:


> Capping fps isn't great for my OCD  Need to see those high frames
> Healing teammates with HIGH fps feels good 🤓 🥳


If you want to enjoy G-Sync, then using a frame cap can be necessary in some situations.
For your monitor (175Hz): when the framerate goes over 175, G-Sync is off anyway, but if you cap the framerate at 172, G-Sync stays active. G-Sync vs. lower latency is the choice here, I guess.


----------



## satinghostrider

Betroz said:


> If you want to enjoy G-Sync, then using a framecap can be necessary in some situations
> For your monitor (175 Hz) - when the framerate is over 175, G-Sync is off anyways. But if you cap the framerate at 172 G-Sync is active. G-Sync vs lower latency is the choice here I guess.


I noticed this much more on 144Hz-165Hz monitors, where tearing is very obvious.
On my 240Hz monitor, even well in excess of 240 fps, say 300 fps or so, you can't notice any screen tearing.
In that case I just turn V-Sync off and let the FPS rip as high as it can go. Yes, G-Sync might be off anyway, but I find G-Sync rather useful for games that sit at relatively low FPS, like <120 fps. I keep mine on anyway since I don't notice anything off.


----------



## elbramso

Has anyone here tried connecting the EVC2 to an MSI 4090 SUPRIM X yet? I'd love to tweak the memory controller at colder temps^^

I was able to get into the Top 10 HOF for Time Spy this morning... benching at ambient room temps.
Still, I think some of the Top 10 Speed Way scores are bugged.
Anyway, here are my runs:
Time Spy: Rank 7
Speed Way: Rank 11

Card is an absolute beast and by far the best non-KingPin I've ever had^^


----------



## 8472

__ https://twitter.com/i/web/status/1603304047922085889


----------



## tomerturbo

Hello everyone. I have an Asus Strix 4090 OC, my settings are +190 core / +1500 memory on the latest Nvidia driver, and my Port Royal score is very low: I get 27526. In Speed Way I get 10898.
What can I do to increase my score? I see other cards get to 28000 in Port Royal with no problem.


----------



## Nizzen

tomerturbo said:


> hello everyone i have asus strix 4090 oc and my setings is +190 core +1500 memory the latest nvidia driver and my port royal score is very low i get 27526 in speed way i get 10898
> what can i do to increases my score? i see other card get to 28000 on port royal with no problem


Don't bother with Port Royal. Best tip of the day 

Play games


----------



## 8472

tomerturbo said:


> hello everyone i have asus strix 4090 oc and my setings is +190 core +1500 memory the latest nvidia driver and my port royal score is very low i get 27526 in speed way i get 10898
> what can i do to increases my score? i see other card get to 28000 on port royal with no problem


What are the specs for the rest of your system (CPU, RAM speed)?


----------



## tomerturbo

8472 said:


> What's the specs for the rest of your system (cpu, ram speed)?


ASUS Z690 HERO

13900K, P-cores 5.5, E-cores 4.4

Memory: 6400 2x16GB G.Skill


----------



## 8472

tomerturbo said:


> ASUS Z690 HERO
> 
> 13900K PCORE 5.5 ECORE .4.4
> 
> MEMORY 6400 2X16GB G SKIL


Some things I'd check:

1. Cpu temperature. Make sure that it's not throttling due to overheating. 
2. XMP. Make sure that you have xmp set in bios otherwise your memory will only run at 4800.
3. Run the bench without the overclock to establish a baseline. It's possible that your vram overclock is not fully stable but also not causing port royal to crash. 
4. Run other in game benchmarks to see if those results are in line with what others are getting. 
5. Use a program like afterburner to see whether or not the card is actually maintaining the overclocked frequency.


----------



## keikei

Daniel finds frame gen does help with the cpu bottleneck & smooths out the gameplay. There are artifacts generated however.


----------



## tomerturbo

8472 said:


> Some things I'd check:
> 
> 1. Cpu temperature. Make sure that it's not throttling due to overheating.
> 2. XMP. Make sure that you have xmp set in bios otherwise your memory will only run at 4800.
> 3. Run the bench without the overclock to establish a baseline. It's possible that your vram overclock is not fully stable but also not causing port royal to crash.
> 4. Run other in game benchmarks to see if those results are in line with what others are getting.
> 5. Use a program like afterburner to see whether or not the card is actually maintaining the overclocked frequency.


CPU max temp 80C, not throttling.
XMP enabled in BIOS.
The games run without problems.
The card drops no more than 15MHz on clock in Port Royal.


----------



## rahkmae

tomerturbo said:


> cpu max temp 80c not throttling
> bios xmp enable
> the games run without problem
> the card drop 15mhz ON CLOCK NO MORE IN PORT ROYAL







My results, Strix + EKWB: I scored 26,402 in Port Royal.
+196 core and +1350 memory max.


----------



## tomerturbo

rahkmae said:


> my results strix + ekw I scored 26 402 in Port Royal
> +196 and mem +1350 max


Maybe the Strix card isn't that wow. I'm disappointed.


----------



## elbramso

rahkmae said:


> my results strix + ekw I scored 26 402 in Port Royal
> +196 and mem +1350 max


10700k bottleneck??


----------



## yzonker

rahkmae said:


> my results strix + ekw I scored 26 402 in Port Royal
> +196 and mem +1350 max


Something isn't right there. That score should be much higher at those settings. I think cards get 25-26k at default.


----------



## yzonker

For Witcher 3, I deleted the "d3d11on12.dll" file to get the overlay working, but what does that do to the game if anything? I can't tell any difference, but I think that's the wrapper library to get a dx11 game to run in dx12 isn't it? 

Game looks identical though at max settings, 4k, RT on, DLSS Quality, frame gen on. I even took screenshots to compare. With those settings, it runs at or near the 117 fps limit I have set. Feels much smoother with frame gen turned on for some reason. More than it should going from 70 to 117 fps.


----------



## jootn2kx

yzonker said:


> For Witcher 3, I deleted the "d3d11on12.dll" file to get the overlay working, but what does that do to the game if anything? I can't tell any difference, but I think that's the wrapper library to get a dx11 game to run in dx12 isn't it?
> 
> Game looks identical though at max settings, 4k, RT on, DLSS Quality, frame gen on. I even took screenshots to compare. With those settings, it runs at or near the 117 fps limit I have set. Feels much smoother with frame gen turned on for some reason. More than it should going from 70 to 117 fps.


Yes indeed, I'm feeling the same improvement with frame generation. Sadly it isn't getting the credit it deserves; there's so much negativity about it.


----------



## tomerturbo

yzonker said:


> Something isn't right there. That score should be much higher at those settings. I think cards get 25-26k at default.


I also think something isn't right. My Strix card at +200 core and +1500 memory should score close to 28000, and I get 27500.


----------



## elbramso

tomerturbo said:


> i think also something isn't right also my strix card on 200 core and 1500 memory the score should be close to 28000 and i get 27500


Would you mind posting the run here?


----------



## yzonker

tomerturbo said:


> i think also something isn't right also my strix card on 200 core and 1500 memory the score should be close to 28000 and i get 27500


You're a lot closer though. 

What did you do in regards to system settings? Everyone tweaks a bunch of stuff to improve scores. 

I need to write a benchmarking guide I think. I don't want to list all of that out over and over. This thread has a lot of it. 









Tips for better 3Dmark scores on RTX 4090? - www.overclock.net

Hey guys, I am wondering if there are any tips and tricks for better 3DMark scores, I just did my first GPU OC (stable with 0 artifacts and way lower than artifact clocks) on the KFA2 (GALAX) 4090 and saw im 200 pts shy of getting in the top 100 Port Royal Hall Of Fame, which to be honest is...


----------



## yzonker

jootn2kx said:


> Yes indeed i'm feeling the same improvement with frame generation sadly it isn't getting the right credit it deserves, so much negativity about it.


I can easily see the difference by spinning around with the mouse. 

I haven't played the game very much yet, but it seems to work really well. I'm sure there are some impacts to IQ but they must be relatively minor.


----------



## elbramso

We had a Win10 Benchmark iso here on the forums back then.


----------



## yzonker

elbramso said:


> We had a Win10 Benchmark iso here on the forums back then.


I still have it installed, but I've been seeing the same or better results on a clean Win11 install. The script that turns off all the services works on Win11 also. Might not get everything depending on what is different between 10 and 11, but it turns off a lot of stuff. Didn't make much difference on my scores though. Mostly just slightly improves CPU heavy benches.

With my 13900K, though, Win11 22H2 is still badly broken in the TS CPU bench; I have to run 21H2 for that. There is a KB patch that might help, but last time I checked it didn't show up in my optional updates.

Just looking through the top 3DMark results, most appear to be on Win11 too.


----------



## elbramso

I'm on Win11 21H2 as well - just used ShutUp10 for the cleanup and my benchmarks were pretty solid.
Still looking for a clean ISO (or might create one myself), because it's a way easier starting point for when I crash my OS because my RAM OC was a bit off^^


----------



## yzonker

elbramso said:


> I'm on Win11 21H2 aswell - just used shutup10 for the cleanup and my benchmarks were pretty solid.
> Still looking for a clean iso (or might create it myself) because it's way easier to start with once I crashed my OS because my RAM OC was a bit off^^


I just make an image backup and restore it if/when that happens. Usually initially test RAM OC's on the bench OS and then do longer final testing on my daily OS.


----------



## alasdairvfr

yzonker said:


> Feels much smoother with frame gen turned on for some reason. More than it should going from 70 to 117 fps.


I'm finding the same, I can't put my finger on it but it's almost like there's some weird lag at lower framerates that is alleviated over 100 fps. Frame Generation really helps but without it the game at 60-70 feels more like 30-40 fps.


----------



## jediblr

yzonker said:


> For Witcher 3, I deleted the "d3d11on12.dll" file to get the overlay working, but what does that do to the game if anything? I can't tell any difference, but I think that's the wrapper library to get a dx11 game to run in dx12 isn't it?


You don't need this for AB/RTSS. Read Unwinder:
Second solution way is RTSS architecture specific. RTSS contains two optional OSD renderers for D3D12 applications: D3D11on12 (used by default) and native D3D12 based. You may simply create a profile for Witcher 3 Next Gen and make RTSS using native D3D12 OSD rendered for it instead. So it won't depend on D3D11on12 and won't be affected by Witcher's d3d11on12.dll versions conflict. To do it add the following profile template for it:

witcher3.exe.cfg

Code:
[RendererDirect3D12]
D3D11on12 = 0


----------



## doom26464

Has anyone tried encoding/streaming on their 4090 yet? I know these new GPUs support AV1 encoding, but a lot of the streaming companies have not made the jump to AV1 yet.

The dream has always been a single-PC streaming machine, something I have been chasing ever since my 980 Ti. Every generation the encoder got better or the GPU performance got better. Even on my 3080, trying to stream/record a competitive shooter like Warzone or PUBG at 3440x1440 while maintaining 120+ fps was a challenge and always required fine-tuning of in-game settings. Also, game-side frame pacing always seems to take a hit when encoding, which is rather annoying in any competitive shooter.

My 4090 shows up next week, and from what I see it's the GPU to truly make dreams come true. I wonder if this will finally be the one to do it.


----------



## Sheyster

Betroz said:


> Some say that capping the framerate in Rivatuner is better than NVCP. What do you guys think about that? Any new latency testing done on this?





Nizzen said:


> Capping fps isn't great for my OCD  Need to see those high frames
> Healing teammates with HIGH fps feels good 🤓 🥳


Since we're talking about FPS capping (which I always do), here is a reminder regarding Low Latency Mode (Nvidia Reflex) in NVCP:



> Set “Low Latency Mode” to “Ultra” in the Nvidia Control Panel. When combined with G-SYNC + V-SYNC, this setting will automatically limit the framerate* (in supported games)* to ~59 FPS @60Hz, ~97 FPS @100Hz, ~116 FPS @120Hz, ~138 FPS @144Hz, ~224 FPS @240Hz, etc.


The above info is from BlurBusters G-Sync FAQ.
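NVIDIA hasn't published the exact formula behind that automatic limit, but the caps listed above match a simple quadratic fit (my own reconstruction, not from the FAQ):

```python
def reflex_auto_cap(refresh_hz: float) -> int:
    """Approximate the automatic FPS limit applied by Low Latency Mode
    'Ultra' with G-SYNC + V-SYNC. Fit to the BlurBusters-listed values:
    cap ~= Hz - Hz^2 / 3600 (an observed pattern, not an official formula)."""
    return round(refresh_hz - refresh_hz ** 2 / 3600)

for hz in (60, 100, 120, 144, 240):
    print(f"{hz}Hz -> ~{reflex_auto_cap(hz)} FPS")
# 60Hz -> ~59, 100Hz -> ~97, 120Hz -> ~116, 144Hz -> ~138, 240Hz -> ~224
```

This reproduces every value in the quote, including the 116 fps cap people keep seeing at 120Hz.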


----------



## mirkendargen

tomerturbo said:


> hello everyone i have asus strix 4090 oc and my setings is +190 core +1500 memory the latest nvidia driver and my port royal score is very low i get 27526 in speed way i get 10898
> what can i do to increases my score? i see other card get to 28000 on port royal with no problem


Make sure you don't have V-sync or an FPS limiter enabled, you can get scores around that if you limit Port Royal to 120FPS.


----------



## SilenMar

doom26464 said:


> Has anyone tried encoding/streaming on there 4090 yet?? I know these new gpus support AV1 encoding but alot of the streaming company have not made the jump to AV1 yet.
> 
> The dream has always been a single pc streaming machine something I have been chasing ever since my 980ti. Every generation the encoder got better or the gpu performance got better. Even on my 3080 though trying to stream/record a competitive shooter like warzone or PUBG at 3440x1440p and maintain 120+fps was challenge and always required fine tuning of in game settings. Also game side frame pacing always seems to take a hit when encoding and is rather annoying when playing any type of competitive shooter.
> 
> My 4090 shows up next week but from what I see it's the gpu to truly make dreams come true. I wonder if this will finally be the gpu to do it.


AV1 has the best image quality. The 4090 can easily record 8K 60fps HDR at 250Mbps with barely any visual artifacts.


----------



## Sheyster

Anyone considering a pre-order on this?









45'' UltraGear™ OLED Gaming Monitor (45GR95QE-B) | PREORDER


Preorder the New 45'' UltraGear™ OLED Curved Gaming Monitor (45GR95QE-B) on LG.com. Gain a Competitive Advantage with a 240Hz Refresh Rate & 0.3ms Response Time.




www.lg.com


----------



## doom26464

SilenMar said:


> AV1 has the best image quality. 4090 can easily record 8K 60fps HDR at 250Mps with barely any visual artifact.


Yes, I have seen a lot of AV1 videos and it's amazing, but currently, outside of local recordings in OBS, no other streaming platform supports it. I'm more curious how traditional h264 performs on it, and about user-end game-side performance.


----------



## SilenMar

doom26464 said:


> Yes I have seen alot of AV1 videos and it's amazing but currently outside of local recordings on OBS no other stream platform supports it. I'm more curious how traditional h264 performs on it and user end game side performance.


Outside of OBS, Nvidia always wants you to use ShadowPlay.
The streaming platforms need to catch up to the spec. Discord is about to go AV1.
40-series GPUs can encode and decode AV1 with no sweat. CPU h264 will be wrecked.


----------



## doom26464

SilenMar said:


> Except OBS, Nvidia always wants you to use Shadowplay.
> These stream platforms need to meet the specs. Discord is about to go AV1.
> 40 series GPU can encode and decode AV1 with no sweat. CPU h264 will be wrecked.


Yeah, Discord is the only one, which is weird, as I'm pretty sure Amazon (who owns Twitch) is one of the major financial backers of AV1. I expect Twitch and YouTube to get behind it eventually, but probably not until God knows when next year.

Sorry, I meant regular NVENC encoding, not CPU h264 encoding. Basically I want to know how it does at a 6.5k bitrate to Twitch compared to something like a 3090.


----------



## tvbaas

@zhrooms Some detail pictures and info on the Gigabyte Aorus 4090 Master VRM for the start post are on this site: GIGABYTE AORUS RTX 4090 MASTER Super Engraved Graphics Card Review: Ada's new flagship - Super Power Network (expreview.com) (use translate).


----------



## BigMack70

Sheyster said:


> Anyone considering a pre-order on this?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 45'' UltraGear™ OLED Gaming Monitor (45GR95QE-B) | PREORDER
> 
> 
> Preorder the New 45'' UltraGear™ OLED Curved Gaming Monitor (45GR95QE-B) on LG.com. Gain a Competitive Advantage with a 240Hz Refresh Rate & 0.3ms Response Time.
> 
> 
> 
> 
> www.lg.com


Looks pretty nice. I am keen to get away from my 32:9 Neo G9 to a taller 21:9 ultrawide of similar physical size. Might wait a little longer, though. I really did not enjoy my experience with OLED using an LG C9; burn-in is a very real thing with OLED and my monitor is in a bright room; I am hoping for a nice mini-LED VA panel. I'd also love a 21:9 resolution based on 4k instead of 1440p


----------



## Nizzen

BigMack70 said:


> Looks pretty nice. I am keen to get away from my 32:9 Neo G9 to a taller 21:9 ultrawide of similar physical size. Might wait a little longer, though. I really did not enjoy my experience with OLED using an LG C9; burn-in is a very real thing with OLED and my monitor is in a bright room; I am hoping for a nice mini-LED VA panel. I'd also love a 21:9 resolution based on 4k instead of 1440p


Too big, I'm satisfied with my Alienware oled 34" 3440x1440 175hz... Want 240 though


----------



## BigMack70

Nizzen said:


> Too big, I'm satisfied with my Alienware oled 34" 3440x1440 175hz... Want 240 though


I love massive displays. I'm tempted by the Odyssey Ark 55" but it's so expensive. I regret spending $2.5k on the Neo G9. Don't really feel like dropping more than $1500 ish on a display ever again.


----------



## J7SC

Sheyster said:


> Anyone considering a pre-order on this?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 45'' UltraGear™ OLED Gaming Monitor (45GR95QE-B) | PREORDER
> 
> 
> Preorder the New 45'' UltraGear™ OLED Curved Gaming Monitor (45GR95QE-B) on LG.com. Gain a Competitive Advantage with a 240Hz Refresh Rate & 0.3ms Response Time.
> 
> 
> 
> 
> www.lg.com


...I'm not too fond of curved monitors, but that does look very nice spec-wise, if a bit small... OLED 240 Hz 'C3s' next year ?


----------



## mirkendargen

J7SC said:


> ...I'm not too fond of curved monitors, but that does look very nice spec-wise, if a bit small... OLED 240 Hz 'C3s' next year ?


I'd be down with that with DSC.


----------



## alasdairvfr

Personally I prefer multiple monitors over one wide panel, but I have been tempted in the past. I always found ultrawides too short, so a lot of neck turning and not enough vertical. I'm doing 3x4K now and am pretty happy.

I'm replacing one of my crappy monitors with a second MSI Optix MPG321UR-QD, maybe a third for Surround later on.


----------



## SilenMar

Sheyster said:


> Anyone considering a pre-order on this?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 45'' UltraGear™ OLED Gaming Monitor (45GR95QE-B) | PREORDER
> 
> 
> Preorder the New 45'' UltraGear™ OLED Curved Gaming Monitor (45GR95QE-B) on LG.com. Gain a Competitive Advantage with a 240Hz Refresh Rate & 0.3ms Response Time.
> 
> 
> 
> 
> www.lg.com


Obviously LG cannot make smaller monitors without lowering the brightness.
Even at 45", it's only as bright as HDR 200. The other one at 27" is HDR 100.

The problem with OLED is written right in its name: being organic means flickering, low APL, and aggressive ABL. Picture quality is really only SDR-level in practice.

MiniLED and MicroLED, on the other hand, have way more potential.


----------



## mirkendargen

SilenMar said:


> Obviously LG cannot make smaller monitors without lowering down the brightness.
> Even at 45", it's as bright as HDR 200. Another one at 27" is HDR 100.
> 
> The problem with OLED is written right in its name. It's organic to have flickering, low APL, aggressive ABL. Picture quality is as much as SDR in fact.
> 
> MiniLED and MicroLED on the other hand have way more potentials.


Wut? I have a 2.5 year old 48" CX and a calibrator. It still outputs ~700nit just fine, not "HDR 200" at all lol. Newer panels can do ~1000nit. Why would anything flicker? LCD monitors are the ones with strobing lights...

The automatic dimming is easy to disable ¯\_(ツ)_/¯


----------



## yzonker

mirkendargen said:


> Wut? I have a 2.5 year old 48" CX and a calibrator. It still outputs ~700nit just fine, not "HDR 200" at all lol. Newer panels can do ~1000nit. Why would anything flicker? LCD monitors are the ones with strobing lights...
> 
> The automatic dimming is easy to disable ¯\_(ツ)_/¯


Yea I read that and shook my head. No clue where any of that came from. I'm not sure how a display could beat my C1. Already bright enough it feels a little blinding at times. Don't need more. Maybe if it was in a room with a lot of sunlight, but it's not.


----------



## SilenMar

mirkendargen said:


> Wut? I have a 2.5 year old 48" CX and a calibrator. It still outputs ~700nit just fine, not "HDR 200" at all lol. Newer panels can do ~1000nit. Why would anything flicker? LCD monitors are the ones with strobing lights...
> 
> The automatic dimming is easy to disable ¯\_(ツ)_/¯


This is an unpopular topic. OLED is just limited.
I have a PA32UCG outputting 1800 nits with 1,000,000:1 contrast, and that makes a game look completely different.
That's not something OLED can do with only 70% Rec. 2020 color space and super-aggressive ABL plus flickering. These 45" and 27" models don't even have HDR 400 stickers, while true HDR monitors are just beginning to show the edge of 12-bit color, which needs 4000 nits to display properly.


----------



## Sheyster

J7SC said:


> ...I'm not too fond of curved monitors, but that does look very nice spec-wise, if a bit small... OLED 240 Hz 'C3s' next year ?


I'm mainly looking for a higher refresh rate at this point, it doesn't necessarily have to be curved. This new 45" LG would have been nicer with a 1600P screen res like their previous 38" curved IPS model (38GL950G-B).


----------



## mirkendargen

SilenMar said:


> This is an unpopular topic. OLED is just limited.
> I have a PA32UCG output 1800nits with 1million:1 contrast that makes a game look completely different.
> It's not something OLED can do with only 70% Rec 2020 colorspace and super aggressive ABL plus flickering. Theses 45" and 27" don't even have HDR 400 stickers while true HDR monitors just begin to show the edge of 12bit color which needs 4000nits to display properly.


If you think your monitor has a 1mil:1 contrast ratio between two adjacent pixels, I have a Trump trading card NFT to sell you for $99.99. And an LG OLED does 78% of rec2020, just like your monitor so....ok I guess? Maybe you think your monitor is bad too?

There are two kinds of people: people who will never go back from gaming on an OLED, and people that have never gamed on an OLED.


----------



## J7SC

mirkendargen said:


> Wut? I have a 2.5 year old 48" CX and a calibrator. It still outputs ~700nit just fine, not "HDR 200" at all lol. Newer panels can do ~1000nit. Why would anything flicker? LCD monitors are the ones with strobing lights...
> 
> The automatic dimming is easy to disable ¯\_(ツ)_/¯


...I actually turn brightness _down_ a bit on my 48" C1; the trick is to have all the other light sources behind the back of the monitor (see pic, C1 on the right). In something like FS 2020 with the 4090, HDR, DLSS3 Frame Generation, NV Reflex & G-Sync, this has become unbelievably smooooth and detailed .











Sheyster said:


> I'm mainly looking for a higher refresh rate at this point, it doesn't necessarily have to be curved. This new 45" LG would have been nicer with a 1600P screen res like their previous 38" curved IPS model (38GL950G-B).


...next stop: C3 240(+?) Hz


----------



## SilenMar

mirkendargen said:


> If you think your monitor has a 1mil:1 contrast ratio between two adjacent pixels, I have a Trump trading card NFT to sell you for $99.99. And an LG OLED does 78% of rec2020, just like your monitor so....ok I guess? Maybe you think your monitor is bad too?
> 
> There are two kinds of people: people who will never go back from gaming on an OLED, and people that have never gamed on an OLED.


OLED looks underwhelming when it only hits peak brightness at a 2% window size.
It's a day-and-night difference with 1600 nits sustained at any window size and 85% Rec. 2020 color space.


----------



## yzonker

SilenMar said:


> OLED looks underwhelming with only 2% window size of peak brightness.
> It's day and night difference with 1600nits sustained at any window size and 85% rec2020 colors pace.





https://www.tomshardware.com/reviews/asus-proart-pa32ucg/4



"We calculated the Rec.2020 coverage at 77.82% "


----------



## SilenMar

yzonker said:


> https://www.tomshardware.com/reviews/asus-proart-pa32ucg/4
> 
> 
> 
> "We calculated the Rec.2020 coverage at 77.82% "


I calculated Rec.2020 coverage at 85%. Even PG32UQX has 85%.


----------



## schoolofmonkey

So I'd been using a Cooler Master PCIe Gen 4 riser cable with a vertical mount, and I'd been noticing strange behavior since putting it in, like WHEA errors if PCIe power saving was on.
Today I took it out and ran Port Royal with the same OC settings: 200 points higher than with the riser.



SilenMar said:


> OLED looks underwhelming with only 2% window size of peak brightness.
> It's day and night difference with 1600nits sustained at any window size and 85% rec2020 colors pace.


I'm running an LG C2 42". It looks good, like WOW good with some color tweaking, but man, that VRR flicker is shocking. My wife has the 55" model (in the main room); the VRR flicker is nowhere near as bad as on the 42".


----------



## satinghostrider

schoolofmonkey said:


> So I'd been using a Coolermaster PCIe Gen 4 riser cable with vertical mount, I'd been noticing strange behavior, since putting it in, like WHEA errors is PCIe power saving was on..
> Today I took it out, ran Port Royal with the same OC settings, 200 points higher than with the riser..
> 
> 
> 
> I'm running a LG C2 42", looks good, like WOW good with some color tweaking, but man that VRR flicker is shocking. My wife has the 55" model (in the main room), the VRR flicker is no where near as bad as the 42".


The CoolerMaster cable is notorious for all those WHEA errors you are having. Invest in a good one like the Linkup PCIE 4.0 cable and you will not have these issues anymore.
I know more than a few who have already complained about WHEA errors and the CoolerMaster Gen4 cable.


----------



## jonnyzeraa

Hey guys, here in Brazil there are four RTX 4090 models available to buy: MSI Suprim X Liquid ($2868), MSI Gaming Trio ($2730), Gigabyte Gaming OC ($2824), and Zotac AMP Extreme AIRO ($2315). Which one do you guys recommend I buy?


----------



## Laithan

Sheyster said:


> I'm mainly looking for a higher refresh rate at this point, it doesn't necessarily have to be curved.


^ Exactly

I think ever since 4K monitors hit retail, the "dream" has always been to run @4K _*and*_ high refresh rates at the same time. The 4090 is arguably the first GPU to "have its cake and eat it too" and fully realize the best visuals at high refresh rates. Anyone who buys a 4090 and still has a 4K/60Hz monitor should be looking for a higher refresh rate ASAP for this very reason, so I agree that refresh rate should take priority.

Superwide monitors are almost always going to have less pixel density than 4K (plus the supported-resolution challenges for a gamer may be a factor). Just something to consider as a 4090 owner. There's no better time to up your pixel count game.

3440 x 1440 = 4,953,600 pixels
3840 x 2160 = 8,294,400 pixels
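A quick check of the pixel math above, plus the ratio between the two (my own arithmetic, not from the post):

```python
ultrawide = 3440 * 1440   # common 21:9 superwide
uhd_4k    = 3840 * 2160   # 16:9 4K UHD

print(ultrawide)                      # 4953600
print(uhd_4k)                         # 8294400
print(round(uhd_4k / ultrawide, 2))   # 1.67 -> 4K pushes ~67% more pixels
```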


----------



## mirkendargen

satinghostrider said:


> The CoolerMaster cable is notorious for all those WHEA errors you are having. Invest in a good one like the Linkup PCIE 4.0 cable and you will not have these issues anymore.
> I know more than a few who have already complained about WHEA errors and the CoolerMaster Gen4 cable.


+1 I have the Linkup cable in the 20cm length (get the shortest that works for you) and benchmarked back to back with and without it and got no difference. I've never had any WHEA errors.


----------



## J7SC

satinghostrider said:


> The CoolerMaster cable is notorious for all those WHEA errors you are having. Invest in a good one like the Linkup PCIE 4.0 cable and you will not have these issues anymore.
> I know more than a few who have already complained about WHEA errors and the CoolerMaster Gen4 cable.





mirkendargen said:


> +1 I have the Linkup cable in the 20cm length (get the shortest that works for you) and benchmarked back to back with and without it and got no difference. I've never had any WHEA errors.


+2 (x2)  ...I have two of the Linkup PCIe 4.0 risers in use for about 1.5 years with zero issues...I also had tested both a 6900XT and 3090 Strix with / without those risers and observed no change in scores. Now one of those Linkup PCIe 4.0 is handling the 4090 problem-free since I got it in October.


----------



## SilenMar

schoolofmonkey said:


> So I'd been using a Coolermaster PCIe Gen 4 riser cable with a vertical mount, and I'd been noticing strange behavior since putting it in, like WHEA errors if PCIe power saving was on..
> Today I took it out, ran Port Royal with the same OC settings, 200 points higher than with the riser..
> 
> 
> 
> I'm running an LG C2 42", looks good, like WOW good with some color tweaking, but man that VRR flicker is shocking. My wife has the 55" model (in the main room), the VRR flicker is nowhere near as bad as the 42".


There is the Samsung S95B, which is a little bit better than the C2. 

The problem is that OLED is at its wits' end. It has maybe 5 years left before MicroLED, while miniLED keeps gaining higher density, higher brightness, and wider gamut to deliver better images. 

The flickering issue was never solved from the beginning, where there is an electrical bias. Even with DC dimming, OLED still flickers at each frame. 

LG actually shot itself in the foot with an ad demonstrating the flickering problem.





The ad shows a minimum shutter angle of only 45° at 50fps, which equals just a 1/400 shutter speed, not fast enough to capture the OLED flicker.
When the shutter speed goes to 1/3200 or 1/6400, the OLED still shows flicker, enough to cause eye strain even when the brightness is low.
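The shutter-angle conversion above follows the standard rotary-shutter formula, exposure time = (angle / 360°) ÷ frame rate; a minimal sketch (the function name is mine, for illustration):

```python
def shutter_speed(angle_deg: float, fps: float) -> float:
    """Exposure time in seconds for a given shutter angle and frame rate."""
    return (angle_deg / 360.0) / fps

# 45-degree shutter at 50 fps -> 0.0025 s, i.e. 1/400, as stated above
t = shutter_speed(45, 50)
print(f"1/{round(1 / t)} s")
```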

So far even the best consumer-grade OLED monitors like the PA32DC still flicker hopelessly. The flicker gets more intense as the brightness goes higher, though OLED technically cannot go very high anyway. 

And there are limited suppliers of OLED compared to the MiniLED supply chain.


----------



## coelacanth

J7SC said:


> ...I actually turn brightness _down_ a bit on my 48 C1, the trick is to have all the other light sources behind the back of the monitor (see pic, C1 on the right). In s.th. like FS 2020 w/ 4090, HDR, DLSS3, Frame Insertion & NV Reflex & GSync, this has become unbelievably smooooth and detailed .
> View attachment 2588899
> 
> 
> 
> 
> ...next stop: C3 240(+?) Hz


To get a correct picture you need to keep brightness at 50 and use the OLED Light setting to reach the desired amount of light output. For PC use, keep black level on Auto rather than Low.

The LG CX can be so bright that I keep OLED Light relatively low. At higher levels it is positively retina-searing.


----------



## yzonker

SilenMar said:


> There is the Samsung S95B, which is a little bit better than the C2.
> 
> The problem is that OLED is at its wits' end. It has maybe 5 years left before MicroLED, while miniLED keeps gaining higher density, higher brightness, and wider gamut to deliver better images.
> 
> The flickering issue was never solved from the beginning, where there is an electrical bias. Even with DC dimming, OLED still flickers at each frame.
> 
> LG actually shot itself in the foot with an ad demonstrating the flickering problem.
> 
> 
> 
> 
> 
> The ad shows a minimum shutter angle of only 45° at 50fps, which equals just a 1/400 shutter speed, not fast enough to capture the OLED flicker.
> When the shutter speed goes to 1/3200 or 1/6400, the OLED still shows flicker, enough to cause eye strain even when the brightness is low.
> 
> So far even the best consumer-grade OLED monitors like the PA32DC still flicker hopelessly. The flicker gets more intense as the brightness goes higher, though OLED technically cannot go very high anyway.
> 
> And there are limited suppliers of OLED compared to the MiniLED supply chain.


Thankfully I'm flicker blind.


----------



## SilenMar

yzonker said:


> Thankfully I'm flicker blind.


It's not visible, but it still causes eye strain.


----------



## schoolofmonkey

satinghostrider said:


> The CoolerMaster cable is notorious for all those WHEA errors you are having. Invest in a good one like the Linkup PCIE 4.0 cable and you will not have these issues anymore.
> I know more than a few who have already complained about WHEA errors and the CoolerMaster Gen4 cable.


Yeah, the first in-game problem I noticed was in RDR2 yesterday. There were repeatable issues in a cutscene, where you'd get green boxes like you'd see with memory corruption/an unstable OC. It was in the exact same place on the screen with DX12; Vulkan didn't produce it.
Took the riser out, now zero issues with DX12.

Turning off power saving fixed the WHEAs, but I feel I was just masking a bigger issue.




SilenMar said:


> There is the Samsung S95B, which is a little bit better than the C2.
> 
> The problem is that OLED is at its wits' end. It has maybe 5 years left before MicroLED, while miniLED keeps gaining higher density, higher brightness, and wider gamut to deliver better images.
> 
> The flickering issue was never solved from the beginning, where there is an electrical bias. Even with DC dimming, OLED still flickers at each frame.
> 
> LG actually shot itself in the foot with an ad demonstrating the flickering problem.
> 
> 
> 
> 
> 
> The ad shows a minimum shutter angle of only 45° at 50fps, which equals just a 1/400 shutter speed, not fast enough to capture the OLED flicker.
> When the shutter speed goes to 1/3200 or 1/6400, the OLED still shows flicker, enough to cause eye strain even when the brightness is low.
> 
> So far even the best consumer-grade OLED monitors like the PA32DC still flicker hopelessly. The flicker gets more intense as the brightness goes higher, though OLED technically cannot go very high anyway.
> 
> And there are limited suppliers of OLED compared to the MiniLED supply chain.


Anything dark is like a disco. I played through High On Life and it never flickered once, but that's a very bright and colorful game 

Basically I'll set some games, like the new Witcher 3 Complete, to have G-Sync off, tweak a few settings, and they run fine.
It's surprising that on the 55" it's nowhere near as bad. My wife and I compared the exact same scenes from RDR2: mine will flicker, hers only slightly. She can't even notice it; wish I couldn't.
Funny thing is I tried a ROG SWIFT PG43UQ, and even that had a weird flicker when the frame rate got high, not to mention the horribly blurry, smudged text and picture quality.

I couldn't go back to a smaller screen after using this 42" though...


----------



## J7SC

coelacanth said:


> To get a correct picture you need to keep brightness at 50 and use the OLED Light setting to reach the desired amount of light output. For PC use, keep black level on Auto rather than Low.
> 
> The LG CX can be so bright that I keep OLED Light relatively low. At higher levels it is positively retina-searing.


...every single one of the five monitors I look at here in a typical day (including OLED C1, VA, IPS HDR, other) has brightness below 50%, and in the case of the C1 black level at auto, as I have sensitive eyesight. I also sit fairly close to the C1 for its 48 inches diagonal size.

Of all the monitors, the OLED stands out with its fantastic picture / true blacks, responsiveness, HDR quality, custom adjustability and not a single flicker or burn-in in sight (if you pardon the pun). This does not mean that I don't look forward to what other display technologies can improve upon beyond marketing blah blah and YT infomercials. But for now, I couldn't be happier, though that may also relate to the fact that it is the Christmas holiday period, the weekend is nearly here and I am enjoying freshly-baked spekulatius cookies 😋


----------



## kyg

Has anyone deshrouded Gigabyte 4090 gaming OC?
Can anyone tell me if you need to remove the heatsink to take the plastic shroud off? Or is the shroud only attached to the backplate?


----------



## mirkendargen

SilenMar said:


> OLED looks underwhelming with only a 2% window size at peak brightness.
> It's a day and night difference with 1600 nits sustained at any window size and 85% Rec 2020 color space.


The entire point of HDR is extremely bright highlights contrasting against non-bright elements (and humans perceive brightness by contrast, not by absolute nits). Tanning yourself with a full screen at 1600 nits won't do anything but make your eyes adjust, and serves no purpose. Being able to have a 0 nit pixel next to a 700+ nit pixel is excellent and more than enough to make you squint. No backlit display will ever do it as well; the future is emissive pixels (OLED, then microLED).

Not to mention the motion clarity of OLED compared to the transition time of LCD. I'm aware some people consider OLED "too perfect" in that regard and ultimately unpleasant, but it leaves room for ultra-high-refresh displays.


----------



## rahkmae

yzonker said:


> Something isn't right there. That score should be much higher at those settings. I think cards get 25-26k at default.


Default gets 24k; maybe a PCIe 3.0 bottleneck.


----------



## SilenMar

mirkendargen said:


> The entire point of HDR is extremely bright highlights contrasting against non-bright elements (and humans perceive brightness by contrast, not by absolute nits). Tanning yourself with a full screen at 1600 nits won't do anything but make your eyes adjust, and serves no purpose. Being able to have a 0 nit pixel next to a 700+ nit pixel is excellent and more than enough to make you squint. No backlit display will ever do it as well; the future is emissive pixels (OLED, then microLED).
> 
> Not to mention the motion clarity of OLED compared to the transition time of LCD. I'm aware some people consider OLED "too perfect" in that regard and ultimately unpleasant, but it leaves room for ultra-high-refresh displays.


The point of HDR is the range. 
2% of peak brightness is rather pathetic. There is a difference between 1,800 nits at any window size next to 0 nits, and 2% ABL-limited 700 nits next to 0 nits. There is also a difference in color coverage. 
I have HDR footage/pictures that crash OLED all the time. Sometimes it's funny on OLED when the night is brighter than daylight due to ABL. 
This is why HDR content is graded on professional monitors like the PA32UCG.


----------



## Silent Scone

Laithan said:


> ^ Exactly
> 
> I think since 4K monitors were introduced to retail the "dream" has always to be able to run @4K _*and*_ high refresh rates at the same time. The 4090 is arguably the first GPU to "get the cake and eat it too" and fully realize the best visuals with high referesh rates. Anyone that buys a 4090 and still has a 4K/60Hz monitor should be looking for a higher refresh rate ASAP for this very reason so I agree that the refresh rate should take priority.
> 
> Superwide monitors are almost always going to have less pixel density than 4K (plus the added supported resoution challenges as a gamer may be a factor). Just something to consider as a 4090 owner. There's no better time to up your pixel count game.
> 
> 3440 x 1440 = 4,953,600 pixels
> 3840 x 2160 = 8,294,400 pixels


What if you'd rather just supersample the hell out of everything? Don't discriminate! 😄


----------



## man from atlantis

So I just ordered a Palit GameRock OC, waiting for the mailman to hit the door


----------



## pat182

SilenMar said:


> This is an unpopular topic. OLED is just limited.
> I have a PA32UCG that outputs 1800 nits with 1,000,000:1 contrast, which makes a game look completely different.
> It's not something OLED can do with only 70% Rec 2020 color space coverage and super aggressive ABL, plus flickering. These 45" and 27" models don't even have HDR 400 stickers, while true HDR monitors are just beginning to show the edge of 12-bit color, which needs 4000 nits to display properly.


i have the pg32uqx, so almost the same gaming hdr experience. i wouldn't trade a little local dimming bloom with 1800 nit peaks for a 300 nit oled experience lol
kinda miss my pg27uq too, it was a good monitor with so much ppi, 32 inch is nice tho


----------



## ttnuagmada

SilenMar said:


> The point of HDR is the range.
> 2% of peak brightness is rather pathetic. There is a difference between any window size 1,800nits next to 0nits and ABL 2% 700nits next to 0nits. There is also a difference in color coverage.
> I have HDR footages/pictures that crash OLED all the time. Sometimes it is funny on OLED when the night is brighter than daylight due to ABL.
> This is why the HDR content is graded on professional monitors like PA32UCG.


Lol.

No. For starters, your PA32UCG isn't doing anything close to 0 nits next to 1800 nits. Maybe halfway across the screen it is. A starfield would make that thing curl up into the fetal position. Your post reads like someone whose entire OLED experience is derived from reading about them.


----------



## yzonker

ttnuagmada said:


> Lol.
> 
> No. For starters, your PA32UCG isn't doing anything close to 0 nits next to 1800 nits. Maybe halfway across the screen it is. A starfield would make that thing curl up into the fetal position. Your post reads like someone whose entire OLED experience is derived from reading about them.


Just quoting specs too, which displays rarely achieve in real world use.

All of the display technologies have tradeoffs. One of my other expensive hobbies is home theater. In that world you get endless debate between flat panel and front projector people. 

For me I wouldn't be happy at all with any of these smaller 32" displays being discussed. Even if there is an image quality loss in whatever way, I'd much rather have a large display that fills my field of view.


----------



## ttnuagmada

yzonker said:


> Just quoting specs too, which displays rarely achieve in real world use.
> 
> All of the display technologies have tradeoffs. One of my other expensive hobbies is home theater. In that world you get endless debate between flat panel and front projector people.
> 
> For me I wouldn't be happy at all with any of these smaller 32" displays being discussed. Even if there is an image quality loss in whatever way, I'd much rather have a large display that fills my field of view.


His posts read like people from a few years ago, before OLED became widespread. Now you won't see anyone anywhere arguing that any kind of LCD is better, other than people who obviously still haven't experienced an OLED first-hand. Absolute nits are not king. Enough nits and perfect blacks are king. FALD is nothing but a fancy trick. Static contrast cannot be beaten; that's why 13-14 year old Kuro plasmas still best any LCD in a dark room for SDR content.


----------



## ttnuagmada

SilenMar said:


> I calculated Rec.2020 coverage at 85%. Even PG32UQX has 85%.


With what, a Spyder probably? lol


----------



## pat182

ttnuagmada said:


> Lol.
> 
> No. For starters, your PA32UCG isn't doing anything close to 0 nits next to 1800 nits. Maybe halfway across the screen it is. A starfield would make that thing curl up into the fetal position. Your post reads like someone whose entire OLED experience is derived from reading about them.


you dont really need 0 nit unless you game in a dark room with all the lights out. local dimming goes so low that you wont notice any light. i have a dual monitor setup and the black on a fald display is so ink black vs a normal ips panel

fald has tradeoffs, so does oled. unless they come up with a panel that does 1000 nit at 50% or 80% window, oled isnt that attractive vs a good fald display

tbh im waiting on an ultrawide 2160p 43 inch with fald to upgrade, those wont come for another 2 years i guess.


----------



## ttnuagmada

pat182 said:


> you dont really need 0 nit unless you game in a dark room with all the lights out. local dimming goes so low that you wont notice any light. i have a dual monitor setup and the black on a fald display is so ink black vs a normal ips panel
> 
> fald has tradeoffs, so does oled. unless they come up with a panel that does 1000 nit at 50% or 80% window, oled isnt that attractive vs a good fald display


No one is clamoring for 1000 nits at 50 or 80%. Nothing you're watching or playing calls for that 99.999% of the time. Also, what do you think most of us are doing with our OLEDs? Of course we're viewing stuff with the lights down; FALD isn't very important either if you're not. And people are talking like LCD doesn't also have ABL; it might not be quite as aggressive, but it's 100% still going on to a large degree. The vast majority of the world has come to learn over recent years that a picture that's clearly superior 99% of the time, especially in a dark room, is preferable to a quick flash of brightness being a little bit brighter in a very tiny fraction of existing content.

All display technologies have tradeoffs, but you're giving way way way way too much weight to something that barely ever matters.


----------



## jonnyzeraa

jonnyzeraa said:


> Hey guys, here in Brazil there are 4 RTX 4090 models available to buy: MSI Suprim X Liquid ( 2868$ ), MSI Gaming Trio ( 2730$ ), Gigabyte Gaming OC ( 2824$ ) and Zotac AMP Extreme AIRO ( 2315$ ). Which one do you guys recommend I buy?


any suggestions for this super confused guy?


----------



## Glottis

jonnyzeraa said:


> any suggestions for this super confused guy?


I know component prices in Brazil can be insane, but just go for the cheapest Zotac. There's no point paying 500$ more for other cards, it's not like you'll be getting 500$ worth of extra performance.


----------



## Sheyster

pat182 said:


> 32inch is nice tho


32 inch 16:9 is not nearly immersive enough for me. I've had a Samsung 32" parked in the corner since the OLED hit the desk. Also, CX/C1/C2 HDR is much higher than 300 nits; it's more like 700-800 nits.


----------



## Sheyster

Glottis said:


> I know component prices in Brazil can be insane, but just go for the cheapest Zotac. There's no point paying 500$ more for other cards, it's not like you'll be getting 500$ worth of extra performance.


Agree, at those prices definitely the Zotac. Easy to update the BIOS if needed.


----------



## SilenMar

ttnuagmada said:


> Lol.
> 
> No. For starters, your PA32UCG isn't doing anything close to 0 nits next to 1800 nits. Maybe halfway across the screen it is. A starfield would make that thing curl up into the fetal position. Your post reads like someone whose entire OLED experience is derived from reading about them.


You are talking about a professional monitor used to grade HDR for other displays, including OLED. A few pixels apart there is 1,000,000:1 contrast. Nobody grades at the pixel level though; a little bit of bloom won't do anything.

Speaking like this only exposes how limited what you've seen is. You probably have never seen an HDR starfield where the stars are 2,000 nits and the APL is well over 400 nits.


----------



## Sheyster

SilenMar said:


> You are talking about a professional monitor used to grade HDR for other displays, including OLED. A few pixels apart there is 1,000,000:1 contrast. Nobody grades at the pixel level though; a little bit of bloom won't do anything.
> 
> Speaking like this only exposes how limited what you've seen is. You probably have never seen an HDR starfield where the stars are 2,000 nits and the APL is well over 400 nits.


I think we can all agree that these monitors are very different beasts. This monitor of yours has its uses for pro content. The basic fact is, no one will actually buy it for gaming and streaming, except maybe 1 in a million users. There are too many drawbacks, including the price itself. Plus, it's really not immersive enough in this day and age.


----------



## jonnyzeraa

Glottis said:


> I know component prices in Brazil can be insane, but just go for the cheapest Zotac. There's no point paying 500$ more for other cards, it's not like you'll be getting 500$ worth of extra performance.





Sheyster said:


> Agree, at those prices definitely the Zotac. Easy to update the BIOS if needed.


Will the performance between these 4 models be the same?

Btw, the Zotac AIRO's price is actually really close to the other 3, but on the site selling it I have a 10% birthday discount code, which is why the Zotac is cheaper than the others.


----------



## ttnuagmada

SilenMar said:


> You are talking about a professional monitor used to grade HDR for other displays, including OLED. A few pixels apart there is 1,000,000:1 contrast. Nobody grades at the pixel level though; a little bit of bloom won't do anything.


There are no professional grading houses using that display for anything. Several hundred pixels apart, maybe. 1000 zones is not many zones. I'd be surprised if it could do 20,000:1 ANSI. There aren't any other FALD displays out there, even with more zones, that do much better. 



> Speaking like this only exposes how limited you can see. You probably have never seen HDR starfield where the stars are 2,000nits and APL is well over 400nits.


2000 nits? Stars at 200 nits would expose the PA32UCG.


----------



## Nd4spdvn

pat182 said:


> you dont really need 0 nit unless you game in a dark room with all the lights out. local dimming goes so low that you wont notice any light. i have a dual monitor setup and the black on a fald display is so ink black vs a normal ips panel
> 
> fald has tradeoffs, so does oled. unless they come up with a panel that does 1000 nit at 50% or 80% window, oled isnt that attractive vs a good fald display
> 
> tbh im waiting on an ultrawide 2160p 43 inch with fald to upgrade, those wont come for another 2 years i guess.


Even though this is kinda off topic for the thread (sorry all), it looks to me like you are preaching your perceptions as if they were actual facts or even common sense. Let me tell you that some of us in this very PC hardware thread may have studied color science and related hardware, for both content consumption and measurement assessment, for more than 15 years. We may even be more passionate about this other hobby (for some, a profession) than about PC builds and overclocking, with some having owned multiple display devices over the years, including the above-mentioned Kuro, which is still unsurpassed in terms of color purity reproduction.

And even though we moved on from yesterday's technologies to OLED as a natural successor with new merits of its own, we also understand its shortcomings intimately, none of which were actually pointed out by you; you simply repeated the same narrative we've heard over the years. The simple fact is that black level is the canvas of a display, responsible not just for contrast ratio as a number but also for _perceived_ contrast. As such, OLED, with its 0 black and per-pixel control, creates the natural, necessary ground for color reproduction as pure as current tech allows, as well as (infinite) contrast, whether we talk SDR or HDR, with HDR tending to look spectacular on properly EOTF-calibrated OLED displays (such as the LGs, which come pretty much accurate out of the box in their HDR modes).

Downplaying this fact shows that you either do not understand the fundamentals or simply don't have enough experience in this field. Or both.


----------



## SilenMar

Sheyster said:


> I think we can all agree that these monitors are very different beasts. This monitor of yours has its uses for pro content. The basic fact is, no one will actually buy it for gaming and streaming, except maybe 1 in a million users. There are too many drawbacks, including the price itself. Plus, it's really not immersive enough in this day and age.


The major drawback is still the price. If money were not an issue, everybody would use monitors like the PA32UCG to see the higher range of HDR. There are large microLED TVs available as well.
I have seen all kinds of OLED, including the latest variations of QD-OLED and JOLED, such as the PA32DC, AW3423DW, and S95B. The range is still very limited, and they all inherit the flicker. OLED is just fast; nothing else special, and with its own problems.




ttnuagmada said:


> No one is using that display for any type of professional content grading.


Really funny. Professionals like Jacob Schwarz use it. Check his work on YouTube.
You probably won't notice the differences between Alaska and Bulgaria. The Alaska rainy day was graded with an HDR 400 OLED, the PA32DC, while Bulgaria was graded with the HDR 1000 miniLED PA32UCX. The range is different. OLED can only be used at the low end of the range.


----------



## ttnuagmada

SilenMar said:


> Really funny. Professionals like Jacob Schwarz use it. Check his work on YouTube.
> You probably won't notice the differences between Alaska and Bulgaria. The Alaska rainy day was graded with an HDR 400 OLED, the PA32DC, while Bulgaria was graded with the HDR 1000 miniLED PA32UCX. The range is different. OLED can only be used at the low end of the range.


5 minutes on Lift Gamma Gain will show you that no one takes this monitor nearly as seriously as you're insisting. I'm not calling it a bad monitor, but trying to act like it's superior to an OLED for much of anything beyond peak brightness is making you look a little ill-informed.


----------



## Sheyster

Nd4spdvn said:


> As such, OLED, with its 0 black, per pixel control technology is creating the natural, necessary ground for as pure as possible color reproduction (given current tech limitations) as well as (infinite) contrast levels no matter if we talk SDR and/or HDR with HDR tending to look spectacular on properly EOTF calibrated OLED displays (such as the LGs as they come pretty much accurate oob in their HDR modes).


A lot of these recent anti OLED posts are obviously coming from folks who don't use an OLED. I can't imagine using anything else at this point.

You're right about HDR mode being dialed in well OOB on the LG OLEDs, particularly the C2. Minimal tweaking is needed there. SDR is another story though, it's much too "blue" OOB and Warm 50 color temp (or at least 30) is required. On the older LG models it's Warm 2 (or Warm 1).


----------



## SilenMar

ttnuagmada said:


> 5 minutes on Lift Gamma Gain will show you that no one takes this monitor nearly as seriously as you're insisting. I'm not calling it a bad monitor, but trying to act like it's superior to an OLED for much of anything beyond peak brightness is making you look a little ill-informed.


Another 5 minutes on Lift Gamma Gain will show you how few HDR editors are on that forum. Most of them are still doing SDR work. 
Saying no professionals use this monitor is a joke.
It's also one of the few monitors where you can actually calibrate HDR; there aren't many monitors where HDR can be calibrated.


----------



## ttnuagmada

SilenMar said:


> Another 5 minutes on Lift Gamma Gain will show you how few HDR editors are on that forum. Most of them are still doing SDR work.
> Saying no professionals use this monitor is a joke.
> It's also one of the few monitors where you can actually calibrate HDR; there aren't many monitors where HDR can be calibrated.


It's also not superior to an OLED.

It, just like every single, solitary other FALD LCD that has ever existed, cannot compete with an OLED in terms of real contrast. This is a debate from 5 years ago. No one is having it anymore.


----------



## SilenMar

ttnuagmada said:


> It's also not superior to an OLED.
> 
> It, just like every single, solitary other FALD LCD that has ever existed, cannot compete with an OLED in terms of real contrast. This is a debate from 5 years ago. No one is having it anymore.


It is in fact a lot superior to an OLED: no ABL, no flickering, no low color gamut. 

Everybody knows how dim OLED looks with ABL. All the pixel-level contrast is wasted; it cannot even display correct color. Don't forget color is lifted by brightness. 

This is why true HDR monitors are all these FALD LCDs or dual-layer panels with sustained brightness, higher gamut, and flicker-free DC dimming.


----------



## yzonker

Sigh...


----------



## Sheyster

yzonker said:


> Sigh...


LOL, indeed.. I feel like I started all this by simply asking if anyone was pre-ordering the 45" LG Curved OLED.


----------



## ttnuagmada

yzonker said:


> Sigh...


Yeah, I'm out. This feels like a convo from 2015.


----------



## Nd4spdvn

Sheyster said:


> You're right about HDR mode being dialed in well OOB on the LG OLEDs, particularly the C2. Minimal tweaking is needed there. SDR is another story though, it's much too "blue" OOB and Warm 50 color temp (or at least 30) is required. On the older LG models it's Warm 2 (or Warm 1).


I did not mention EOTF by accident. You are also right about out-of-the-box greyscale accuracy, which tends not to be perfect and can sometimes be improved significantly by calibration (particularly gamma tracking, and color via a 3D LUT), though it varies between the same models on different continents, with some coming much closer to target than others (for example, the US models tend to be bluer, is my understanding, whereas the EU ones come much closer to D65 OOB).

Edit: time to stop the off topic


----------



## ttnuagmada

SilenMar said:


> It is in fact a lot superior to an OLED: no ABL, no flickering, no low color gamut.
> 
> Everybody knows how dim OLED looks with ABL. All the pixel-level contrast is wasted; it cannot even display correct color. Don't forget color is lifted by brightness.
> 
> This is why true HDR monitors are all these FALD LCDs or dual-layer panels with sustained brightness, higher gamut, and flicker-free DC dimming.


The new Samsung panels have the highest gamut of any display short of laser projection, and they also don't flicker.



> Everybody knows how dim OLED looks with ABL.


How to tell someone you've never used an OLED without telling them you've never used an OLED.


Anyhow, feel free to make a thread about this over on AVSforum. I'll gladly continue over there if you want. You'll quickly realize you exist in a bubble though.

Apologies to everyone else for derailing.


----------



## alasdairvfr

Sooo.... video cards!

I started a New Game+ in Witcher 3, rather than running around 'busy areas', to see how the game would run on a new playthrough. I noticed that cutscenes get _really_ bad frame drops, which causes issues with Frame Generation as it drops below the G-Sync range, causing flickering and of course stuttering. In actual gameplay it seems fine (I've only tested like the first 15-20 min of the game), but in pretty much every cutscene the framerate drops from 115-130 down to like 60 (with Frame Generation turned on) before it recovers. It's pretty unpleasant.

Anyone else seeing this behaviour?


----------



## SilenMar

ttnuagmada said:


> The new Samsung panels have the highest gamut of any display short of laser projection, and they also don't flicker.
> 
> 
> 
> How to tell someone you've never used an OLED without telling them you've never used an OLED.


I have used all kinds of OLED. 

Speaking like this only exposes that you are hallucinating right now.


----------



## SilenMar

And here is a 2,000 nit starfield. 

Open this with Windows Photos in HDR mode. They are supposed to be light-yellow stars, not brown stars. Only ABL can make brown stars.

Starfield.jxr (drive.google.com)

Change the picture size from smaller to fullscreen to check the ABL of OLED on the starfield.


----------



## ttnuagmada

SilenMar said:


> And here is a 2,000 nit starfield.
> 
> Open this with Windows Photos in HDR mode. They are supposed to be light-yellow stars, not brown stars. Only ABL can make brown stars.
> 
> Starfield.jxr (drive.google.com)
> 
> Change the picture size from smaller to fullscreen to check the ABL of OLED on the starfield.



lol. That's not a starfield. Of course you would use something like that, with no black in it, as an example. A real starfield with 200 nit stars would make your monitor look like a washed-out mess.


----------



## SilenMar

ttnuagmada said:


> lol. That's not a starfield. Of course you would use something like that, with no black in it, as an example. A real starfield with 200 nit stars would make your monitor look like a washed-out mess.


You should suggest this to the movie makers, to make a beautiful scene with white stars on a black background. 
I have said that what you see is limited, so your impression of a starfield is only 200 nit, ABL-dimmed brown stars. What you see with ABL is in fact a mess.
Funny, 200 nit stars on a black background don't even have much noticeable blooming. That's the strength of a professional monitor.
Keep talking. I have seen a lot of your baseless claims.


----------



## J7SC

jonnyzeraa said:


> any suggestions for this super confused guy?


...most 4090s are fairly close in performance to each other (subject to 'the lottery', including on VRAM). If budget is not the overriding concern, I would go for the MSI Suprim X Liquid or the Gigabyte Gaming OC (the model I use, though it was also the 'cheapest' here)....or any other 4090 that has dual vbios. That said, the Zotac just looks like the best buy in your market...



SilenMar said:


> It is in fact a lot superior to an OLED without ABL, flickering, low color gamut.
> 
> *Everybody knows* how dim OLED looks with ABL. All the pixel level contrast is wasted. It cannot even display correct color. Don't forget color is lifted by brightness.
> 
> This is why true HDR monitors are all these FALD LCDs or dual-layers with sustained brightness, higher gamut, flicker-free DC dimming.


When you use terms like "everybody knows..." in an argument, it speaks volumes....



Sheyster said:


> LOL, indeed.. I feel like I started all this by simply asking if anyone was pre-ordering the 45" LG Curved OLED.


...yes, it's all your mistake ! Now you have to play all your games on this for a month for penance...


----------



## ttnuagmada

SilenMar said:


> You should suggest this to the movie makers to make a beautiful scene with white stars in black background.
> I have said what you see is limited so your impression of starfield has only 200nits ABL brown stars. It's in fact what you see is a mess with ABL.
> Funny 200nits stars in black background don't even have much noticeable blooming. It's the strength of a professional monitor.
> Keep talking. I have seen a lot of your baseless claims.


This isn't even coherent, and you're either being intellectually dishonest or simply don't understand how an FALD LCD works if you don't think a starfield with even 100 nits stars would reduce your monitor to essentially dumpster IPS static contrast. There's a reason that no one is using IPS panels in high-end televisions FALD or not. There's also the whole part where situations that would trigger ABL to the point of even being noticeable pale in comparison to the situations where blooming would be obvious with 1000 zones on an IPS panel. 

edit: mods, can you delete this whole conversation? I don't have enough self-control to stop arguing.


----------



## SilenMar

ttnuagmada said:


> This isn't even coherent, and you're either being intellectually dishonest or simply don't understand how an FALD LCD works if you don't think a starfield with even 100 nits stars would reduce your monitor to essentially IPS static contrast.
> 
> edit: mods, can you delete this whole conversation? I don't have enough self-control to stop arguing.


Speaking like this only exposes that it's you who doesn't understand how FALD works.

FALD is specifically designed to increase the static contrast from 1000:1 to 1,000,000:1. 
And the blooming of miniLED is more pronounced on a grey background, not a black one. Otherwise the showcase footage for miniLED displays wouldn't all use black backgrounds.
Keep talking.


----------



## yzonker

ttnuagmada said:


> This isn't even coherent, and you're either being intellectually dishonest or simply don't understand how an FALD LCD works if you don't think a starfield with even 100 nits stars would reduce your monitor to essentially dumpster IPS static contrast. There's a reason that no one is using IPS panels in high-end televisions FALD or not. There's also the whole part where situations that would trigger ABL to the point of even being noticeable pale in comparison to the situations where blooming would be obvious with 1000 zones on an IPS panel.
> 
> edit: mods, can you delete this whole conversation? I don't have enough self-control to stop arguing.


There's an ignore feature on the forum.

Edit: and I just added an additional user to it...


----------



## jonnyzeraa

J7SC said:


> ...most 4090s are fairly close in performance to each other (subject to 'the lottery', including on VRAM). If budget is not the overriding concern, I would go for the MSI Suprim X Liquid or the Gigabyte Gaming OC (the model I use, though it was also the 'cheapest' here)....or any other 4090 that has dual vbios. That said, the Zotac just looks like the best buy in your market...


Yeah, I guess my choice will be the Zotac AIRO. I'll save the extra $500-600 for a future mobo/CPU/RAM upgrade; there's no reason to pay that big a difference for the same performance. And if I chose the Suprim X Liquid, I'd need to make some changes in my Lancool 3: I have an H170i 420mm on top and 3x 140mm fans in front, so I'd either have to move the 420mm to the front or pull the front 140mm fans for the Suprim's radiator and maybe keep just one 140mm, idk. That's a lot of extra money and extra work too lmao, since mounting a radiator below the GPU is not recommended, right...


----------



## ttnuagmada

SilenMar said:


> Speaking like this only exposes it is you don't understand how FALD works.
> 
> FALD is specifically designed to increase the static contrast from 1000:1 to 1million:1.


🤦‍♂️



> And the blooming of miniLED is more pronounced in a grey background, not in a black background. Otherwise all the showcase footages of miniLED display won't even use black background.
> Keep talking.


I have a thought experiment for you:

The panel in the PA32UCG has about a 1400:1 static contrast, right? Let's say you throw up a starfield that has a few thousand stars, all calling for 200 nits. Each zone on your 1000-zone monitor is going to have to light up at least a few of these stars. Keeping that in mind, how would your monitor be able to display absolute black, anywhere on the screen, at less than 0.14 nits (200 / 1400)?
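The arithmetic behind that question can be sketched in a few lines (the 1400:1 and 200-nit figures are just the ones used in this example, not measurements of any particular panel):

```python
# With FALD, once a zone's backlight must sustain a bright highlight,
# the darkest black inside that zone is limited by the panel's static contrast.

def black_floor_nits(peak_nits: float, static_contrast: float) -> float:
    """Minimum achievable black level inside a zone whose backlight
    must sustain `peak_nits` through a panel with `static_contrast`."""
    return peak_nits / static_contrast

# 200-nit stars on a ~1400:1 panel:
floor = black_floor_nits(200, 1400)
print(f"{floor:.3f} nits")  # ~0.143 nits, the 0.14 figure above
```

The same formula gives ~0.36 nits for 500-nit stars, which is where the later ".35 nits" number in this argument comes from.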


----------



## Brads3cents

Since we are talking about monitors: I don't think an endgame monitor has been released yet. I've been using the same monitor since 2018, a 4K 144Hz 27" IPS panel.
Yeah, there are 240Hz panels, but they're lacking in other areas.
Yeah, there are OLEDs, but I don't want to take a PPI penalty by going bigger.

I think endgame is coming with the 8K ultrawides.
It's basically the same PPI as 4K at 27", but with a massive screen. At the same time it's significantly easier to drive than true 8K, and the 4090 will easily tackle this resolution with DLSS.

Unfortunately, I want OLED, so it may be a while longer.
Which is fine, since the current OLEDs will get dated fast. Soon the monitor market will be getting EX technology trickling down from TVs, which is great, along with much better protection against burn-in.
But the real big technology to come is MLA (micro lens array). This will effectively make the colors as good as or better than QD-OLED, but with all the gaming features and superior subpixel layout of LG OLED. And true HDR!
Unfortunately, I'm not sure we'll get such a panel in 2023... I might have to wait until 2024 for the ultimate gaming monitor.
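For what it's worth, the pixel math behind "easier to drive than true 8K" checks out; a quick sketch (assuming "8K ultrawide" means 7680x2160, i.e. a double-wide 4K panel — actual products may differ):

```python
# Pixel counts for the resolutions being compared above.

def megapixels(w: int, h: int) -> float:
    return w * h / 1e6

uhd     = megapixels(3840, 2160)   # 4K (e.g. a 27" panel)
uw_8k   = megapixels(7680, 2160)   # "8K ultrawide" (assumed dimensions)
true_8k = megapixels(7680, 4320)   # true 8K

print(uhd, uw_8k, true_8k)
# The ultrawide is 2x the pixels of 4K but only half of true 8K,
# and at double a 27" panel's width it keeps the same PPI.
```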


----------



## SilenMar

J7SC said:


> When you use terms like "everybody knows..." in an argument, it speaks volumes...


Everybody has an OLED TV. Everybody sees what it looks like. But not everybody knows the difference until there is a comparison.
Draw a square on the window of the picture you posted. That is what a true HDR monitor looks like compared to OLED.











yzonker said:


> There's an ignore feature on the forum.


Ignorance is bliss; that way you'll never know anything better.



ttnuagmada said:


> The panel in the PA32UCG has about a 1400:1 static contrast right? Lets say you throw a starfield up, that has a few thousand stars all calling for 500 nits. Each zone on your 1000 zone monitor is going to have to light-up at least a few of these stars. Keeping that in mind, how would your monitor be able to display absolute black, anywhere on the screen, at less than .35 nits?


Funny how you are always imagining.
I have thrown different kinds of starfield at this monitor. Even 1,800-nit closely spaced stars on a black background aren't a problem. The only starfield that pushes the monitor to its limit is 1,800-nit dots on a grey background.


----------



## ttnuagmada

SilenMar said:


> Funny you are always imaging.
> I have thrown different kinds of starfield on this monitor. Even 1,800nits close-spaced stars on the black background won't be a problem. The only starfield that makes the monitor shows its limit is 1,800nits dots on a grey background.


So you have one of those magic monitors that somehow completely defies physics? Yeah, a physics defying LCD is the only thing that would look better than an OLED, so I guess you win!


----------



## SilenMar

ttnuagmada said:


> So you have one of those magic monitors that somehow completely defies physics? Yeah, a physics defying LCD is the only thing that would look better than an OLED, so I guess you win!


There is nothing magic about it. It's simply the result of engineering from thousands of miniLED display suppliers. 
The semiconductor industry is all about vigorous supply chains. It is obvious who has the upper hand compared to the limited supply chains of OLED, where the flickering has never been solved. 
The density of FALD will only go up. And you can even buy a microLED TV right now.


----------



## ttnuagmada

SilenMar said:


> There is nothing magic about it. It's simply the result of engineering from thousands of miniLED display suppliers.
> The semiconductor industry is all about vigorous supply chains. It is obvious who has the upper hand compared to the limited supply chains of OLED where the flickers are never solved.
> The density of FALD will only go up. And you can even buy a microLED TV right now.


So, let's do this one more time: how can a panel with 1400:1 static contrast display something at 500 nits in every zone without black being at 0.35 nits at the same time? So far your explanation is some hand-wavy "vigorous supply chains". What you're really telling us is that you don't understand how LCD static contrast and FALD work on even a basic level.


----------



## SilenMar

ttnuagmada said:


> So, lets do this one more time: how can a panel with 1400:1 static contrast, display something at 500 nits in every zone, without black being at .35 nits at the same time? So far your explanation is some hand-wavy "vigorous supply chains". Though what you're really telling us is that you don't seem to understand how LCD static contrast and FALD work on even a basic level.


Check my previous reply. You don't really understand anything.

I'm afraid it has 1,000,000:1 contrast when FALD is enabled. Static contrast doesn't even come into it: the backlight can be shut off, and each zone has multiple LEDs, not just one. This is why it is a $5,000 true HDR professional monitor with 1,600-nit sustained brightness and the widest color gamut.

On the other hand, good luck doing your HDR grading on an ABL-limited, flickering, colorless OLED TV where the color is not even correct. 

One day it is going to be either a dual-layer LCD or a future microLED monitor, if I want to do pixel-level grading of each frame out of 8 million 4K pictures lol.


----------



## kryptonfly

Well, there was indeed a bad mounting from Gigabyte, the pressure was bad around the GPU, 53°C at 250W at stock :








I mounted the Bykski block with 1mm pads on the vram and 1.5mm on the vrms. It's clearly better! I could keep almost the same vram temp as before, around 60°C.
I scored 29 294 in Port Royal
I scored 11 397 in Speed Way

It's really hard to gain in PR, but Speed Way is much easier, let's say. PR is more sensitive to the vram: I got black screens at lower vram speeds than in SW. PR should be a little higher, I think; I can't do more for now, but SW is nice. I gained around +200 pts with the Galax bios. It actually behaves differently, especially the effective clock, which is more in line with the set frequency.


----------



## ttnuagmada

SilenMar said:


> Check my previous reply. You don't really understand anything.
> 
> I'm afraid it has 1million:1 contrast when the FALD is enabled.


You wouldn't be able to find me an ANSI measurement showing it capable of even 1/20th of that. Your display also wouldn't do anything over 1400:1 contrast if it had to display a starfield like the one discussed.



> Static contrast was not even there. The backlight can be shut off. Each zone has multiple LEDs, not just one.


The number of LEDs is absolutely irrelevant; the number of addressable zones is what is relevant. 


I'm glad you finally confirmed in writing for everyone here that you don't understand how FALD works.


----------



## yzonker

kryptonfly said:


> Well, there was indeed a bad mounting from Gigabyte, the pressure was bad around the GPU, 53°C at 250W at stock :
> View attachment 2589013
> 
> 
> I mounted the Bykski block with 1mm pads on the vram and 1.5mm on the vrms. It's clearly better ! I could keep almost the same vram temp than before, around 60°C.
> I scored 29 294 in Port Royal
> I scored 11 397 in Speed Way
> 
> It's really hard to pass PR but Speedway is much easier let's say. PR is more sensitive to the vram because I got blackscreens with lower vram speed than SW. PR should be a little higher I think, I can't do more for now but SW is nice. I gained around +200 pts with Galax bios. Actually it's a different behavior especially about the effective clock more in line with the set freq.


Yea you look a little soft on PR for some reason. 3 bins more core, but a lower score than what I did the other day.

Result (3dmark.com) 

SW looks right though, same mem and 3 bins more core. (it doesn't add much though)

Result (3dmark.com)


----------



## mirkendargen

Ok, I tried to resist but I really couldn't. HDR content by real professionals is mastered on displays with dual-layer LCD films to ACTUALLY prevent bloom. You don't have one of those. They cost a ton, they aren't going to be 120Hz, and they're loud as hell because they need fans to cool the massive backlights and the heat absorbed by the dual-layer films. Example: XM311K - 31 Inch 4K HDR Mastering Monitor

If you can't understand that your display has 1,xxx individual backlights, and 8,xxx,xxx individual pixels, I don't know what to tell you. You're too ignorant (willfully or otherwise) to keep up with the conversation.

So anyway, how about some video cards? Has anyone actually used a monitor with DSC? Is it truly lossless or just "visually lossless"? Is it possible to see a difference? That's the fastest route to 4K 240Hz on a 4090. Or monitors/TVs start shipping with DP 2.1 and Nvidia puts it on a 4090 Ti just to make us rage.
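A rough link-budget sketch of why DSC is the route to 4K 240Hz over today's DisplayPort (ignoring blanking intervals and protocol overhead, so real requirements are somewhat higher, and assuming DSC's typical ~3:1 compression):

```python
# Compare the raw pixel stream for 4K 240Hz 10-bit RGB against what
# DP 1.4 (HBR3 x4 lanes, after 8b/10b coding) can carry.

def raw_gbps(w: int, h: int, hz: int, bits_per_pixel: int) -> float:
    return w * h * hz * bits_per_pixel / 1e9

DP14_PAYLOAD_GBPS = 25.92  # 4 lanes x 8.1 Gbps, minus 8b/10b coding overhead

uncompressed = raw_gbps(3840, 2160, 240, 30)  # 10 bits per channel, RGB
with_dsc = uncompressed / 3                    # ~3:1 DSC

print(f"{uncompressed:.1f} -> {with_dsc:.1f} Gbps")
# ~59.7 Gbps uncompressed overflows DP 1.4's ~25.9 Gbps payload,
# but ~19.9 Gbps with DSC fits.
```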


----------



## StreaMRoLLeR

Hey everyone. Got the card yesterday

-Crazy amount of coil whine
-Mem temp is max 66°C (not the bad batch that hits 80°C on the mems); core temp max 53°C at 530W
-Rebar on
-Gaming Windows, not stripped
-Standard bios
-Latest version from Steam









I scored 28 682 in Port Royal


Intel Core i7-13700KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com




I'm a bit late to the party. Everyone has upgraded to the Galax bios and is doing 29k runs with ease. Fortunately my effective clock is very good.

My chip does a max of 3030 in PR, and that's what's limiting me. I get artifacts after +1700, so I stop at +1672.









Btw, I am actively reporting the old fake "before patch" artifact scores. Support is surprisingly fast at deleting them.

@SoldierRBT


----------



## SilenMar

ttnuagmada said:


> You wouldn;'t be able to find me an ANSI measurement showing it was capable of even 1/20th of that. Your display also wouldn't do anything over 1400:1 contrast if it was having to display a starfield like the one discussed.
> 
> 
> 
> The number of LED's is absolutely irrelevant. the number of addressable zones is what is relevant.
> 
> 
> I'm glad you finally confirmed in writing for everyone here that you don't understand how FALD works.


You are hallucinating again.

I have tested starfields more than you have. Even 1,800-nit dots in each zone, with a slightly wider blooming pattern, still give 1,000,000:1 contrast. The monitor has physical grids to block excess light coming out of each zone. Around the edge of each zone it is completely black.

Funny how far imagination can go. An ABL OLED is not even comparable; it can't do the things a true HDR monitor does.


----------



## yzonker

StreaMRoLLeR said:


> Hey everyone. Got the card yesterday
> 
> -Crazy amount of Coil-Whine
> -Mem temp is max 66c ( not that bad batch which have 80c mems ) Core temp max 53 at 530W
> -Rebar on
> -Gaming windows, not stripped
> -Standart bios.
> -Lastest version from steam
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 682 in Port Royal
> 
> 
> Intel Core i7-13700KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> Im a bit late to party. Everyone is upgraded to Galax bios and doing 29k runs ez. Fortunately my effective clock is very good
> 
> My chip is doing max 3030 in PR. And thats what limiting me. Getting artifact AFTER +1700 so i stop at 1672.
> View attachment 2589023
> 
> 
> Btw I am actively reporting old fake " before patch " artifact scores. Support is suprisingly fast and deleting them


Before which patch?


----------



## alasdairvfr

mirkendargen said:


> So anyway, how about some video cards? Has anyone actually used a monitor with DSC? Is it truly lossless or just "visually lossless"? Is it possible to see a difference? That's the fastest route to 240hz 4k on a 4090. Or monitors/TV's start shipping with DP 2.1 and Nvidia puts that on a 4090ti just to make us rage.


My 4K 144Hz panel (MSI Optix MPG3231UR-QD) uses DSC to achieve that res/framerate and can do so at 10bpc.










I'm about to get a second one to replace my crappy Samsung that's on its way back to NewEgg.

I cannot discern the difference. I believe it doesn't need DSC at 120Hz, but I'm also not capable or qualified to determine whether there is any loss from the compression. I never see banding in fine colour gradients except in poorly compressed video (game cutscenes; most streaming services compress to the point that a wide colour gamut on the panel won't matter).

Not including DP 2.1 is a really frustrating decision on Nvidia's part given the price of the 4090. It's one of the reasons I'm buying more of these DSC panels that seem to work well, since a single cable, even with DSC, might not handle merging 3x 4K into a 4K super-ultrawide at high frame rates. For me, 144Hz at 4K is enough to make me happy; I just want to be able to get into surround gaming for a few games.


----------



## StreaMRoLLeR

yzonker said:


> Before which patch?


5.56.1143.0


----------



## SilenMar

mirkendargen said:


> Ok I tried to resist but I really couldn't. HDR content by real professionals is mastered on displays with dual layer LCD films to ACTUALLY prevent bloom. You don't have one of those. They cost a ton, they aren't going to be 120hz, they're loud as hell because they need fans to cool the massive backlights and heat absorption of the dual layer films. Example: XM311K - 31 Inch 4K HDR Mastering Monitor
> 
> If you can't understand that your display has 1,xxx individual backlights, and 8,xxx,xxx individual pixels, I don't know what to tell you. You're too ignorant (willfully or otherwise) to keep up with the conversation.


You'd better do more research on how a monitor works, how backlight works, and how color works.

I have seen all the attacks on FALD LCDs a hundred times a day, with whatever starfield you can throw. The professional monitors hold up very well.

Meanwhile, on real content, OLED cannot show pure whites or neutral grays. The white balance is always changing. A specific hue is either invisible or miles off compared to an HDR monitor. 

Of course, the PA32UCG is several miles away from a dual-layer LCD. But an OLED is hundreds of miles away from it.


----------



## ttnuagmada

SilenMar said:


> You are hallucinating again.
> 
> I have tested starfield more than you do. Even 1800nits dots in each zone with slightly wider blooming pattern still have 1milllion:1 contrast. The monitor has physical grids to block excessive light coming out of each zone. Around the edge of each zone is completely black.
> 
> Funny how much imagination can go. ABL OLED is not even comparable to do the things a true HDR monitor does.


What good is blocking out zone bleed if the zone next to it is lit just the same? You still don't get it. Your monitor isn't doing 1 million to 1 contrast in all but the most extreme, cherry picked conditions. You are choice-supportive-bias personified.

You own one of these right? So you must have a spectro/colorimeter if you're doing any color-critical work, surely. So by all means, take an ANSI measurement for us.


----------



## mirkendargen

SilenMar said:


> In the mean while, on the real content, OLED cannot show pure whites, neutral grays . The white balance is always changing. The specific hue is either invisible or miles off compared to a HDR monitor.











LG G2 OLED Review (OLED55G2PUA, OLED65G2PUA, OLED77G2PUA, OLED83G2PUA, OLED97G2PUA)

The LG G2 OLED is a high-end TV, and it's the successor to the LG G1 OLED. OLED TVs like the G2 are self-emissive, meaning unlike LCD panels found on other TVs, ...

www.rtings.com

https://www.tomshardware.com/reviews/asus-proart-pa32ucg/4



You realize your monitor has worse gray uniformity, worse white balance, and worse color accuracy than an LG OLED, right? No one's arguing with you on fullscreen brightness (just on the value of it), so you can keep that one. Something tells me you haven't seen or heard anything about OLEDs since 2015, or whenever it was that LG switched from RGB to white subpixels with filters on the... 6 series, I think it was?

Samsung has ever so slightly worse accuracy, but greater color range and brightness (and is what's more likely to end up in monitors).


----------



## SilenMar

ttnuagmada said:


> What good is blocking out zone bleed if the zone next to it is lit just the same? You still don't get it. Your monitor isn't doing 1 million to 1 contrast in all but the most extreme, cherry picked conditions. You are choice-supportive-bias personified.
> 
> You own one of these right? So you must have a spectro/colorimeter if you're doing any color-critical work, surely. So by all means, take an ANSI measurement for us.


It is you who are relying on the most extreme, cherry-picked conditions. I don't usually give much thought to starfields unless you bring them up.
When I show the starfield, OLED is destroyed by ABL. 
Better not hallucinate any further: I measure over 1,000,000:1 contrast at each zone on the monitor with the included X-Rite pro colorimeter lol.




mirkendargen said:


> You realize your monitor has worse gray uniformity, worse white balance, and worse color accuracy than an LG OLED, right? No one's arguing with you on fullscreen brightness (just on the value of it) so you can keep that one. Something tells me you haven't seen or heard anything about OLED's since 2015 or whenever it was that LG switched from RGB to white with filters on the....6 series I think it was?
> 
> Samsung has ever so slightly worse accuracy, but greater color range and brightness (and is what's more likely to end up in monitors).


I have a few true HDR monitors. The uniformity, white balance, and color accuracy are all better than on whatever ABL OLED you pick. 
OLED is the worst. It will be wrecked when miniLED goes to HDR 4000 and utilizes 12-bit color. 
Funny how people show their OLED next to a window, while a true HDR monitor can just look like a window. 

Check this at two exposure settings:
*True HDR 1000 vs AW3423DW HDR 1000*

















More of the similar


Spoiler


Then check this


----------



## GRABibus

kryptonfly said:


> Well, there was indeed a bad mounting from Gigabyte, the pressure was bad around the GPU, 53°C at 250W at stock :
> View attachment 2589013
> 
> 
> I mounted the Bykski block with 1mm pads on the vram and 1.5mm on the vrms. It's clearly better ! I could keep almost the same vram temp than before, around 60°C.
> I scored 29 294 in Port Royal
> I scored 11 397 in Speed Way
> 
> It's really hard to pass PR but Speedway is much easier let's say. PR is more sensitive to the vram because I got blackscreens with lower vram speed than SW. PR should be a little higher I think, I can't do more for now but SW is nice. I gained around +200 pts with Galax bios. Actually it's a different behavior especially about the effective clock more in line with the set freq.


Which Galax bios do you use?
I think I missed it, as everybody is talking about it.


----------



## yzonker

GRABibus said:


> Which Galax Bios do you use ?
> I think I missed it as everybody speaks about it











GALAX RTX 4090 VBIOS — 24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory

www.techpowerup.com


----------



## GRABibus

yzonker said:


> GALAX RTX 4090 VBIOS
> 
> 
> 24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory
> 
> 
> 
> 
> www.techpowerup.com


For sure, it is a nice benching bios.

Still on my veeeeerrrryyyy nice GIGABYTE GAMING OC, and still on stock air cooler 



















I scored 29 734 in Port Royal


AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





Max Power draw = 609W
Average GPU temperature = 56°C


----------



## GRABibus

kryptonfly said:


> Well, there was indeed a bad mounting from Gigabyte, the pressure was bad around the GPU, 53°C at 250W at stock :
> View attachment 2589013
> 
> 
> I mounted the Bykski block with 1mm pads on the vram and 1.5mm on the vrms. It's clearly better ! I could keep almost the same vram temp than before, around 60°C.
> I scored 29 294 in Port Royal
> I scored 11 397 in Speed Way
> 
> It's really hard to pass PR but Speedway is much easier let's say. PR is more sensitive to the vram because I got blackscreens with lower vram speed than SW. PR should be a little higher I think, I can't do more for now but SW is nice. I gained around +200 pts with Galax bios. Actually it's a different behavior especially about the effective clock more in line with the set freq.


Was rebar enabled during your PR run?
I ask because after a bios flash, the rebar settings in Nvidia Profile Inspector are reset to default, so they need to be set up again.


----------



## doom26464

5 pages of bickering about OLED tech that reads like a post from 4+ years ago.

Meanwhile I can't get any feedback on a cutting-edge encoder on a next-gen GPU in the 4090 owners thread. 

Forums sure are interesting lol.


----------



## J7SC

doom26464 said:


> 5 pages of bickering about oled tech which should be a post from 4+ years ago.
> 
> Meanwhile can't get zero feedback on a cutting edge encoder on a next gen gpu in the 4090 owners thread.
> 
> Forums sure are intesting lol.


...yesterday, I happened to read up on the RTX 6000 Ada (workstation card). The bottom paragraph on the linked page could be of 'peripheral' interest to you re. encoders


----------



## Talon2016

mirkendargen said:


> So anyway, how about some video cards? Has anyone actually used a monitor with DSC? Is it truly lossless or just "visually lossless"? Is it possible to see a difference?


I use 2 DSC monitors, a 1440p 240Hz AW2721D and an LG 27GN950-B 4K 160Hz. You cannot tell the difference between DSC and non-DSC; VESA conducted tests to prove this. 






FAQ - DisplayPort







www.displayport.org





_During the development of DSC, the contributing VESA member companies performed ongoing visual tests to uncover visual compression artifacts using different image types and image motion. This testing was used to fine tune the DSC codec (coding / decoding algorithm). For the final DSC codec, the visual performance of DSC was evaluated through clinical testing by VESA in collaboration with member companies. The evaluation included a statistically significant number of observers who viewed many images over four image categories including artificial engineered images, text and graphics, such as street maps or different examples of printed material, people, landscape, animals and stills. Overall, observers completed nearly 250,000 subjective image comparisons. VESA members also concluded subjective testing as a far more robust method to verify visually lossless quality rather than using objective metrics, such as, PSNR which typically designates one value for an image. The results of this testing indicated that DSC met the visually lossless criteria set by VESA. The details of testing methodology and results were published by VESA in early 2015._


----------



## SilenMar

doom26464 said:


> 5 pages of bickering about oled tech which should be a post from 4+ years ago.
> 
> Meanwhile can't get zero feedback on a cutting edge encoder on a next gen gpu in the 4090 owners thread.
> 
> Forums sure are intesting lol.


AV1 came out half a decade ago; YouTube already started gearing up for AV1 playback back in 2018. 
NVENC AV1 is about 40% more efficient than NVENC H.264, so image quality can be substantially better at the same bitrate (or similar quality at a much lower bitrate).
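Taking the "~40% more efficient" figure at face value (the usual reading is that the newer codec needs ~40% less bitrate for comparable quality — a simplification, since real gains vary by content), a quick sketch with an assumed example bitrate:

```python
# What a flat "40% more efficient" claim cashes out to in bitrate terms.

def equivalent_bitrate(h264_kbps: float, efficiency_gain: float = 0.40) -> float:
    """Bitrate the more efficient codec would need for roughly the
    same quality as an H.264 stream at `h264_kbps`."""
    return h264_kbps * (1 - efficiency_gain)

# An assumed 8000 kbps H.264 stream:
print(equivalent_bitrate(8000))  # 4800.0 kbps for similar quality
```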


----------



## Laithan

Ok.. it may be time to get moderators involved.

This monitor "discussion" is nothing but senseless bickering; nobody is ever going to agree. At this point it isn't helpful or productive to anyone in any way. There is certainly some legitimate monitor relevance to a 4090 driving high refresh rates, but the discussion is far beyond that and extremely off-topic. Please start a new thread if you wish to continue, because this thread is long enough without off-topic noise. Thank you.


----------



## doom26464

SilenMar said:


> AV1 even came half a decade ago. Youtube already started to gear up for AV1 playback since 2018.
> NVENC AV1 is 40% more efficient compared to NVENC H264. So the image quality can be 40% better than under the same bandwidth.


What about NVENC H.264 on the 4090? I read that Nvidia included dual encoders, allowing the top and bottom halves of the screen to be split between the two encoders to increase performance. 

Or maybe my memory is fuzzy.


----------



## GQNerd

Figure this thread could use some 4090 content so here goes... lol

The Galax vbios has been a godsend for my Suprim Liquid X! 

Core and effective clocks were already pretty close _(within 15-45mhz)_ on stock MSI 530w and 600w Suprim vbioses, but this Galax one is even better! My core and effective clocks are within _12-17mhz_ of each other now, and I gained _30-90mhz_ on the core clocks depending on benchmark or game. _Went from 3045-3060mhz to 3075-3150mhz!_ 

Vram OC stayed the same as before..
+1600 is stable with +1700 being the limit, anything above that crashes and resets the card to Default settings. _(interestingly, I have to "warm up" the vram to get it stable at +1700, so I usually do a few runs or use GPUZ render to get the card to around 40c or so)_


Port Royal is the most finicky of any bench or game and only gained 30mhz, but everything else gained pretty much 60-90mhz. The highest power draw I've seen is 610w, but only for a split second; it usually sits between 500-570w when going full tilt, less in games.

*Side note:* I applied Liquid Metal.. (I'll deal with it later/if ever), I've also swapped the Rad fans so they run off the motherboard and not the card, and in push/pull.

Stock everything else including thermal pads, and this card is running cool AF.. 54c max on auto fans. 

Now for the #'s


----------



## SilenMar

doom26464 said:


> What about NVENC h264 on the 4090? I read that Nvidia included dual encoders, allowing the frame to be split in half between the two encoders, increasing performance.
> 
> or maybe my memory is fuzzy.


They are dual encoders made for AV1, but if they're used for h264 the efficiency gain is still around 40%. You can probably save some performance if the GPU is under heavy load.
I have recorded 8K 250Mbps HDR with AV1 at 560W, where the fps drops from 52 to 48.
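Back-of-the-envelope, the quoted ~40% efficiency advantage translates directly into bitrate. A tiny illustration (the 40% figure is the claim from this post, not a measured constant — real savings vary with content):

```python
def av1_equivalent_bitrate(h264_bitrate_mbps, efficiency_gain=0.40):
    """Bitrate (Mbps) AV1 would need for roughly the same quality as H.264,
    assuming the ~40% efficiency figure quoted above."""
    return h264_bitrate_mbps * (1 - efficiency_gain)

def h264_equivalent_bitrate(av1_bitrate_mbps, efficiency_gain=0.40):
    """The H.264 bitrate a given AV1 bitrate is roughly comparable to."""
    return av1_bitrate_mbps / (1 - efficiency_gain)

print(av1_equivalent_bitrate(250))  # 150.0 -- AV1 matches a 250 Mbps H.264 stream at ~150 Mbps
```

Equivalently, at the same 250 Mbps cap, AV1 carries quality closer to a ~417 Mbps H.264 stream.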


----------



## J7SC

doom26464 said:


> What about NVENC h264 on the 4090? I read that Nvidia included dual encoders, allowing the frame to be split in half between the two encoders, increasing performance.
> 
> or maybe my memory is fuzzy.


...another few tidbits re. RTX4K encoders (and a quote or two on comparative performance) per YT time-stamp...







Miguelios said:


> Figure this thread could use some 4090 content so here goes... lol
> 
> The Galax vbios has been a God Send for my Suprim Liquid X!
> 
> Core and effective clocks were already pretty close _(within 15-45mhz)_ on stock MSI 530w and 600w Suprim vbioses, but this Galax one is even better! My core and effective clocks are within _12-17mhz_ of each other now, and I gained _30-90mhz_ on the core clocks depending on benchmark or game. _Went from 3045-3060mhz to 3075-3150mhz!_
> 
> Vram OC stayed the same as before..
> +1600 is stable with +1700 being the limit, anything above that crashes and resets the card to Default settings. _(interestingly, I have to "warm up" the vram to get it stable at +1700, so I usually do a few runs or use GPUZ render to get the card to around 40c or so)_
> 
> 
> Port Royal is the most finicky of any bench or game and only gained 30mhz, but everything else pretty much gained 60-90mhz. Highest power draw I've seen is 610w but only for a split second, usually sits between 500-570w when going full tilt, less in games.
> 
> *Side note:* I applied Liquid Metal.. (I'll deal with it later/if ever), I've also swapped the Rad fans so they run off the motherboard and not the card, and in push/pull.
> 
> Stock everything else including thermal pads, and this card is running cool AF.. 54c max on auto fans.
> 
> Now for the #'s
> 
> View attachment 2589108
> View attachment 2589109
> 
> 
> View attachment 2589110
> 
> View attachment 2589112
> View attachment 2589113
> View attachment 2589111


...I find the Galax vbios doesn't so much increase the max clocks on my card compared to the stock one, but it seriously limits the downclocks - for an overall 'effective clocks' win. As to memory, I have been running mostly ECC-enabled as of late (far more costly on SW than PR for my setup, but I have yet to get close to my 29.8k PR set with ECC off). Interestingly, ECC and non-ECC seem to have the same VRAM speed limit, and both react equally well when VRAM temps are in the high 40s to 52 C compared to the low 30s C... up to an additional ~120 MHz from the extra VRAM temp

So, trying to warm up my VRAM:


Spoiler


----------



## doom26464

If major streaming platforms (Facebook/YouTube/Twitch) switch to AV1 next year, it will drive demand up even more for next-gen GPUs. Impressive stuff.







yah the encoder is unreal. Even something like a 4060 when it launches will be impressive for encoding.


My 4090 shows up next week and this is going to be game changing for content creation.


----------



## GQNerd

J7SC said:


> ...I find the Galax vbios doesn't so much increase the max clocks on my card compared to the stock one, but it seriously limits the downclocks - for an overall 'effective clocks' win. As to memory, I have been running mostly ECC-enabled as of late (far more costly on SW than PR for my setup, but I have yet to get close to my 29.8k PR set with ECC off). Interestingly, ECC and non-ECC seem to have the same VRAM speed limit, and both react equally well when VRAM temps are in the high 40s to 52 C compared to the low 30s C... up to an additional ~120 MHz from the extra VRAM temp
> So, trying to warm up my VRAM:
> 
> 
> Spoiler


lmao.. can relate to that video.

I've only experienced a single bugged run and hit 29.8k in PR (MSI vbios and previous Nvidia driver), but the vram on my Suprim is more likely to crash than artifact. Either black screen and I have to reboot, or driver crash and the card goes to defaults. I haven't run with ECC yet but will check it out for giggles, or when 3DMark starts requiring it.

I agree that the downclocks, and as a result the frametime lows, have improved with this bios.. but I def gained some bins as well. Whatever Galax's secret sauce is (raised NVVDD and/or MSVDD voltages), it's paying off!..

So far it's stable and temps are chilly, so I'm going to be rocking this vbios for a while.


----------



## kryptonfly

yzonker said:


> Yea you look a little soft on PR for some reason. 3 bins more core, but a lower score than what I did the other day.
> 
> Result (3dmark.com)
> 
> SW looks right though, same mem and 3 bins more core. (it doesn't add much though)
> 
> Result (3dmark.com)


Exactly, but I don't rely too much on PR and Firestrike, as we need to do some tricks to break scores; there are many variables to keep in mind. SW is more in line with vram. I could see my vram limit at +1795, and contrary to others I gained a little on the vram with no need to warm it up - +1700 straight away at 20°C. But like I said, with 1mm pads it goes up to 62°C in SW and 58°C in PR (I even saw 68°C max on the vram in Spider-Man, if I'm not wrong) - a perfect vram temp for the 45°C max I've seen on the GPU so far. I didn't check the Vulkan bench, but PR seems more sensitive to vram, though with less effect on the final score. Here is some vram variance - indeed I think the GPU is limited by vram:
+1795 : I scored 11 397 in Speed Way
+1700 : I scored 11 348 in Speed Way
I started to lose perf at +1800 and got a black screen at +1820.
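Worked out from those two runs, the score scaling per MHz of memory offset is tiny. Purely illustrative arithmetic on the numbers posted above:

```python
def scaling_per_100mhz(score_hi, score_lo, offset_hi, offset_lo):
    """Percent score gain per +100 MHz of memory offset between two runs."""
    pct_gain = (score_hi - score_lo) / score_lo * 100
    return pct_gain / (offset_hi - offset_lo) * 100

# Speed Way runs from this post: 11397 @ +1795 vs 11348 @ +1700
rate = scaling_per_100mhz(11397, 11348, 1795, 1700)
print(f"{rate:.2f}% per +100 MHz")  # prints "0.45% per +100 MHz"
```

Under half a percent per 100 MHz — which is why the last ~100 MHz of a risky memory OC rarely moves the leaderboard much.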



GRABibus said:


> Was rebar enabled during your PR run ?
> I ask you this because after a bios flash, rebar settings in nvidia profile inspector are reset to default. So it is necessary to set them up again.


I think so but I didn't check, indeed. Thanks for the tip, I will take a look later 
Congrats on your PR, but honestly I think something is wrong with PR - maybe it's because of 3DMark SystemInfo 5.55 vs 5.56. Drivers should be fine though, because we both have 527.56. Maybe SW is more reliable for the 4090; it's the latest bench, made for new tech.


----------



## mirkendargen

doom26464 said:


> If Major streaming platforms aka Fbook/youtube/twitch switch to AV1 next year it will drive demand up even more for next gen gpus. Impressive stuff.
> 
> 
> 
> 
> 
> 
> 
> yah the encoder is unreal. Even something like a 4060 when it launches will be impressive for encoding.
> 
> 
> My 4090 shows up next week and this is going to be game changing for content creation.


I think the big holdup is still enduser devices. Computers and phones are starting to hit a critical mass of AV1 hardware decode, but smart TVs/streaming boxes are lagging way behind. Twitch probably doesn't care about that, but the TV/movie streaming services definitely do.


----------



## ESRCJ

What kind of VRAM temps are you folks getting with an EK block? Just looking for a point of comparison. Mine is hitting close to 50C during SpeedWay runs with a +1500MHz offset.


----------



## kryptonfly

GRABibus said:


> Was rebar enabled during your PR run ?
> I ask you this because after a bios flash, rebar settings in nvidia profile inspector are reset to default. So it is necessary to set them up again.


Thank you very much my dear ! It seems I drove with the handbrake here... REbar was disabled 
I scored 29 700 in Port Royal








It seems reliable now, actually; I was wrong about SystemInfo 5.55.
Max power draw = 551W with the windows open, 17°C ambient.


----------



## Nizzen

GRABibus said:


> for sure it is a nice benching Bios.
> 
> Still on my veeeeerrrryyyy nice GIGABYTE GAMING OC, and still on stock air cooler
> 
> View attachment 2589070
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 734 in Port Royal
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Max Power draw = 609W
> Average GPU temperature = 56°C


What score are you getting with +1200 and +1500 on the memory? 

I`m wondering if there is perfect scaling in the score 

Your gpu core is badass, that's for sure


----------



## kx11

The Aorus Xtreme Waterforce REV1.1 now lists a 1000W recommended PSU, while REV1.0 (the one I got) is rated for 800W.


Anyone got that bios? I'll flash it on mine


----------



## elbramso

kryptonfly said:


> Thank you very much my dear ! It seems I drove with the handbrake here... REbar was disabled
> I scored 29 700 in Port Royal
> View attachment 2589120
> 
> 
> It seems now reliable actually, I was wrong about systeminfo 5.55.
> Max power draw = 551W with opened windows 17°C ambient.


Is there a new version of the profile inspector? I'm still using the version where I already tweaked my 3090 kingpin. 
And would you mind to post your inspector settings?


----------



## J7SC

...applied my 1st VF curve to this card, and am pleasantly surprised that the core held 3195 MHz throughout 2x Superposition8K test runs  ...3DM testing to come (most after Christmas shopping)


----------



## GRABibus

elbramso said:


> Is there a new version of the profile inspector? I'm still using the version where I already tweaked my 3090 kingpin.
> And would you mind to post your inspector settings?











Releases · Orbmu2k/nvidiaProfileInspector






github.com


----------



## GRABibus

Nizzen said:


> What score are you getting with +1200 and +1500 on the memory?
> 
> I`m wondering if there is perfect scaling in the score
> 
> Your gpu core is bad ass, that's for shure


Bios Galax
20°C ambient
Each test done after one reboot
No killed processes
Drivers NVIDIA on default
Rebar forced in Port Royal


----------



## Nizzen

GRABibus said:


> Bios Galax
> 20°C ambient
> Each test done after one reboot
> No killed processes
> Drivers NVIDIA on default
> Rebar forced in Port Royal
> 
> View attachment 2589125
> 
> 
> 
> View attachment 2589126
> 
> 
> 
> View attachment 2589127


Now try 3100mhz core and same mem scaling 
You are about to make this forums WAY better


----------



## GRABibus

kryptonfly said:


> Congrats for your PR but honestly I think something is wrong with PR, maybe it's because of the 3dmark SystemInfo 5.55 vs 5.56, drivers should be fine though because we have both 527.56. Maybe SW is more reliable for 4090, it's the last bench made for new tech.


I also get incredibly high graphics scores in TS on the stock air cooler.
So yes, maybe something is wrong in PR, but my core is a very, very good one (not far from a golden sample, I think).

I will post TS graphics scores again with the Galax bios


----------



## GRABibus

Nizzen said:


> Now try 3100mhz core and same mem scaling
> You are about to make this forums WAY better


Say "thank you" first


----------



## motivman

GRABibus said:


> for sure it is a nice benching Bios.
> 
> Still on my veeeeerrrryyyy nice GIGABYTE GAMING OC, and still on stock air cooler
> 
> View attachment 2589070
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 734 in Port Royal
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Max Power draw = 609W
> Average GPU temperature = 56°C


Man, your score makes no sense AT ALL. I am running the same card, with the same bios and essentially the same clocks, with a 13900k, with faster ram, and you are scoring 600 points higher?


















I scored 29 110 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com


----------



## AvengedRobix

The new WB alphacool look beautiful...



__ https://www.facebook.com/video.php?v=6492584757435108


----------



## GRABibus

motivman said:


> Man, your score makes no sense AT ALL. I am running the same card, with the same bios and essentially the same clocks, with a 13900k, with faster ram, and you are scoring 600 points higher?
> 
> View attachment 2589128
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 110 in Port Royal
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


Maybe yours makes no sense at all...
Is Rebar forced in PR?
What's your max power draw on your screenshot? I can't read it, but it seems below 600W.
Mine is always beyond 600W with the Galax bios in PR.

Look at my graphics score in TS with *no Forced Rebar* on *stock air cooler* and *gigabyte stock bios *:
I scored 36 048 in Time Spy

I agree with you, it makes no sense


----------



## GRABibus

Nizzen said:


> Now try 3100mhz core and same mem scaling
> You are about to make this forums WAY better


Bios Galax
22°C ambient
Each test done after one reboot
No killed processes
Drivers NVIDIA on default
Rebar forced in Port Royal


----------



## J7SC

....still dialing in the curves, but managed to complete a Superposition 8K run at 3210 MHz... getting around 16000 score w/ECC... VRAM getting warmed up


----------



## AvengedRobix

Question... every single time I have to flash the bios I forget - do I have to disable the GPU in Device Manager first or not?
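For reference, the routine most people in this thread seem to follow when crossflashing — hedged heavily: this is the commonly circulated community nvflash procedure, not official guidance, `galax.rom` is a placeholder filename, and flashing a mismatched vbios can brick the card:

```text
1. Many guides recommend disabling the GPU in Windows Device Manager first
   (Display adapters -> right-click -> Disable device)
2. From an elevated command prompt, in the folder holding nvflash:
     nvflash64 --protectoff
     nvflash64 -6 galax.rom    (-6 overrides the board/subsystem-ID mismatch check)
3. Re-enable the GPU in Device Manager and reboot
```

Keep a copy of your stock vbios (`nvflash64 --save stock.rom`) before touching anything.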


----------



## elbramso

How do you force ReBar for Port Royal in the profile inspector?
I knew 1year ago but lost it again 😂

OK, I have it now 👍
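For anyone else hunting for it later: the usual recipe is the three rBAR entries in the per-app profile in nvidiaProfileInspector. Setting names below are as they appear in recent Inspector builds and the size-limit value is the one commonly circulated in community ReBAR guides — double-check against your version, this isn't an official NVIDIA reference:

```text
Profile: the 3DMark / Port Royal executable
  rBAR - Feature     : Enabled
  rBAR - Options     : 0x00000001
  rBAR - Size Limit  : 0x0000004000000000
Apply changes, then restart 3DMark before benching.
```

Remember these get reset after a vbios flash or clean driver install, so re-check them before comparing scores.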


----------



## yzonker

J7SC said:


> ....still dialing in the curves, but managed to complete a Superposition 8K run at 3210 MHz...getting around 16000 score w/EEC...VRAM getting warmed up
> View attachment 2589138


Are you actually getting higher effective clocks with the VF curve edits though? I haven't really found anything that increases effective clock on my card. Indicated clocks, yes, but not effective.


----------



## Bozzern

Anyone tried the Galax bios on a card with a 3-way power connector?

I have the Gainward Phantom RTX 4090 (non-GS), which has a 3-way power connector.
The stock bios is limited to 400W and a 100% power limit.

Should I be able to push the power limit with the Galax bios and reach 450W with only the 3-way connector, or am I wrong?

Maybe someone with a 4-way card can check what the limit is by removing one of the four connectors?


----------



## Brads3cents

Miguelios said:


> Figure this thread could use some 4090 content so here goes... lol
> 
> The Galax vbios has been a God Send for my Suprim Liquid X!
> 
> Core and effective clocks were already pretty close _(within 15-45mhz)_ on stock MSI 530w and 600w Suprim vbioses, but this Galax one is even better! My core and effective clocks are within _12-17mhz_ of each other now, and I gained _30-90mhz_ on the core clocks depending on benchmark or game. _Went from 3045-3060mhz to 3075-3150mhz!_
> 
> Vram OC stayed the same as before..
> +1600 is stable with +1700 being the limit, anything above that crashes and resets the card to Default settings. _(interestingly, I have to "warm up" the vram to get it stable at +1700, so I usually do a few runs or use GPUZ render to get the card to around 40c or so)_
> 
> 
> Port Royal is the most finicky of any bench or game and only gained 30mhz, but everything else pretty much gained 60-90mhz. Highest power draw I've seen is 610w but only for a split second, usually sits between 500-570w when going full tilt, less in games.
> 
> *Side note:* I applied Liquid Metal.. (I'll deal with it later/if ever), I've also swapped the Rad fans so they run off the motherboard and not the card, and in push/pull.
> 
> Stock everything else including thermal pads, and this card is running cool AF.. 54c max on auto fans.
> 
> Now for the #'s
> 
> 
> 
> 
> View attachment 2589111


So you're GQNerd?? I was wondering who that was 😅
You beat my FSU score by 23 points, nice job!

I can't wait to get my water block running so I can be more competitive with some of you guys. Rebar is off though; I have to re-enable it for some of these benches like TS, SW, and PR.

Unfortunately, I have an FE and can't flash this uber bios everyone is talking about from Galax 😔


----------



## alasdairvfr

How are people finding the Galax bios for DP/HDMI compatibility? Anyone losing a port after crossflashing?


----------



## alasdairvfr

Fired up Darktide this morning, and WOW, it's running well now! As in, I needed to turn on my frame limiter for it for the first time.



It's still extremely sensitive to memory OC. Normally I can safely do +1400/1500, but for Darktide I have to dial it in. Now though, it runs well enough that a few extra frames at reduced stability aren't tempting.



Anyone else seeing this, or am I smoking the good stuff? 😂


----------



## Betroz

Played some BF2042 @ 3840x1600 Ultra just now. Core was holding stable at 3000 Mhz and +1500 mem. So far so good


----------



## Dragonsyph

How do I make zero-fan (0 dB) mode stay off even after a PC restart?


----------



## GRABibus

Check of Galax Bios performance in TS for my Gigabyte Gaming OC on stock air cooler :

*Rebar forced in TS *:
*







*









I scored 35 647 in Time Spy


AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com





This puts me in the current top 10 of the Time Spy "Hall of Fame" graphics scores, with a card on the stock air cooler:








3DMark Time Spy Graphics Score Hall of Fame


The 3DMark.com Overclocking Hall of Fame is the official home of 3DMark world record scores.




www.3dmark.com


----------



## motivman

PSA: If you have the Gigabyte 4090 Gaming OC with a Bykski waterblock and are struggling with memory stability despite good pads or thermal putty, do yourself a favor and use the original Bykski-provided thermal pads.

Originally I used 1.5mm Arctic TP-3 pads, and I went from being able to bench at +1800 memory to not being able to bench at +1600. My daily on the air cooler was +1500; with the waterblock and Arctic TP-3 I could no longer run +1500 stable daily and had to go down to +1400. Well, I got frustrated, removed the TP-3 pads, replaced them all with the Bykski pads, and my card is performing way better. For example, looping Time Spy Extreme, my max memory junction temp with TP-3 pads was 48C; now with the Bykski pads my max is 66C. My temps on the core and hotspot are even better now too... I can bench +1800 again like when I was on the air cooler - actually, I can even daily +1700 on memory now, LOL.

Waterblock with TP-3 Thermal pads










Waterblock with stock Bykski Thermal Pads










I am now #23 in the world in Speedway








I scored 11 318 in Speed Way


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com





I couldn't break 29K with TP-3 pads, now Look at my score in PR!!!








I scored 29 503 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com





Memory is KING on the 4090, keep it running WARM, lol.


----------



## J7SC

yzonker said:


> Are you actually getting higher effective clocks though with VF curve edits though? I haven't really found anything that increases effective clock on my card. Indicated clocks yes, but not effective.


This card hit up to 3210 MHz on air right out of the box, per earlier posts... Neither the Galax vbios nor the VF curve changed that, but I did get the effective clock speed up. I didn't have HWiNFO running for the 3210 run, just GPU-Z, but I did post the HWiNFO shot above for 3195 when I had it all open.


----------



## StreaMRoLLeR

GRABibus said:


> So, yes, maybe something is wrong in PR



Everything is wrong with Port Royal. Lots of clowns and abusers there claiming they achieved something. I spent 2 hours checking every valid result on the HoF and reported at least 47 runs. They need to upgrade Port Royal's detection, just like Speed Way. Speed Way is free of the vram bug


----------



## J7SC

StreaMRoLLeR said:


> Everything is wrong for Port Royale. Lot of clowns and abusers there, claiming they achieve something. I spent 2 hours checking every valid result on HoF, I reported at least 47 logs.They need to upgrade Port Royal detection thing just like speedway. Speedway is free of vram bug


...Speedway also has the vram bug; just not as often / pronounced.


----------



## GRABibus

StreaMRoLLeR said:


> Everything is wrong for Port Royale. Lot of clowns and abusers there, claiming they achieve something. I spent 2 hours checking every valid result on HoF, I reported at least 47 logs.They need to upgrade Port Royal detection thing just like speedway. Speedway is free of vram bug


Yes, poor clowns...


----------



## motivman

Lol, why y’all take this crap too seriously? LMAO. Getting mad over some dumb PR benchmark, SMH


----------



## motivman

Instead of wasting 2 hours of your life reporting benchmarks, waste those 2 hours by playing some games or talking to your wife/gf if you have one, lol.


----------



## StreaMRoLLeR

motivman said:


> Instead of wasting 2 hours of your life reporting benchmarks, waste those 2 hours by playing some games or talking to your wife/gf if you have one, lol.











I scored 29 874 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com





Oh my dear 
AVG clock 2940 and sitting at TOP 13. IMAGINE


----------



## StreaMRoLLeR

GRABibus said:


> Yes, poor clowns...


As far as i see you have legit good chip. Your scores are scaling accordingly.


----------



## elbramso

GRABibus said:


> Check of Galax Bios performance in TS for my Gigabyte Gaming OC on stock air cooler :
> 
> *Rebar forced in TS *:
> *
> View attachment 2589166
> *
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 35 647 in Time Spy
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> This brings me in current top 10 of "Hall Of Fame GX score at timespy", with a card on stock air cooler :
> 
> 
> 
> 
> 
> 
> 
> 
> 3DMark Time Spy Graphics Score Hall of Fame
> 
> 
> The 3DMark.com Overclocking Hall of Fame is the official home of 3DMark world record scores.
> 
> 
> 
> 
> www.3dmark.com


I'm currently rank 7 in TS overall score without ReBAR forced 😳
Now I think I need to redo my run 😜


----------



## GRABibus

...


----------



## GRABibus

elbramso said:


> I'm currently rank7 in TS overall score without ReBar forced 😳
> Now I think I need to redo my run 😜


Your CPU score will probably decrease so that overall score will be lower than with Rebar disabled.
But try and let's see


----------



## alasdairvfr

GRABibus said:


> Your CPU score will probably decrease so that overall score will be lower than with Rebar disabled.
> But try and let's see


It's a 1000-point drop forcing rebar in Time Spy for me. Only PR shows an improvement, and it's about 300-400 pts.


----------



## yzonker

GRABibus said:


> Your CPU score will probably decrease so that overall score will be lower than with Rebar disabled.
> But try and let's see


Possibly not as much as you think. My 12900k on ddr5 hardly lost anything with reBar.


----------



## J7SC

In the TimeSpy series, I mostly did TS-Extreme, and from what I recall, forced r_BAR was beneficial by a small margin when running both CCDs - I sometimes just run one for certain benches that benefit more from peak CPU clock and tight system RAM. AM4 might be different though than the newer AMD and Intel gens. ...speaking of which, I'm waiting for CES '23 (Jan 5 - 8) to see if there's any news on AM5 X3D before I make a system upgrade call towards AMD or Intel.


----------



## GRABibus

yzonker said:


> Possibly not as much as you think. My 12900k on ddr5 hardly lost anything with reBar.


Yes, maybe the decrease in CPU score is more visible with Ryzens with PBO.


----------



## AvengedRobix

I'm trying to undervolt... why is this happening?
I set 2820 at 950mV but it stays at 1050mV


----------



## GRABibus

J7SC said:


> In the TimeSpy series, I mostly did TS-Extreme, and from what I recall, forced r_BAR was beneficial by a small margin when running both CCDs - I sometimes just run one for certain benches that benefit more from peak CPU clock and tight system RAM. AM4 might be different though than the newer AMD and Intel gens. ...speaking of which, I'm waiting for CES '23 (Jan 5 - 8) to see if there's any news on AM5 X3D before I make a system upgrade call towards AMD or Intel.


7900X3D would be perfect for me


----------



## yzonker

Managed to bump this one up a little. 











Inched SW up a little closer to the golden core cards. 









I scored 11 347 in Speed Way


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}




www.3dmark.com


----------



## yzonker

GRABibus said:


> Yes, maybe the decrease in CPu score is more visible with Ryzen's with PBO.


No, my 13900k on DDR4 loses about 1000pts too. I think it's the much higher memory bandwidth on Intel DDR5.


----------



## GRABibus

yzonker said:


> No, my 13900k on DDR4 loses about 1000pts too. I think it's the much higher memory bandwidth on Intel DDR5.


you still have your 5800X build ?
It should be the same.


----------



## GRABibus

yzonker said:


> No, my 13900k on DDR4 loses about 1000pts too. I think it's the much higher memory bandwidth on Intel DDR5.


I lost same with my 5950X


----------



## Krzych04650

AvengedRobix said:


> i try to undervolt... Why this????
> i set 2820 at 950mv but stay at 1050
> View attachment 2589192


Click on the point you want to lock to and press L, yellow line will appear and it will force lock to this VF point.


----------



## AvengedRobix

Krzych04650 said:


> Click on the point you want to lock to and press L, yellow line will appear and it will force lock to this VF point.


Ah, I didn't know about the L function. In any case, I changed the Afterburner skin and it works now.


----------



## yzonker

GRABibus said:


> you still have your 5800X build ?
> It should be the same.


I do. It's my backup gaming rig now in case I blow up this one. lol. It didn't lose as many points, but percentage wise I think it may have been similar. Just half the score. So half as many points lost.


----------



## Panchovix

GRABibus said:


> I lost same with my 5950X


I lose about 1000 points in CPU score after forcing ReBAR, but gain about 600 points in GPU score (normal TS).

I guess with Intel on DDR5 you lose nothing, since you have the best of both worlds.


----------



## J7SC

@Nizzen & Co ...next-gen / 4090 voltage-locked ? Like the good old days (time-stamp)


----------



## motivman

StreaMRoLLeR said:


> I scored 29 874 in Port Royal
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Oh my dear
> AVG clock 2940 and sitting at TOP 13. IMAGINE


lmao…show me where I bragged about that score ANYWHERE on this forum? I didn’t even realize it was still up there, but because of toxic clowns like you, I will NEVER DELETE IT, STAY MAD. Lol


----------



## Nizzen

Still waiting for someone to beat my air score @ 
*SCORE 31 727 😂*


----------



## J7SC

Nizzen said:


> Still waiting for someone to beat my air score @
> *SCORE 31 727 😂*


...I'm gunning for that with my 30,338 🤪


Spoiler


----------



## bmagnien

yzonker said:


> Managed to bump this one up a little.
> 
> View attachment 2589193
> 
> 
> 
> Inched SW up a little closer to the golden core cards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 347 in Speed Way
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


Nice score! 4K superposition is now a CPU benchmark with the 4000 series. How’s your 8k looking?


----------



## yzonker

bmagnien said:


> Nice score! 4K superposition is now a CPU benchmark with the 4000 series. How’s your 8k looking?


Yea, 1080p Extreme and 8K scores are definitely not as good. I kinda figured there was some CPU dependence with 4K Optimized, but that's why I bought a 13900k - I got tired of always having the slower CPU. Scores are all on the leaderboard. My 8K score is a bit below yours, and one spot below in 1080p Extreme.


----------



## mirkendargen

J7SC said:


> ...I'm gunning for that with my 30,338 🤪
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2589201












This one from a month ago was basically a solid brown screen, I'm super curious how much more broken a 31k+ can look lololol.


----------



## tubs2x4

GRABibus said:


> Check of Galax Bios performance in TS for my Gigabyte Gaming OC on stock air cooler :
> 
> *Rebar forced in TS *:
> *
> View attachment 2589166
> *
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 35 647 in Time Spy
> 
> 
> AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> This brings me in current top 10 of "Hall Of Fame GX score at timespy", with a card on stock air cooler :
> 
> 
> 
> 
> 
> 
> 
> 
> 3DMark Time Spy Graphics Score Hall of Fame
> 
> 
> The 3DMark.com Overclocking Hall of Fame is the official home of 3DMark world record scores.
> 
> 
> 
> 
> www.3dmark.com


What is it about the Time Spy CPU test that gives AMD CPUs such low scores?


----------



## ESRCJ

I can't break 11K in Speed Way with my 4090 FE running a flat 3105MHz on the core (+300MHz offset and effective clocks not far behind) and a +1500MHz memory offset. Basically I hit a hard wall at 10.9K, which seems low since I'm seeing cards barely breaking 3GHz hitting over 11.2K lol. I score basically the same at 150MHz lower core clocks as well.

Any tips or insights?


----------



## StreaMRoLLeR

motivman said:


> lmao…show me where I bragged about that score ANYWHERE on this forum? I didn’t even realize it was still up there, but because of toxic clowns like you, I will NEVER DELETE IT, STAY MAD. Lol


You dont have to  it will get deleted on monday 🤡


----------



## Dreams-Visions

Hey folks! Since moving to the 4090, I occasionally (maybe a couple of times a week) get an "invalid format error" from my LG C1 after prolonged time with the TV off, forcing me to unplug and plug back in the HDMI cable. It's fine after that. My PC doesn't go to sleep; I just turn the TV off and back on again when I'm ready to use the system.

I never had this issue in the 1.5 years I had my 3090 installed. Same cable, same settings on the PC (in so far as I'm aware), same settings on the TV (still sees it correctly as a PC and not some other component), same everything. Only difference is the GPU itself. Setup and GPU have shown no signs of instability; just this occasional, odd behavior.

Is this a driver issue? A new feature that requires configuration? Ideas always appreciated.


----------



## StreaMRoLLeR

Dreams-Visions said:


> Hey folks! Since moving to the 4090, I occasionally (maybe a couple times a week) get an "invalid format error" from my LG C1 after prolonged tome with the TV off, forcing me to unplug and plug back in the HDMI cable. It's fine after that. My PC doesn't go to sleep; I just turn the TV off and back on again when I'm ready to use the system.
> 
> I never had this issue in the 1.5 years I had my 3090 installed. Same cable, same settings on the PC (in so far as I'm aware), same settings on the TV (still sees it correctly as a PC and not some other component), same everything. Only difference is the GPU itself. Setup and GPU have shown no signs of instability; just this occasional, odd behavior.
> 
> Is this a driver issue? A new feature that requires configuration? Ideas always appreciated.


I have a G1 and a certified cable; after switching to the 4090 from a 3080 Ti, my eARC Dolby Atmos passthrough via the SN11RG is broken as well. Sometimes it automatically switches to stereo. Coming back to your topic, yes, I think it's a software handshake problem.


----------



## GRABibus

tubs2x4 said:


> What is with that timespy cpu test that amd cpus are low scores?


TS CPU score is better with Intel (comparing CPUs of the same generation).
And in this test ReBAR is enabled, so the CPU score is lower than with ReBAR disabled.
It's also necessary to disable SMT on Ryzen in TS to get a better CPU score.

Definitely, Time Spy is not made for Ryzens.

This is why I only focus on the graphics score.


----------



## motivman

StreaMRoLLeR said:


> You dont have to  it will get deleted on monday 🤡


I couldn't care less...


----------



## Mystic33

I recently got an RTX 4090 MSI Suprim X; unfortunately for me, I think it's pretty lazy in terms of overclocking:




















What do you guys think about the scores?


----------



## yzonker

ESRCJ said:


> I can't break 11K in Speed Way with my 4090 FE running a flat 3105MHz on the core (+300MHz offset and effective clocks not far behind) and a +1500MHz memory offset. Basically I hit a hard wall at 10.9K, which seems low since I'm seeing cards barely breaking 3GHz hitting over 11.2K lol. I score basically the same at 150MHz lower core clocks as well.
> 
> Any tips or insights?


SW responds better to memory OC than core for one thing. +1500 is decent, but a lot of the higher scores are run at +1700 or more. The runs I posted recently were in the 1750-1825 range.

Everyone is running the Galax bios now, which is worth 100 pts or more; unfortunately you probably can't flash it to the FE. I also got better scores from the MSI bios than my stock TUF bios or the Strix. I have no idea how the FE bios performs relative to these.

At one point at least (not sure it's true for the Galax bios) I gained quite a bit of score by literally unplugging my 2nd monitor. So if you have more than one monitor, you might try unplugging them and set your remaining monitor to 1440p or lower and 60hz or less.

I was stuck exactly where you are at one time, but a combination of the things I mentioned above got me to 11.3k+.
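If anyone wants to see what those memory offsets mean in raw numbers, here's a quick back-of-the-envelope in Python. Assumptions (mine, not from this thread): the 4090's 384-bit bus, and the ~10,502 MHz DDR readout Afterburner shows for stock 21 Gbps GDDR6X, with the offset adding directly to that readout:

```python
# Rough GDDR6X bandwidth estimate for an RTX 4090.
# Assumed values: 384-bit bus, stock ~10502 MHz DDR readout (= 21 Gbps).
BUS_WIDTH_BITS = 384
STOCK_DDR_MHZ = 10502  # as shown in MSI Afterburner / GPU-Z

def effective_gbps(offset_mhz: int) -> float:
    """Effective per-pin data rate after an Afterburner offset (DDR: x2)."""
    return 2 * (STOCK_DDR_MHZ + offset_mhz) / 1000

def bandwidth_gbs(offset_mhz: int) -> float:
    """Total memory bandwidth in GB/s across the full bus."""
    return effective_gbps(offset_mhz) * BUS_WIDTH_BITS / 8

for off in (0, 1500, 1750):
    print(f"+{off}: {effective_gbps(off):.1f} Gbps, {bandwidth_gbs(off):.0f} GB/s")
```

So +1500 lands around 24 Gbps effective (roughly 1150 GB/s), which lines up with why Speed Way scales so well with memory.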


----------



## dk_mic

EKWB block for the Gaming Trio w/ Galax 666W BIOS.
The stock thermal paste on the Trio was completely dry.










Used a mix of Arctic TP-3 and EKWB pads. Horrible coil whine, which should be tolerable once the case is closed and the card is undervolted.
TSE looping @ 121% PL, 1.1V core @ 28C water temp, 3060/3030 MHz effective with those settings.
Haven't checked yet whether I can push core or mem a bit further.

temps look alright


----------



## Gandyman

Hey guys.
Has anyone else noticed that the 4090 REALLY doesn't like being frame capped? No matter how I cap fps, it likes to bounce off the ceiling so hard. What I mean by that is, say I can easily get 250 fps average with 200 lows, but if I cap at 116 (to stay within my monitors Gsync window) it struggles to stay there, frequently bouncing down to 105 ish. Sometimes accompanied by quite severe stuttering.

When I had the 3090 I would just cap in NVCP to 116 and never saw a single deviation (unless the GPU was maxed out), but with the 4090 it's sitting at like 40-50% usage and unable to hold a steady 116. I've found leaving NVCP settings alone and turning vsync on in game is the best way to limit fps without awful stuttering, but that isn't ideal because I would rather it be inside the G-sync range.

I've spent weeks at this point trying to find a way to make this work, and also searching forums/reddit etc. for people having the same issue, but surprisingly it's been completely quiet. Which leads me to wonder if it's a problem with my system only.

Are you guys seeing odd behavior with it bouncing off the cap? If not I would be super interested to hear what methods you all are using.

At this point I'm just getting so exasperated with trying to make it work I'm super tempted to sell the 4090 and get a 3080ti or something and sit this gen out.

On an unrelated note, if anyone in Australia wants a 4090 Strix ....


----------



## yzonker

Mystic33 said:


> Got recently one rtx 4090 msi suprim x , i think that unfortunately for me is pretty lazy in terms of overclock:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What do you guys think about the scores?


Need more info. Are you running the stock bios? Is it hitting the power limit? Did you set LOD to +3 for Superposition (pretty sure the top scores all had this set)? Did you force reBar in PR?

If you just ran these on your daily OS with no/minimal tweaks, then I'd say they are fairly typical. There's just a bunch of stuff people are doing to gain points.


----------



## yzonker

Gandyman said:


> Hey guys.
> Has anyone else noticed that the 4090 REALLY doesn't like being frame capped? No matter how I cap fps, it likes to bounce off the ceiling so hard. What I mean by that is, say I can easily get 250 fps average with 200 lows, but if I cap at 116 (to stay within my monitors Gsync window) it struggles to stay there, frequently bouncing down to 105 ish. Sometimes accompanied by quite severe stuttering.
> 
> When I had the 3090 I would just cap in nvcp to 116 never saw a single deviation (unless the GPU was maxed out) but with the 4090 its sitting at like 40 - 50% usage and unable to hold a steady 116. I've found leaving nvcp settings alone and turning vsnc on in game is the best way to limit fps without awful stuttering, but that isn't ideal because I would rather it be inside G-sync range.
> 
> I've spent weeks at this point trying to find a way to make this work, and also searching forums/reddit etc for people having same issue, but surprisingly its been completely quiet. Which leads me to wonder if it is a problem with my system only.
> 
> Are you guys seeing odd behavior with it bouncing off the cap? If not I would be super interested to hear what methods you all are using.
> 
> At this point I'm just getting so exasperated with trying to make it work I'm super tempted to sell the 4090 and get a 3080ti or something and sit this gen out.
> 
> On an unrelated note if anyone in Australia want's a 4090 strix ....


No my TUF stays locked at 117 fps very consistently, although I don't play any games that will hit 200+ fps at max settings in 4k. I've been playing GTA V some. I think it can peak around 150 fps with the settings I'm running.


----------



## Mystic33

yzonker said:


> Need more info. Are you running the stock bios? Is it hitting the power limit? Did you set LOD to +3 for Superposition (pretty sure the top scores all had this set)? Did you force reBar in PR?
> 
> If you just ran these on your daily OS with no/minimal tweaks, then I'd say they are fairly typical. There's just a bunch of stuff people are doing to gain points.


LOD to +3, what do you mean? Resizable BAR was off and no tweaks. The power limit was on the OC BIOS @ 520W and I did not observe throttling in the benchmarks.


----------



## yzonker

Mystic33 said:


> LOD to +3, what do you mean? Resizable bar was off and no tweaks. Power limit was on oc bios @ 520w and i did not observe throttling in the benchmarks


This is how you set LOD,









Negative Lod Bias


Have you ever wanted sharper textures over all in your games? Well with Lod bias that is possible. What I’m going to focus on is texture Lod bias from the Nvidia viewpoint, this can be done o…




melantechgaming.wordpress.com





3DMark will flag your run as invalid though if you change LOD. So people just do it in Superposition.
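For anyone wondering why a texture LOD bias moves scores at all: each +1 of bias makes the sampler pick a mip level one step smaller, which roughly quarters the texels fetched per unit of screen area. An idealized sketch (my simplification; it ignores anisotropic filtering and texture caching, so treat it as an upper bound):

```python
# Idealized effect of positive texture LOD bias on texel traffic.
# Each mip level halves resolution in both dimensions, so each +1 of
# LOD bias cuts texels fetched per unit screen area by roughly 4x.
def relative_texel_traffic(lod_bias: int) -> float:
    """Texel traffic relative to bias 0 (ideal; ignores caching/aniso)."""
    return 0.25 ** lod_bias

for bias in (0, 1, 3):
    print(f"LOD bias +{bias}: ~{relative_texel_traffic(bias):.4f}x texel traffic")
```

At +3 that's roughly 1/64th of the ideal texel traffic, which is why benchmarks treat it as an invalid tweak.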


----------



## keikei

MSI recently updated their MSI Center, so you can disable the RGBs. MSI Center


----------



## KedarWolf

yzonker said:


> This is how you set LOD,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Negative Lod Bias
> 
> 
> Have you ever wanted sharper textures over all in your games? Well with Lod bias that is possible. What I’m going to focus on is texture Lod bias from the Nvidia viewpoint, this can be done o…
> 
> 
> 
> 
> melantechgaming.wordpress.com
> 
> 
> 
> 
> 
> 3DMark will flag your run as invalid though if you change LOD. So people just do it in Superposition.


Negative LOD BIAS tweaks are cheats and every benchmark should invalidate them.


----------



## yzonker

KedarWolf said:


> Negative LOD BIAS tweaks are cheats and every benchmark should invalidate them.


I don't disagree with that but seems like everyone uses it. So when in Rome...


----------



## galimberti

Hey guys, I'm wondering if you have seen a worse 4090 than mine. It's a Palit GameRock (non-OC, limited to 450W, 3x8-pin).
This is my stock curve:











Right now I'm running 950mV at 2610MHz, which is pretty lame. At this voltage I get the highest offset (+180) compared to the others. At 1100mV I can only get +120, which puts me at 2865MHz, lower than some stock cards. Anything higher than that and I crash to desktop in the first few seconds of Port Royal.










On vram I got +1500mhz, which is average I guess.


----------



## KedarWolf

yzonker said:


> I don't disagree with that but seems like everyone uses it. So when in Rome...


Welp, with LOD BIAS, I managed this.









UNIGINE Benchmarks


Performance benchmarks by Unigine




benchmark.unigine.com


----------



## yzonker

KedarWolf said:


> Welp, with LOD BIAS, I managed this.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> UNIGINE Benchmarks
> 
> 
> Performance benchmarks by Unigine
> 
> 
> 
> 
> benchmark.unigine.com
> 
> 
> 
> 
> 
> View attachment 2589280
> 
> 
> View attachment 2589278


LOL. I didn't think that would take long.


----------



## ESRCJ

yzonker said:


> SW responds better to memory OC than core for one thing. +1500 is decent, but a lot of the higher scores are run at +1700 or more. The runs I posted recently were in the 1750-1825 range.
> 
> Everyone is running the Galax bios now which is 100 pts or more, unfortunately you probably can't flash it to the FE. I also got better scores from the MSI bios than my stock TUF bios or the Strix. I have no idea how the FE bios performs relative to these.
> 
> At one point at least (not sure it's true for the Galax bios) I gained quite a bit of score by literally unplugging my 2nd monitor. So if you have more than one monitor, you might try unplugging them and set your remaining monitor to 1440p or lower and 60hz or less.
> 
> I was stuck exactly where you are at one time, but a combinations of the things I mention above got me to 11.3k+.


Thanks. So we still can't flash an FE with a non-FE BIOS? Or has anyone successfully done so? It would be nice to try a different BIOS.


----------



## yzonker

ESRCJ said:


> Thanks. So we still can't flash an FE with a non-FE BIOS? Or has anyone successfully done so? It would be nice to try a different BIOS.


All I know for sure is that NVflash won't flash the FE bios to my TUF. I also haven't seen anyone report successfully flashing an FE either. You can try it. Much more likely it will just throw an error and not flash at all. But I can't promise that obviously.


----------



## bmagnien

KedarWolf said:


> Welp, with LOD BIAS, I managed this.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> UNIGINE Benchmarks
> 
> 
> Performance benchmarks by Unigine
> 
> 
> 
> 
> benchmark.unigine.com
> 
> 
> 
> 
> 
> View attachment 2589280
> 
> 
> View attachment 2589278


Niiiiiice! Go for 8k!

edit: nevermind I see your new score there. Holy ****! Gratz


----------



## yzonker

@KedarWolf , this run you did is actually more impressive IMO. 









UNIGINE Superposition benchmark score


UNIGINE Superpsition detailed score page




benchmark.unigine.com





I've been wondering why I seemed to be really low in that one. I didn't do anything different between the 3. 

Did you change anything, or run similar/same settings for all 3 of them (1080p E, 4k/8k optimized)?


----------



## GQNerd

KedarWolf said:


> Negative LOD BIAS tweaks are cheats and every benchmark should invalidate them.


Agreed.. none of my Superposition runs had that enabled, but I guess now we know how to move up a few more spots..



KedarWolf said:


> Welp, with LOD BIAS, I managed this.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> UNIGINE Benchmarks
> 
> 
> Performance benchmarks by Unigine
> 
> 
> 
> 
> benchmark.unigine.com
> 
> 
> 
> 
> 
> View attachment 2589280
> 
> 
> View attachment 2589278


Nooo.. bumped me to 3rd place. This means war! Lol


----------



## J7SC

....a snow day today !  I am wondering whether Win 10 Pro / 4090 plays Tron 2.0 (ran across it after rummaging through old boxes in storage)...? Should look 'interesting' on an OLED...


----------



## yzonker

Yea that's the thing I find annoying about benchmarking sometimes. All the little tricks people use and a lot of people won't share them. I get that, but it's not like we're here to win anything. Looks like I got too greedy with the chiller yesterday and hit the friggin' cold bug again without realizing it. Ran this one at ambient. Several hundred points higher and about where I was expecting originally.


----------



## yzonker

J7SC said:


> ....a snow day today !  I am wondering whether Win 10 Pro / 4090 plays Tron 2.0 (ran across it after rummaging through old boxes in storage)...? Should look 'interesting' on an OLED...
> View attachment 2589292
> 
> View attachment 2589294


Must be time to try this out then.



Spoiler: Dumb LTT Vid



This PC is cooled by FLAMES - Hacksmith Collab - YouTube


----------



## ESRCJ

yzonker said:


> All I know for sure is that NVflash won't flash the FE bios to my TUF. I also haven't seen anyone report successfully flashing an FE either. You can try it. Much more likely it will just throw an error and not flash at all. But I can't promise that obviously.


Yeah I haven't seen anyone reporting that either for a 4090 FE. Not that it's going to result in a major performance improvement, but it would be nice to push some higher scores in benchmarks.


----------



## yzonker

I'll just post these little captures to avoid bogging down the thread with big images. Everything went up. Set the chiller 2C higher. lol I think I didn't realize it since 4k Optimized is so CPU intensive now, so the GPU falling off didn't matter as much.

1080p E









4k Opt









8k Opt


----------



## yzonker

Sorry, last post on this, but I thought this might be of interest to those wondering how CPU limited 4k Optimized is. I just re-ran at 5.9Ghz (previous run was stock 5.5).


----------



## Krzych04650

yzonker said:


> Sorry, last post on this, but I thought this might be of interest to those wondering how CPU limited 4k Optimized is. I just re-ran at 5.9Ghz (previous run was stock 5.5).
> 
> View attachment 2589306


Yea I've noticed that too, it runs at very high framerates and this benchmark seems to be single-threaded on the CPU. One more GPU gen and it will fall out of use, CPUs will almost certainly not get 50%+ faster on ST in just 2 years.


----------



## yzonker

Krzych04650 said:


> Yea I've noticed that too, it runs at very high framerates and this benchmark seems to be single-threaded on the CPU. One more GPU gen and it will fall out of use, CPUs will almost certainly not get 50%+ faster on ST in just 2 years.


It's still mostly GPU with the fastest CPUs (98% utilization), but yea if we get another big bump in GPU power then it'll be pretty well CPU bottlenecked. 

BTW going to 6Ghz was literally the exact same score.


----------



## J7SC

yzonker said:


> Must be time to try this out then.
> 
> 
> 
> Spoiler: Dumb LTT Vid
> 
> 
> 
> This PC is cooled by FLAMES - Hacksmith Collab - YouTube


...LTT is just a few km from here in the direction of the camera-pic, and normally I would be able to see 'the smoke' from their experiment, but not today...


----------



## coelacanth

Gandyman said:


> Hey guys.
> Has anyone else noticed that the 4090 REALLY doesn't like being frame capped? No matter how I cap fps, it likes to bounce off the ceiling so hard. What I mean by that is, say I can easily get 250 fps average with 200 lows, but if I cap at 116 (to stay within my monitors Gsync window) it struggles to stay there, frequently bouncing down to 105 ish. Sometimes accompanied by quite severe stuttering.
> 
> When I had the 3090 I would just cap in nvcp to 116 never saw a single deviation (unless the GPU was maxed out) but with the 4090 its sitting at like 40 - 50% usage and unable to hold a steady 116. I've found leaving nvcp settings alone and turning vsnc on in game is the best way to limit fps without awful stuttering, but that isn't ideal because I would rather it be inside G-sync range.
> 
> I've spent weeks at this point trying to find a way to make this work, and also searching forums/reddit etc for people having same issue, but surprisingly its been completely quiet. Which leads me to wonder if it is a problem with my system only.
> 
> Are you guys seeing odd behavior with it bouncing off the cap? If not I would be super interested to hear what methods you all are using.
> 
> At this point I'm just getting so exasperated with trying to make it work I'm super tempted to sell the 4090 and get a 3080ti or something and sit this gen out.
> 
> On an unrelated note if anyone in Australia want's a 4090 strix ....


I was having a similar issue. This fixed it for me based on info in this thread. G-Sync on. Frame cap on. And then what fixed it, VSync on in NVCP.


----------



## GRABibus

J7SC said:


> ...LTT is just a few km from here in the direction of the camera-pic, and normally I would be able to see 'the smoke' from their experiment, but not today...
> View attachment 2589307


Awful country 😂
Which one is it ?


----------



## motivman

yzonker said:


> Yea that's the thing I find annoying about benchmarking sometimes. All the little tricks people use and a lot of people won't share them. I get that, but it's not like we're here to win anything. Looks like I got too greedy with the chiller yesterday and hit the friggin' cold bug again without realizing it. Ran this one at ambient. Several hundred points higher and about where I was expecting originally.
> 
> View attachment 2589297


do you mind sharing your chiller setup? Would love to have a chiller I can swap with my Mora 360 when benching….


----------



## J7SC

GRABibus said:


> Awful country 😂
> Which one is it ?


...the one with a lot of snow in the winter; great for Christmas break skiing in Whistler ! Also, shouldn't you be thinking about 'Messi' soccer ? 

---
...not on full clocks yet, but this one is decent...doing comparative runs with ECC on/off...5950X not really a burden at Superpos8K, 99% utilization


----------



## GRABibus

J7SC said:


> ...the one with a lot of snow in the winter; great for Christmas break skiing in Whistler ! Also, shouldn't you be thinking about 'Messi' soccer ?


I supported Argentina 😉


----------



## J7SC

GRABibus said:


> I supported Argentina 😉


...I suggest you put a big sign about that on your car's windows, hood and doors, drive up and down the Av. des Champs-Élysées - then park your car overnight in the open 😂


----------



## zhrooms

tvbaas said:


> Some details pictures and info on Gigabyte Aorus 4090 Master VRM


Thanks, added. 
(SiC653A and uP9512U)


----------



## SilenMar

kx11 said:


> Aorus Xtreme Waterforce REV1.1 now got a 1000w recommended specs while REV1.0 (the one i got) is rated 800w
> 
> 
> anyone got that bios?? i'll flash it on mine


Chances are you'd need to RMA the rev 1.0 card for a component swap to use the rev 1.1 600W BIOS in the long run. 
At least the cooling plate would need to be changed, as well as the fan controller. 
And once a GPU has been through RMA, its value holds up even worse over time.


----------



## GRABibus

J7SC said:


> ...I suggest you put a big sign about that on your car's windows, hood and doors, drive up and down the Av. des Champs-Élysées - then park your car overnight in the open 😂


😂😂


----------



## kryptonfly

I scored 39,091 in Time Spy

At 3165 MHz, clock-locked via nvidia-smi the whole way through (especially the 2nd test), 20°C ambient. ReBAR is disabled.


----------



## AdamK47

yzonker said:


> Yea that's the thing I find annoying about benchmarking sometimes. All the little tricks people use and a lot of people won't share them. I get that, but it's not like we're here to win anything. Looks like I got too greedy with the chiller yesterday and hit the friggin' cold bug again without realizing it. Ran this one at ambient. Several hundred points higher and about where I was expecting originally.
> 
> View attachment 2589297


It's all about posturing. Gratification that only exists in their head.

Real world usage by gaming with the hardware is far more important.


----------



## AvengedRobix

Semi OT, but this on OLED with the 4090 is fantastic!!


----------



## yzonker

motivman said:


> do you mind sharing your chiller setup? Would love to have a chiller I can swap with my Mora 360 when benching….


Nothing special really. Just this 1/4 hp chiller,



https://a.co/d/9BYMUcQ



Some people insist you have to have 1/2hp but that hasn't been my experience. If you wanted to maintain sub 10C under load, then probably so. But I just use it for benching. 

I also have an extra pump on the chiller because chillers operate more efficiently at flow rates above normal water cooling loop rates.

Just put the chiller in your loop either permanently or use QDCs to take it in/out. The chiller itself doesn't have very much flow restriction so it won't hurt flow too much. Dual D5s will work, particularly if you have a fairly short run to the chiller. Mine is in an adjacent room with about 12 feet of line each direction.

It's not loud though, particularly in its normal mode. It has a boost mode that is a little louder. I mainly put it in the other room to dump the heat in there.

I'm using an OCTO to control fans and have curves set up to stop my fans around 21C.


----------



## schoolofmonkey

AvengedRobix said:


> Semi OT.. but this, on Oled and the 4090 is fantastic!!
> View attachment 2589330


Which one you using, because I gave up on my LG C2 42" due to the horrible VRR flicker in low light scenes.


----------



## motivman

yzonker said:


> Nothing special really. Just this 1/4 hp chiller,
> 
> 
> 
> https://a.co/d/9BYMUcQ
> 
> 
> 
> Some people insist you have to have 1/2hp but that hasn't been my experience. If you wanted to maintain sub 10C under load, then probably so. But I just use it for benching.
> 
> I also have an extra pump on the chiller because chillers operate more efficiently at flow rates above normal water cooling loop rates.
> 
> Just put the chiller in your loop either permanently or use QDCs to take it in/out. The chiller itself doesn't have very much flow restriction so it won't hurt flow too much. Dual D5s will work, particularly if you have a fairly short run to the chiller. Mine is in an adjacent room with about 12 feet of line each direction.
> 
> It's not loud though, particularly in its normal mode. It has a boost mode that is a little louder. I mainly put it in the other room to dump the heat in there.
> 
> I'm using an OCTO to control fans and have curves set up to stop my fans around 21C.


So the chiller is just connected directly to your loop with QDCs, with no additional reservoir for the chiller except the res in your loop?


----------



## yzonker

motivman said:


> so chiller is just connected directly to your loop with QDC's, there is no additional reservoir for the chiller, except the res in your loop?


I do have a small res with a D5 also (left over from an old build) that I didn't bother to mention a bit ago. But it's there so I can run the chiller loop separately if I want. Lets me fill it independently for one thing. I've also used it to leak test new components like blocks before I put them in my machine. 

But it's completely unnecessary for using with your loop and can actually cause some issues. If both reservoirs are partially full, the water can/will migrate from one to the other and cause one of them to be too low while the other is completely full. I'd honestly recommend against it unless you want that same functionality or just want to add more water volume to the loop. 

The PMP-500 pump is in the insulated box you see just to the right of the res. That's a powerful pump, but loud. No way I would use it in the same room. I put it in that box to kill some of the noise as I could still hear it slightly through the friggin' wall!! 

I'd say I'm overkill on pumps. 3 D5's is certainly enough. 2 D5's is enough if you just put the chiller close to your machine.


----------



## J7SC

...this piece of art is called _"Still Life with OLED Fireplace on a Snowy Sunday" _ ...NFT available for $11.5 million


----------



## motivman

yzonker said:


> I do have a small res with a D5 also (left over from an old build) that I didn't bother to mention a bit ago. But it's there so I can run the chiller loop separately if I want. Lets me fill it independently for one thing. I've also used it to leak test new components like blocks before I put them in my machine.
> 
> But it's completely unnecessary for using with your loop and can actually cause some issues. If both reservoirs are partially full, the water can/will migrate from one to the other and cause one of them to be too low while the other is completely full. I'd honestly recommend against it unless you want that same functionality or just want to add more water volume to the loop.
> 
> The PMP-500 pump is in the insulated box you see just to the right of the res. That's a powerful pump, but loud. No way I would use it in the same room. I put it in that box to kill some of the noise as I could still hear it slightly through the friggin' wall!!
> 
> I'd say I'm overkill on pumps. 3 D5's is certainly enough. 2 D5's is enough if you just put the chiller close to your machine.
> 
> View attachment 2589343


Think this will work?









Amazon.com: Poafamx Aquarium Chiller 79Gal 1/3 HP Water Chiller for Hydroponics System Home Use Axolotl Fish Coral Shrimp 110V with Pump and Pipe : Pet Supplies





www.amazon.com


----------



## motivman

yzonker said:


> I do have a small res with a D5 also (left over from an old build) that I didn't bother to mention a bit ago. But it's there so I can run the chiller loop separately if I want. Lets me fill it independently for one thing. I've also used it to leak test new components like blocks before I put them in my machine.
> 
> But it's completely unnecessary for using with your loop and can actually cause some issues. If both reservoirs are partially full, the water can/will migrate from one to the other and cause one of them to be too low while the other is completely full. I'd honestly recommend against it unless you want that same functionality or just want to add more water volume to the loop.
> 
> The PMP-500 pump is in the insulated box you see just to the right of the res. That's a powerful pump, but loud. No way I would use it in the same room. I put it in that box to kill some of the noise as I could still hear it slightly through the friggin' wall!!
> 
> I'd say I'm overkill on pumps. 3 D5's is certainly enough. 2 D5's is enough if you just put the chiller close to your machine.
> 
> View attachment 2589343


This is my loop setup: CPU block, GPU block, and RAM block; 420mm and 240mm internal radiators; four D5 pumps and lots of quick disconnects. Current flow with all D5 pumps at 100 percent is 265 l/h. So I just get a water chiller and connect it at the back where my Mora 360 usually connects, and I should be good? No need to use the pump included with the chiller, or an extra water tank/res for the chiller?
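Since you know your flow rate, you can ballpark how much the coolant warms up crossing the blocks. A minimal sketch in Python, assuming pure water (about 4186 J/(kg*K), roughly 1 kg per litre) and a hypothetical 850W combined CPU+GPU load (pick your own number):

```python
# Steady-state coolant temperature rise across the blocks at a given
# flow rate. Assumes pure water (~1 kg/L, c_p ~4186 J/(kg*K)).
CP_WATER = 4186.0  # J/(kg*K)

def delta_t(heat_w: float, flow_l_per_h: float) -> float:
    """Coolant temp rise in K for a given heat load and flow rate."""
    mass_flow = flow_l_per_h / 3600.0  # kg/s, water ~1 kg per litre
    return heat_w / (mass_flow * CP_WATER)

# e.g. a 265 l/h loop with a hypothetical 600 W GPU + 250 W CPU:
print(f"{delta_t(850, 265):.1f} K rise across the loop")
```

At 265 l/h even a heavy combined load only lifts the coolant a couple of degrees per pass, which is part of why extra flow beyond this range gives diminishing returns.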


----------



## yzonker

motivman said:


> Think this will work?


I don't see any reason why it wouldn't work. It does have a minimum set temp of 5C vs 3C on mine. Some people hotwire them to run continuously, but mine starts to struggle some below 6C anyway so none of that may really matter.


----------



## yzonker

motivman said:


> This is my loop setup. CPU block, GPU block and Ram block, 420mm and 240mm internal radiator, four d5 pumps and Lots of Quick disconnects. Current flow with all d5 pumps at 100 percent is 265 l/h. So just get waterchiller and connect it to the back where my Mora 360 usually connects and I should be good? no need to use the pump included with the chiller, or extra water tank/res for the chiller?
> 
> View attachment 2589350
> 
> View attachment 2589349


I was forgetting you had a gaggle of pumps. Yea you don't need anything else. Just have to manage to fill/bleed it. 

I have external rads too and either plug in the chiller in place of them or just put it in series with the whole thing. It does cool faster if I remove them. That's as much or more about flow as any additional heat load though.


----------



## motivman

yzonker said:


> I was forgetting you had a gaggle of pumps. Yea you don't need anything else. Just have to manage to fill/bleed it.
> 
> I have external rads too and either plug in the chiller in place of them or just put it in series with the whole thing. It does cool faster if I remove them. That's as much or more about flow as any additional heat load though.


Nice. I think I am getting the chiller. Amazon has a good return policy, so if it doesn't work, back it goes, lol. Thanks for all your help man.


----------



## kx11

SilenMar said:


> Chances are you need to RMA rev1.0 back to do components swap to use rev1.1 600W BIOS in the long run.
> At least the cooling plate needs to be changed as well as the fan controller.
> Once the GPU goes through RMA. The value doesn't stand against time even more.


Yeah, I'm not f***** with the GB RMA process, I hear it's horrible


----------



## Nico67

yzonker said:


> Nothing special really. Just this 1/4 hp chiller,
> 
> 
> 
> https://a.co/d/9BYMUcQ
> 
> 
> 
> Some people insist you have to have 1/2hp but that hasn't been my experience. If you wanted to maintain sub 10C under load, then probably so. But I just use it for benching.
> 
> I also have an extra pump on the chiller because chillers operate more efficiently at flow rates above normal water cooling loop rates.
> 
> Just put the chiller in your loop either permanently or use QDCs to take it in/out. The chiller itself doesn't have very much flow restriction so it won't hurt flow too much. Dual D5s will work, particularly if you have a fairly short run to the chiller. Mine is in an adjacent room with about 12 feet of line each direction.
> 
> It's not loud though, particularly in its normal mode. It has a boost mode that is a little louder. I mainly put it in the other room to dump the heat in there.
> 
> I'm using an OCTO to control fans and have curves set up to stop my fans around 21C.


Interesting  does the 1/4 hp have any runaway issues if you're gaming for a few hours?

I have been a little concerned my 1/2 hp was going to run into issues with a 4090 + Intel CPU. Really they are only rated at 790W, but that is in a large tank etc. Mine runs 24/7 (when I use it) at about a 21C set point, to avoid condensation. 30-litre res with three loops: chiller, CPU and GPU.
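On the condensation point: how low you can safely set the chiller depends on your room's dew point, not a fixed number. A quick estimate with the standard Magnus approximation (coefficients 17.62 / 243.12 are the usual ones; treat the result as a guide, since the real margin depends on your room):

```python
import math

# Magnus-formula dew point approximation (valid roughly 0-60 C).
B, C = 17.62, 243.12  # standard Magnus coefficients

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point in Celsius from air temp and relative humidity."""
    gamma = math.log(rel_humidity_pct / 100.0) + B * temp_c / (C + temp_c)
    return C * gamma / (B - gamma)

# e.g. a 25 C room at 50% RH
print(f"{dew_point_c(25, 50):.1f} C dew point")
```

A 21C set point sits comfortably above the dew point of most rooms unless humidity is very high, which matches your experience.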


----------



## lmfodor

Hi everyone, last week I bought a Gigabyte Gaming OC, coming from a ROG Strix 3090. The change seems fantastic! I just wanted to ask about two issues: one I don't know whether it makes sense to worry about, and another that is quite common but has different approaches. First, I tested the silicon lottery to see how the card behaved with the factory OC and with some manual OC. What caught my attention is that the two outer fans report a 5% difference between right and left, both in MSI AB and in GPU-Z. This never happened to me before, and I don't know if it's a defect or if it doesn't matter. I don't live in the US and RMA policies are not so easy in my country, so I'd appreciate advice on whether you see this as critical or normal. Note that the fan speed of the first fan reports 67% but a higher RPM value, while the second reports 71% but a lower RPM value, which makes no sense!








On the other hand, the big question about the cable: I have a Phanteks P500A case. In width, I don't see much difference between the mobo-to-glass distance here and on other full-tower models like the iCUE 7000X or other large cases. They are all more or less 25 cm wide, but the important thing is how the blessed cable bends. I saw several videos where many bend it lower, and I managed to do it while trying to respect the 3.5 cm clearance, though it bends a bit when you close the glass. I can't get third-party cables like Corsair or CableMod here. I'm worried the adapter runs too hot, although on a trip to the US at some point I'll be sure to pick one up.








Finally, I see very good results in Port Royal with the same GPU; does the *Galax BIOS* work better on the Gigabyte? With +145 on the clock and +800 MHz on the memory I scored 27,156, with a 2,940 MHz clock frequency, an average clock of 2,900 MHz, a 1,425 MHz memory clock, and an average temperature of 60C. Note that the fans in general, without OC, sit at 65/70% and keep the GPU temperature very low, and with an undervolt, adjusting only the power limit to 90/80%, the performance is very good for the consumption and frequencies I get. Last question: is it still worthwhile to use curves, say 0.975 V at 2750 MHz? I've seen many contradictions about the best way to do a UV; some only touch the PL and others work on the curve.

I appreciate your advice; this forum and community are incredible!


----------



## SilenMar

kx11 said:


> Yeah i'm not f***** with GB RMA process i hear it's horrible


That at least happens when it is done the proper way. 
The other way they can do it is just to take off the exposed MOSFETs and inductors, turning the 24-phase design into 22 phases with the original plate. 
It can probably still reach 600W, but it would be a wasted 600W.


----------



## Gandyman

coelacanth said:


> I was having a similar issue. This fixed it for me based on info in this thread. G-Sync on. Frame cap on. And then what fixed it, VSync on in NVCP.


Thanks, I'll give it a go. Frame cap with NVCP or RTSS?


----------



## Gandyman

coelacanth said:


> I was having a similar issue. This fixed it for me based on info in this thread. G-Sync on. Frame cap on. And then what fixed it, VSync on in NVCP.


Just tried this with no result.

In this scene, for example, I'm getting 173 fps fully uncapped



Spoiler: Uncapped















With the cap on it just bounces between 110 and 116 every 3 to 5 seconds, with terrible stutters



Spoiler: After Cap
















But my monitor's OSD real-time fps counter tells the real story. Even though the cap reading jumps between 110 and 116, the OSD fps counter on the monitor shows constant dips into the 70 fps range. I thought perhaps my G-Sync module was failing, but I tried it on my LG CX and it does the same thing. 

With just V-Sync on it sits at 120 with only infrequent dips to 119 and a very gentle stutter



Spoiler: Only Vsync ON















This is better but still not ideal. The monitor's OSD fps counter shows a solid 120 fps, but the occasional stutter is still less than ideal, and I would much rather be in the G-Sync window.

Any thoughts from anyone would be appreciated


----------



## dk_mic

Gandyman said:


> This is better but still not ideal. The monitors OSD FPS counter shows a solid 120 fps, but the occasional stutter is still less than ideal, and I woudl much rather be in the Gsync window.
> Any thoughts from anyone would be appreciated


Does this happen in several games or only specific engines/platforms?
Have you tried flashing another BIOS / updating the Strix BIOS to the latest version?
Are you running the latest motherboard BIOS version?
Have you tried a clean install of the Nvidia drivers (DDU in safe mode), and possibly current vs. older drivers?
Are you running PCIe 4.0 @ x16? (Check in GPU-Z; you may need to activate the render test.)

edit: maybe try the different low latency modes


----------



## dk_mic

lmfodor said:


> What I noticed, that caught my attention, is that the two external fans report both in MSI AB and in GPUZ a difference of 5% between the right and left. This never happened to me, I don't know if it was a defect, or if it didn't matter.
> 
> On the other hand, the big question about the cable, I have a Phantheks P500A case. In width, I don't see much difference between the distance of the mobo against the glass of other full tower models like the ICUE 7000x or other large ones. They are all more or less 25 cm wide, but the important thing is how the blessed cable bends. I saw several videos that many bend it lower and I managed to do it trying to respect the 3.5 cm, which bends a bit when you close the glass.
> 
> Last question, is it still convenient to use curves? that is to say 0.975v at 2750? I saw many contradictions in the best practice to do a UV. Some only touch the PL and others go around the curve.
> 
> I appreciate your advices, this forum and community are incredible!


I am not sure what is going on with your fans; I would expect them to run at the same %, varying a bit in RPM. I don't think the GB Gaming OC has two separate fan curves. As long as they spin similarly you should be fine. You could also set fan curves manually in Afterburner or with the program FanControl and see how the fans react.

You will be fine with the adapter and that bend. Just make sure it sits properly and is fully inserted.

Using the curve works great; just make sure you are stable at the given voltage/frequency. The point of stability can differ from load to load (i.e. it might be fine in regular raster but crash in RT).
I am currently using 0.875 V @ 2535 MHz. This leaves performance on the table, but still pushes more than enough frames for my games right now.

Galax BIOS has the best benchmark scores, but you won't notice much in everyday gaming.


----------



## Gandyman

dk_mic said:


> does this happen in several games or only specific engines/platforms?
> have you tried to flash another bios / update the strix bios to the latest version?
> are you running the the latest motherboard bios version?
> have you tried to clean install nvidia drivers (ddu in safemode), eventually tried current / older drivers?
> are you running PCIe x16 4.0 @ x16 4.0 (check in gpu-z, eventually activate the render test)
> 
> edit: maybe try the different low latency modes


Thanks for the reply. Just for background, I've run a PC build/repair shop for over 15 years, so I've tried a tonne of things.

This happens in probably 50% of the games I've tried, with the other 50% being perfectly fine. Will list ones I've tried at the end.

I have updated the bios and firmware of the Strix, I haven't tried another brand bios however. Do you think that may help?

Yes, I am running the latest BIOS. I have a 13900K on a Z690 Apex. I updated the ME firmware and BIOS as recommended by ASUS. Something in the back of my mind wonders whether it would still happen if I had gotten a Z790 motherboard with native support.

I have clean installed drivers properly, even fresh installed windows.

Yes, it is running PCIe 4.0 x16. Funny story with that: my first 4090 Strix was limited to 4.0 x8. I sent it back to Asus; they confirmed the fault and sent me a replacement.

The low latency modes seem to make things worse, as they make a CPU bound situation even more CPU bound.

The games I've tested so far that WON'T cap are:

FC:new dawn
FC:6
AC: Odyssey
WD:Legion
Witcher 3 4.0 update (dx11)
Crysis 2 remaster
Crysis 3 remaster

The games that DO cap fine are

AC:Valhalla
Division 2
CP2077
H:ZeroDawn
Hitman 3
Metro Enhanced
Dying Light 2

This happens on both the Acer X34P with a G-Sync module (the 120 Hz 2018 version, but bought in 2020) and the LG CX 4K OLED with FreeSync.
It seems to be worse in more CPU-bound scenarios.
For what it's worth, whenever I am below the cap (e.g. CP2077 or WD: Legion with ultra RT and DLSS off), performance is great, no stutters, and my monitor OSD shows a 1:1 fps counter to monitor refresh. It doesn't bounce around to all hell like it does when I try to introduce a cap.


----------



## dk_mic

Gandyman said:


> The games I've tested so far that WONT cap are:
> 
> FC:new dawn
> FC:6
> AC: Oddesy
> WD:Legion
> Witcher 3 4.0 update (dx11)
> Crysis 2 remaster
> Crysis 3 remaster


This is FC6 capped to 140 FPS via NVCP, gsync on, monitor overlay on










I think it works as it's supposed to?


----------



## AvengedRobix

schoolofmonkey said:


> Which one you using, because I gave up on my LG C2 42" due to the horrible VRR flicker in low light scenes.


LG c1 48… in nv panel fps at 118, low latency ultra, vsync enabled and gsync compatible.. in game vsync off


----------



## EEE-RAY

I have a 4090 and tried to play the remasters. I ticked all the eye candy, yet my fps was only around 60 and my card was drawing only 200 watts. What's going on?


----------



## schoolofmonkey

AvengedRobix said:


> LG c1 48… in nv panel fps at 118, low latency ultra, vsync enabled and gsync compatible.. in game vsync off


Yeah, I noticed the 55" C2 doesn't suffer from VRR flicker as badly as the 42", so it could just be the panel they use in the 42".


----------



## Dreams-Visions

StreaMRoLLeR said:


> I have G1 and certified cable, now after switching to 4090 from 3080 Ti my eARC dolby atmos Passthrough via SN11RG is broken aswell. Sometimes it automatically switch to stereo. Coming back to your topic, yes i think its software handshake problem


Understood, tyvm! I'll just keep an eye on it, I suppose. It's not a big inconvenience, just a new and unexpected thing. Hopefully it'll be resolved on the driver end soon.


----------



## yzonker

Nico67 said:


> Interesting  does the 1/4hp have any run away issues if your gaming for a few hrs?
> 
> I have been a little concerned my 1/2hp was going to run into issues with 4090+intel cpu. Really they are only rated at 790w, but that is in a large tank etc. Mine runs 24/7 (when I use it) at about 21c set point, to avoid condensation. 30litre res with three loops, chiller, cpu and gpu.


No, with a 500-600 W load it will stabilize around 15C. I think I'd have to run an unrealistic load, like Kombustor and y-cruncher combined, to get near the limit. It's rated at 3010 BTU/hr, which converts to about 882 W. That seems reasonable given the results I've seen.
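As a sanity check on that rating conversion (using the standard factor 1 W ≈ 3.412 BTU/hr), here is the arithmetic spelled out:

```python
# 1 watt of continuous heat load equals about 3.412142 BTU per hour,
# so a chiller's BTU/hr rating divided by that factor gives watts.
BTU_PER_HOUR_PER_WATT = 3.412142

def btu_hr_to_watts(btu_hr):
    """Convert a cooling capacity in BTU/hr to watts."""
    return btu_hr / BTU_PER_HOUR_PER_WATT

# The chiller discussed above is rated at 3010 BTU/hr.
print(round(btu_hr_to_watts(3010)))  # -> 882
```

So a 500-600 W combined CPU+GPU loop sits comfortably under the 882 W capacity, consistent with it stabilizing around 15C.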


----------



## Brads3cents

Finally got around to setting up my water block. I'm excited to see the results.
This card is very average on memory: +1500 is what I feel I can run in games; in benches I can run 1550. The highest I was ever able to do is 1600, but it has to be hot enough.

The core, though, is close to golden. I can run Fire Strike at 3180 MHz with the stock air cooler. Unfortunately, it drops to 3150 because it can't hold temps on air.
I'm hoping that with my new water-cooling setup I can run 3240 and hold it.
It depends on the game/benchmark though; in something like Time Spy it won't get the same clocks.

I didn't get to try it out yet :/ ... I have to finish building out my Core P7.


I'm bummed out that I can't use the Galax BIOS, but I still wouldn't trade this card.


----------



## GRABibus

kryptonfly said:


> I scored 39 091 in Time Spy
> 
> At 3165 mhz SMI locked all the way, especially the 2nd test, 20°C ambient. REbar is disabled.


Please beat me in the Time Spy graphics score, French colleague... I am on air... 😜









3DMark Time Spy Graphics Score Hall of Fame


The 3DMark.com Overclocking Hall of Fame is the official home of 3DMark world record scores.




www.3dmark.com




Enable ReBAR and there will be 2 French guys in the top 10 👍


----------



## MrTOOSHORT

4090 on air or water = same


----------



## Brads3cents

Doubt it, especially for benchmarking. With a window open I can keep temps in the low 30s, which is way better than the 50s, and more importantly I can stabilize clocks on water.

I was able to get a shunted 3090 to max out at 43C pulling 700-800 watts. I won't get the same delta as I did with the Optimus block last generation, but at the same time power draw will be less. It will never see 50 degrees with my 1920mm rad setup, and that's much cooler than the mid-60s the card sees gaming on air.


----------



## Lmah2x

Over the weekend I did some testing with the 4090 FE as I had time to try to figure how I wanted to run the gpu. I wanted to settle on an overclock, set it and forget it...

For these tests I used 1.1v, 120% Power Target, 60% Fan Speed

I verified that GPU Core overclock does just about nothing in Cyberpunk 2077 Benchmark (4K, DLSS Quality, Psycho RT, Max settings)
+225 Core/+1600 Memory: 80.06 FPS
+135 Core/+1600 Memory: 80 FPS
+0 Core/+1600 Memory: 79.92 FPS
+0 Core/+0 Memory: 74.51 FPS

I also tested in 3DMark Speed Way and only saw a score variance of 30-60 points between overclocked and stock. I further tested in Metro Exodus Enhanced and the results were also within margin of error.

Afterwards I tested with undervolts in Cyberpunk 2077.
OC+UV, [email protected]/+1600 Memory: 77.8 FPS
Stock Curve up to, [email protected]/+1600 Memory: 77.6 FPS
OC+UV, [email protected]/+0 Memory: 74.4 FPS
Stock Curve up to, [email protected]/+0 Memory: 74.3 FPS

Is it just me, or does overclocking the GPU core do next to nothing? I was GPU-bound in every test I ran, so I think it is highly unlikely it's my 5800X3D. As for VRAM overclocking, I saw a 7-8% improvement, which I further verified in GotG, SoTTR, Cyberpunk 2077, Hitman 3 and Metro Exodus Enhanced, all with the same result. I felt like I was wasting my time overclocking the 4090. I ended up settling on an undervolt with the stock curve, 1.0 V, +1600 memory, 95% PT, which is still a 5% improvement over stock.
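Taking the Cyberpunk numbers above at face value, a quick bit of arithmetic (just a sketch over the quoted figures, nothing more) shows where the gain actually comes from:

```python
def pct_gain(new, base):
    """Percentage improvement of `new` over `base`."""
    return (new / base - 1) * 100

baseline = 74.51   # +0 core / +0 memory, FPS
mem_only = 79.92   # +0 core / +1600 memory
core_mem = 80.06   # +225 core / +1600 memory

print(f"memory OC alone:   +{pct_gain(mem_only, baseline):.1f}%")  # ~7.3%
print(f"core OC on top:    +{pct_gain(core_mem, mem_only):.2f}%")  # ~0.18%
```

The memory overclock accounts for essentially the whole improvement; the +225 core offset adds well under 1%, which matches the "core OC does next to nothing" observation.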

Anyone have a similar experience?


----------



## StreaMRoLLeR

Lmah2x said:


> Over the weekend I did some testing with the 4090 FE as I had time to try to figure how I wanted to run the gpu. I wanted to settle on an overclock, set it and forget it...
> 
> For these tests I used 1.1v, 120% Power Target, 60% Fan Speed
> 
> I verified that GPU Core overclock does just about nothing in Cyberpunk 2077 Benchmark (4K, DLSS Quality, Psycho RT, Max settings)
> +225 Core/+1600 Memory: 80.06 FPS
> +135 Core/+1600 Memory: 80 FPS
> +0 Core/+1600 Memory: 79.92 FPS
> +0 Core/+0 Memory: 74.51 FPS
> 
> I also tested in 3DM SpeedWay and only saw a score deviance of 30-60 points between overclocks and stock. Further tested in Metro Exodus Enhanced and results were also within margin of error.
> 
> Afterwards I tested with undervolts in Cyberpunk 2077.
> OC+UV, [email protected]/+1600 Memory: 77.8 FPS
> Stock Curve up to, [email protected]/+1600 Memory: 77.6 FPS
> OC+UV, [email protected]/+0 Memory: 74.4 FPS
> Stock Curve up to, [email protected]/+0 Memory: 74.3 FPS
> 
> Is it just me or does overclocking the GPU core do next to nothing? I was GPU bound in every test I ran so I think it is highly unlikely it's my 5800x3D. As for VRAM overclocking I saw a 7-8% improvement, which I also further tested in GotG, SoTTR, Cyberpunk 2077, Hitman 3 and Metro Exodus Enhanced and it all had the same result. I felt like I was wasting my time overclocking the 4090. I ended up settling with an Undervolt with the stock curve, 1.0v, +1600 Memory, 95% PT which is still a 5% improvement over stock.
> 
> Anyone have a similar experience?


Try other hungry AAA games like A Plague Tale: Requiem with DLSS off.

I ended up with the power slider at 90% plus +50 on the core, which works out to 2810 MHz effective, pulling 350-360 W max while holding 49-51C max on my Suprim Liquid X.

Core OC has made only a negligible difference since Ampere.


----------



## yzonker

Lmah2x said:


> Over the weekend I did some testing with the 4090 FE as I had time to try to figure how I wanted to run the gpu. I wanted to settle on an overclock, set it and forget it...
> 
> For these tests I used 1.1v, 120% Power Target, 60% Fan Speed
> 
> I verified that GPU Core overclock does just about nothing in Cyberpunk 2077 Benchmark (4K, DLSS Quality, Psycho RT, Max settings)
> +225 Core/+1600 Memory: 80.06 FPS
> +135 Core/+1600 Memory: 80 FPS
> +0 Core/+1600 Memory: 79.92 FPS
> +0 Core/+0 Memory: 74.51 FPS
> 
> I also tested in 3DM SpeedWay and only saw a score deviance of 30-60 points between overclocks and stock. Further tested in Metro Exodus Enhanced and results were also within margin of error.
> 
> Afterwards I tested with undervolts in Cyberpunk 2077.
> OC+UV, [email protected]/+1600 Memory: 77.8 FPS
> Stock Curve up to, [email protected]/+1600 Memory: 77.6 FPS
> OC+UV, [email protected]/+0 Memory: 74.4 FPS
> Stock Curve up to, [email protected]/+0 Memory: 74.3 FPS
> 
> Is it just me or does overclocking the GPU core do next to nothing? I was GPU bound in every test I ran so I think it is highly unlikely it's my 5800x3D. As for VRAM overclocking I saw a 7-8% improvement, which I also further tested in GotG, SoTTR, Cyberpunk 2077, Hitman 3 and Metro Exodus Enhanced and it all had the same result. I felt like I was wasting my time overclocking the 4090. I ended up settling with an Undervolt with the stock curve, 1.0v, +1600 Memory, 95% PT which is still a 5% improvement over stock.
> 
> Anyone have a similar experience?


When I tested CP2077, I saw a larger increase from the mem OC, but I also saw an increase from the core OC larger than what you are showing. I think I tested at max PL, 1100 mV as a baseline. Core OC was ~2% and mem OC was ~3%. The exact numbers were something like 1.7% and 3.3%, but I would have to dig up my post to verify.


----------



## yzonker

And I see there was some additional escalation in Superposition scores overnight.


----------



## akgis

Anyone played FH5 with a 4090?

In the big city the FPS is inconsistent and stuttery, and neither the CPU nor the GPU is working at max. Lowering to 1080p doesn't help at all; same FPS with V-Sync on and off. This didn't happen with my 3090: the FPS at 4K was 90-ish in the city, but the GPU was pinned at 99% usage and frametimes were consistent.

Everywhere else it's fine at 144 fps, with GPU usage depending on the area but 80%+ with all the eye candy and RT. In the city the GPU is stuttery, the CPU sits at 25% with no core doing more than 30%, and the GPU is at 50% usage.


----------



## leonman44

Hey guys! I am planning to order a 4090 by the end of this month. I am looking mostly at the ROG Strix, MSI Suprim and Zotac Extreme, but which GPUs tend to overclock the best so far?

Also, I am planning to install a GPU waterblock, as I already have a hardtubing custom loop.


----------



## KedarWolf

leonman44 said:


> Hey guys ! I am planning to order a 4090 by the end of this month , I am looking mostly at ROG strix , MSI suprim , ZOTAC extreme but which gpus tend to overclock the best so far ?
> 
> Also I am planning to install a gpuwaterblock as I already have a hardtubing custom loop.


The Strix has the Optimus Water Cooling block, though pricey, their blocks are the very best, and the Matte Black looks soooooo good.









Signature 4090 Strix/TUF GPU Waterblock


IMPORTANT SHIPPING INFORMATION: Orders 11/15/22: Ship November and through December 2022 Orders 11/16/22: Ship January 2023 The highest performance GPU waterblock ever created, exclusively for the NVIDIA 4090 GPU. Compatible with: ASUS Strix 4090 OC Edition ASUS Strix 4090 ASUS TUF 4090 OC...




optimuspc.com


----------



## theilya

Anyone flashed their Zotac cards to 600w yet? How'd it go?


----------



## yzonker

KedarWolf said:


> The Strix has the Optimus Water Cooling block, though pricey, their blocks are the very best, and the Matte Black looks soooooo good.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Signature 4090 Strix/TUF GPU Waterblock
> 
> 
> IMPORTANT SHIPPING INFORMATION: Orders 11/15/22: Ship November and through December 2022 Orders 11/16/22: Ship January 2023 The highest performance GPU waterblock ever created, exclusively for the NVIDIA 4090 GPU. Compatible with: ASUS Strix 4090 OC Edition ASUS Strix 4090 ASUS TUF 4090 OC...
> 
> 
> 
> 
> optimuspc.com


Except memory cooling is even better on that block. lol


----------



## Nizzen

theilya said:


> Anyone flashed their Zotac cards to 600w yet? howd it go?


It doesn't matter which brand the 4090 is, as long as it isn't a 4090 FE. The BIOSes work on every model.


----------



## leonman44

KedarWolf said:


> The Strix has the Optimus Water Cooling block, though pricey, their blocks are the very best, and the Matte Black looks soooooo good.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Signature 4090 Strix/TUF GPU Waterblock
> 
> 
> IMPORTANT SHIPPING INFORMATION: Orders 11/15/22: Ship November and through December 2022 Orders 11/16/22: Ship January 2023 The highest performance GPU waterblock ever created, exclusively for the NVIDIA 4090 GPU. Compatible with: ASUS Strix 4090 OC Edition ASUS Strix 4090 ASUS TUF 4090 OC...
> 
> 
> 
> 
> optimuspc.com


I know, right? I already looked at these blocks and they seem amazing, and you can just order one in copper and not have to deal with EK's junk nickel plating wearing off for no reason. But I don't think there's a retail store in Europe, and I have a mostly Aura Sync RGB build (although I use static colors), so it seems I don't have another choice except EK again.

Also, it's sad that the PCB is so small that watercooled cards end up too small for my taste, and I have a full tower; to make it look nice I may have to get the active backplate as well.


----------



## alasdairvfr

akgis said:


> Anyone played FH5 with a 4090
> 
> In the big city the FPS is inconstant and stuttery and nor CPU or GPU is working at max, lowering to 1080p doesnt help at all, same FPS Vsync on and off, this didnt happened with my 3090 the FPS at 4K was 90ish on city but GPU was pinned at 99% usage and frametimes were consistent
> 
> Everywhere else is fine 144fps, GPU usage depending on the area but 80%+ all eyecandy and RT, on the city GPU is stuttery and cpu at 25% with no core doing more than 30%, GPU at 50% usage.


There is a weird issue with this game on SSDs; are you running it from an SSD or HDD?

Can you run the benchmark and see what your avg fps and stutter count are? I didn't notice terrible performance at 4K, but I haven't played more than maybe 60 minutes since getting the 4090.


----------



## Lmah2x

akgis said:


> Anyone played FH5 with a 4090
> 
> In the big city the FPS is inconstant and stuttery and nor CPU or GPU is working at max, lowering to 1080p doesnt help at all, same FPS Vsync on and off, this didnt happened with my 3090 the FPS at 4K was 90ish on city but GPU was pinned at 99% usage and frametimes were consistent
> 
> Everywhere else is fine 144fps, GPU usage depending on the area but 80%+ all eyecandy and RT, on the city GPU is stuttery and cpu at 25% with no core doing more than 30%, GPU at 50% usage.


A few things to try....

Cap fps at 138 in either NVCP or RTSS
If you're using NVCP V-Sync try the In-game V-Sync instead, or the opposite
Try disabling Nvidia GeForce Experience Overlay if you have it enabled (I've had issues with this and FH5 in the past)
If you have HWInfo/Aida64/Etc open or similar software that monitors a lot of sensors try disabling it (RTSS wouldn't cause a problem since it only monitors CPU/GPU) and test


----------



## AvengedRobix

theilya said:


> Anyone flashed their Zotac cards to 600w yet? howd it go?


Zotac amp Extreme and Galax BIOS with no problem


----------



## th3illusiveman

Lmah2x said:


> Over the weekend I did some testing with the 4090 FE as I had time to try to figure how I wanted to run the gpu. I wanted to settle on an overclock, set it and forget it...
> 
> For these tests I used 1.1v, 120% Power Target, 60% Fan Speed
> 
> I verified that GPU Core overclock does just about nothing in Cyberpunk 2077 Benchmark (4K, DLSS Quality, Psycho RT, Max settings)
> +225 Core/+1600 Memory: 80.06 FPS
> +135 Core/+1600 Memory: 80 FPS
> +0 Core/+1600 Memory: 79.92 FPS
> +0 Core/+0 Memory: 74.51 FPS
> 
> I also tested in 3DM SpeedWay and only saw a score deviance of 30-60 points between overclocks and stock. Further tested in Metro Exodus Enhanced and results were also within margin of error.
> 
> Afterwards I tested with undervolts in Cyberpunk 2077.
> OC+UV, [email protected]/+1600 Memory: 77.8 FPS
> Stock Curve up to, [email protected]/+1600 Memory: 77.6 FPS
> OC+UV, [email protected]/+0 Memory: 74.4 FPS
> Stock Curve up to, [email protected]/+0 Memory: 74.3 FPS
> 
> Is it just me or does overclocking the GPU core do next to nothing? I was GPU bound in every test I ran so I think it is highly unlikely it's my 5800x3D. As for VRAM overclocking I saw a 7-8% improvement, which I also further tested in GotG, SoTTR, Cyberpunk 2077, Hitman 3 and Metro Exodus Enhanced and it all had the same result. I felt like I was wasting my time overclocking the 4090. I ended up settling with an Undervolt with the stock curve, 1.0v, +1600 Memory, 95% PT which is still a 5% improvement over stock.
> 
> Anyone have a similar experience?


It's the 5800X3D limiting your frames in that game. AMD CPUs don't handle raytracing as well as Intel's do; if you had a 13700K or better you would see gains. If you check your GPU usage, it likely dropped below 99% during your bench.

My card hits a hard cap of around 80 fps in the middle of the city with DLSS at lower resolutions, RT and high crowd density on, and the GPU at around 80% usage. I've looked at some benchmarks and it appears that even the new Zen 4 CPUs fall short of Intel when RT is turned on. It seems like an architectural bottleneck on that front, because in raster they are plenty quick.


----------



## Lmah2x

th3illusiveman said:


> it's the 5800X3D limiting your frames in that game. AMD CPUs don't do raytracing as well as intel does - if you had a 13700k+ you would see gains. If you check your GPU usage it likely dropped below 99% during your bench.
> 
> My card hits a hardcap of around 80 fps in the middle of the city with DLSS at lower resolutions with RT and high crowd density on and the GPU is around 80% usage. I've looked at some benchmarks and it appears that even the new Zen 4 CPUs fall short of intel when RT is turned on. seems like an Architectural bottleneck on that front, cause in raster they are plenty quick.


In the benchmark it stays at 99-100% the entire time.

The only games where I really saw RT performance bottleneck the CPU were Spider-Man Remastered and Hitman 3, but I will pay more attention in other RT games to see if I notice anything.


----------



## Betroz

th3illusiveman said:


> AMD CPUs don't do raytracing as well as intel does
> I've looked at some benchmarks and it appears that even the new Zen 4 CPUs fall short of intel when RT is turned on. seems like an Architectural bottleneck on that front,


Do you have a link to a review or research that explains why that is?
I am looking to upgrade my 10900K sometime in 2023 and am not sure whether to wait for the Ryzen 7000 3D, just go with a 13700K now, or wait even longer. Although I mostly play games with RT off (BF2042). I've got a 4090, so I'm CPU-limited right now.


----------



## dr/owned

KedarWolf said:


> The Strix has the Optimus Water Cooling block, though pricey, their blocks are the very best, and the Matte Black looks soooooo good.


$370 for a block is absurd though. On resale it'll be listed for $150 and will probably sit for a while, since secondhand buyers aren't gung-ho about having the most expensive block.

And their product page is gimmick on gimmick. They don't cite a single source for their "testimonials" and don't even say what wattage they use to arrive at a 10C delta. Hell, my block is at a 0C delta at idle.


----------



## yzonker

dr/owned said:


> $370 for a block is absurd though. On resale it'll be listing for $150 and probably sit for a while when secondhand people aren't gung-ho about having the most expensive block.
> 
> And their product page is gimmick on gimmick. They don't cite a single source for their "testimonials" and don't even say the wattage they use to arrive at a 10C delta. Hell my block is 0C delta at idle.


It's probably the best though; they have been in the past. Someone on reddit posted results too that were in line with what I would expect. I don't remember the exact numbers though. 

I was serious about my comment though. That block, along with the Heatkiller, uses 0.5mm pads on the VRAM, so even cheap pads may result in very low mem temps. And it wouldn't be very forgiving of leaving the backing on, since the pad is so thin (it would decrease core contact pressure). I think I would put some tape on the block and use thermal putty to fill the gap to try to increase mem temps.


----------



## dr/owned

yzonker said:


> It's probably the best though. They have been in the past. Someone on reddit posted results too that were in line with what I would expect. I don't remember the exact #s though.
> 
> I was serious about my comment though. That block along with Heatkiller use 0.5mm pads on the VRAM. So even using cheap pads may result in very low mem temps. And it wouldn't be very forgiving of leaving the backing on since the pad is so thin (decrease core contact pressure). I think I would put some tape on the block and use thermal putty to fill the gap to try to increase mem temps.


I think I found their twitter quote:










I can say my Alphacool block holds a +26C delta at 750W total card power draw in Furmark (measured from the AC mains). If I extrapolate that to 515W it's a 17.8C delta, but accounting for the ~4% AC-DC efficiency loss in the PSU it's really about an 18.6C delta. Anyway, regardless of the exact numbers, their claim that their block is double digits better than the next closest is absurd. Especially when the Alphacool is $250 total including international shipping, which most people in the US won't even be paying once PPC and MMM get their orders.
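The extrapolation above can be sketched in a few lines, under the two assumptions stated in the post (delta scales roughly linearly with heat load, and the PSU is ~96% efficient so the 750W wall draw overstates the heat actually dumped into the loop):

```python
def scaled_delta(measured_delta_c, measured_ac_w, target_dc_w, psu_eff=0.96):
    """Linearly rescale a measured water-to-core delta to a different heat load.

    measured_ac_w is wall (AC) draw; convert it to DC heat into the loop
    before scaling to the target DC wattage.
    """
    dc_w = measured_ac_w * psu_eff          # 750W AC -> ~720W of actual heat
    return measured_delta_c * target_dc_w / dc_w

# 26C delta measured at 750W from the mains, rescaled to a 515W card load.
print(round(scaled_delta(26, 750, 515), 1))  # -> 18.6
```

Ignoring the PSU loss (psu_eff=1.0) gives the naive 17.8C figure; correcting for it pushes the comparable delta up to ~18.6C, which is the number quoted.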

Personally I'm not a fan of 0.5mm thermal pads after messing with them on the Phanteks block before this Alphacool one. They're hard to handle, just taking the backing off can stretch them (thinning them out), and then there's even less tolerance for component and standoff height variations, or even very minor things like the torque on one screw being slightly higher than another, where 1/8 of a turn on an M4 thread pitch is about 0.1mm.


----------



## yzonker

dr/owned said:


> I think I found their twitter quote:
> 
> View attachment 2589451
> 
> 
> I can say my Alphacool block holds +26C at +750W total card power draw in Furmark (from the AC mains). If I extrapolate that to 515W it's 17.8C delta, but really with the 4% efficiency lost AC-DC in the PSU it's about 18.6C delta. Anyways...regardless of the exact numbers their claims of their block being double-digits better than the next closest is absurd. Especially when Alphacool is $250 total including international shipping which most people aren't going to be paying in the US once PPC and MMM get their orders.
> 
> Personally not a fan of 0.5mm thermal pads after messing with them on the Phanteks block previous to the Alphacool one. Hard to handle, taking even the backing off them can end up stretching them (thinning them out), and then there's even less tolerance for component and standoff height variations or even very minor things like the torque on a screw being slightly more than on another where 1/8 turn on M4 thread pitch is like 0.1mm.


I think they're referring to core power though, not board power. Board power in their example is 560W, so that would put you at 19.4C, plus the 4% (not sure I understand that part) taking it to 20.2C. 

Are you using LM on the die or a normal TIM?

In regards to the 0.5mm pads, yes, I agree, and that's why I used thermal putty on that block. It was awesome for my 3090, which has memory that actually responds positively to lower temps. Not so much for a 4090.


----------



## KedarWolf

MSI AB / RTSS development news thread


RTSS 7.3.4 Beta 6 Build 27502 is online, changes list is available a few posts above....




forums.guru3d.com





Afterburner 4.6.5 Beta 4, if you have Beta 3 though, no need to upgrade. Just adds support for new AMD cards.


----------



## Thebc2

I got my Optimus block installed this weekend and shared my impressions and testing in the Optimus thread. In short, fit and finish are top notch, and I was seeing roughly the same deltas they shared. My memory OC improved by 200 points despite the 0.5mm-thick pads, so I'm at a bit of a loss as to why I'm not seeing the memory instability others have. The hwinfo screenshot is from running Furmark; the temps on the Octo are water temps pre- and post-GPU block.

Overall very happy with it under water.






















----------



## AlienNight

Hi, I think I have low rail voltages. Is it normal?


----------



## lawson67

AvengedRobix said:


> Zotac amp Extreme and Galax BIOS with no problem


I'm guessing the RGB doesn't work on the Zotac card once you've flashed the Galax BIOS?


----------



## Brads3cents

Thebc2 said:


> I got my Optimus block installed this weekend and shared my impressions and testing in the Optimus thread. In short, fit and finish is top notch, and I was seeing roughly the same deltas they shared. My memory OC improved by 200 points despite the .5mm thick pads, so at a bit of a loss as to why I am not seeing the memory instability others have. The hwinfo screenshot is while running furmark, temps on the Octo are water temps pre and post GPU block.
> 
> Overall very happy with it under water.
> 
> 
> 
> 


I'm curious if it's because cold memory actually seems to be worse for VRAM overclocking.

When my memory is heated up I can run up to +1600; on a cold boot, about +1575 max.

If I open my window to let in cold air, my system sometimes freezes up at +1560 and reboots, with even +1550 being problematic.

Overclocking this generation is so boring: you want cold temps for the core but not for the memory. This could be forgiven if the VRAM OC weren't so important, with many benchmarks and games scaling better with VRAM. I miss the days when VRAM speed didn't matter much and only added a tiny bit; now you hope to win the lottery on both ends, and that's really, really hard.

It's mainly why I took so long to finally get a waterblock, even though I have a big investment in a huge GPU-only loop: 2x420 push/pull plus an additional 1080mm rad, with 2 pumps and 2 reservoirs.
Yes, I basically gave my GPU 5 radiators and my CPU only one 🤣
I wanted separate loops though.

Despite all that, I still debated even putting the card on water and almost skipped buying the EK block.


----------



## akgis

alasdairvfr said:


> There is a weird issue with this game on SSDs, are you running from SSD or HDD?
> 
> Can you run a benchmark and see what your avg fps is and stutter count? I didn't notice terrible performance at 4k but I didn't play more than maybe 60min since getting the 4090.


My average FPS is destroyed because the benchmark takes place in the city. The game is on an NVMe PCIe 4.0 SSD.

If you would be kind enough to test (if you have the game installed; the resolution doesn't matter): drive around the city a bit, try to go to the castle house, then work your way down. You should find places where FPS tanks but GPU/CPU usage drops. If you don't, it's still something on my PC I can't find.

I can't really play the game, because anything that goes through the city, free roam or tracks, can cause very bad input latency.



Lmah2x said:


> A few things to try....
> 
> Cap fps at 138 in either NVCP or RTSS
> If you're using NVCP V-Sync try the In-game V-Sync instead, or the opposite
> Try disabling Nvidia GeForce Experience Overlay if you have it enabled (I've had issues with this and FH5 in the past)
> If you have HWInfo/Aida64/Etc open or similar software that monitors a lot of sensors try disabling it (RTSS wouldn't cause a problem since it only monitors CPU/GPU) and test


Tried all that, even a fresh Windows 11 install on a different version (21H2), plus the oldest drivers that support the 4090 and some in between, all DDU-removed; the drivers were installed with NVclean, so no GeForce Experience. Still the same results.

RTSS or not, there are particular parts of the city that tank the FPS; you move the camera a bit and the framerate all comes back. It's really strange.

My 3090 didn't have this problem, but that could have been a different game build, I don't know. All other games so far are fine.


----------



## th3illusiveman

Betroz said:


> Do you have a link to a review or research that explains why that is?
> I am looking to upgrade my 10900K sometime in 2023, and not sure if I want to wait for Ryzen 7000 3D, or just go with a 13700K now - or wait even longer. Although I mostly play games with RT off (BF2042). Got a 4090 card, so CPU limited right now.


I'm not sure why; this link (German, may need Google Translate) kinda shows the difference though. AMD CPUs just drop more frames once RT is on.









Core i9-13900K, i7-13700K & i5-13600K: Gaming-Könige im Test: Benchmarks in Games


Intel Raptor Lake im Test: Benchmarks in Games / Leistung in Spielen (720p, RTX 3090 Ti) / CPU-Gaming-Leistung im absoluten CPU-Limit




www-computerbase-de.translate.goog


----------



## dr/owned

yzonker said:


> I think they're referring to core power though, not board power. Board power in their example is 560w. So that would put you at 19.4C plus the 4% (not sure I understand that part) taking it to 20.2C.
> 
> Are you using LM on the die or a normal TIM?
> 
> In regards to the 0.5mm pads, yes I agree and why I used thermal putty on that block. It was awesome for me 3090 that has mem that actually responds positively to lower temps. Not so much for a 4090.


Furmark only loads about 200MB into vram so I'm just assuming that most of the +750W when I run it is just the core (and I see 0C increase on the vram). The 4% thing is because I'm measuring on the AC side of the PSU so 100% isn't getting converted to DC for the GPU. For the AX1500i it's about 94-96% peak efficiency at 240V. So I'm figuring if the wall says 1000W then 960W is actually getting to the card.

Plain Kryonaut for the paste. I didn't want to mess with Conductonaut this time around because it didn't seem to make a huge difference on the 3090.
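The AC-to-DC correction and the delta extrapolation above can be sketched as follows; the ~96% efficiency figure and the linear delta-vs-power scaling are the same simplifying assumptions made in the posts, not measured values:

```python
# Estimate DC power reaching the card from an AC wall reading, given an
# assumed PSU efficiency, then scale the coolant-to-core temperature delta
# linearly with dissipated power (assumes fixed flow rate and ambient).
def dc_power(ac_watts: float, efficiency: float = 0.96) -> float:
    return ac_watts * efficiency

def scaled_delta(delta_c: float, power_from: float, power_to: float) -> float:
    # Linear proportionality of delta-T to power is an approximation.
    return delta_c * power_to / power_from

print(dc_power(1000))                        # ~960 W actually reaching the card
print(round(scaled_delta(26, 750, 515), 1))  # ~17.9 C delta at 515 W
```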


----------



## J7SC

MrTOOSHORT said:


> 4090 on air or water = same


...not too far from the truth, but in apps where core counts more than VRAM, w-cooling is still advantageous - apart from being whisper-quiet. For Superposition 8K w/the 5950X (1x or 2x CCDs) loop temps also play a role. Anyway, on my setup (all water, ambient), there is a really small temp window 'overlap' before I can run full VRAM speed but won't downclock too much on core.

...just missed top-10 in Superposition 8K (after being out in the snowy woods at - 10 C, brrrr), but haven't clocked the core up fully yet:


----------



## Lmah2x

th3illusiveman said:


> it's the 5800X3D limiting your frames in that game. AMD CPUs don't do raytracing as well as intel does - if you had a 13700k+ you would see gains. If you check your GPU usage it likely dropped below 99% during your bench.
> 
> My card hits a hardcap of around 80 fps in the middle of the city with DLSS at lower resolutions with RT and high crowd density on and the GPU is around 80% usage. I've looked at some benchmarks and it appears that even the new Zen 4 CPUs fall short of intel when RT is turned on. seems like an Architectural bottleneck on that front, cause in raster they are plenty quick.


I found the spot downtown you were talking about. My fps dropped to 90 with DLSS Performance, max settings, psycho RT, max crowd density. GPU usage was about 84%. But one thing you didn't mention: if you turn off RT, the bottleneck is still there. So I don't think it's RT-related; it's likely just an optimization issue.


----------



## AvengedRobix

lawson67 said:


> I am guessing all the RGB don't work on the Zotac card once you've flashed the Galax BIOS ?


I have the default RGB settings and haven't installed the Zotac utility to change them, but hey, it works.


----------



## yzonker

AvengedRobix said:


> I have the default RGB settings and haven't installed the Zotac utility to change them, but hey, it works.


Unless they changed it since 30 series, RGB isn't a part of the Vbios.


----------



## th3illusiveman

Lmah2x said:


> I found the spot downtown you were talking about. My fps dropped to 90 with DLSS Performance, max settings, psycho RT, max crowd density. GPU usage was about 84%. But one thing you didn't mention: if you turn off RT, the bottleneck is still there. So I don't think it's RT-related; it's likely just an optimization issue.


When I did my testing, turning off RT boosted frames, and turning down crowd density boosted frames. To be fair, at 4K fully maxed out with DLSS Quality the GPU will be pegged 90% of the time; it's people running at 1440p who may experience more bottlenecks.


----------



## Muut

The HOF bios is doing wonders on the Suprim X (stock cooler / ambient). Completely flat 3180MHz curve in Fire Strike Extreme; it made me jump from 44th place to 9th on the Hall of Fame ladder.

https://www.3dmark.com/fs/29086041


----------



## Nico67

yzonker said:


> No, with a 500-600w load it will stabilize around 15C. I think I'd have to run an unrealistic load, like Kombustor and yCruncher combined, to get near the limit. It's rated at 3010 BTU/hr which converts to 882w. That seems reasonable given the results I've seen.


Nice, it probably is 3010 BTU/hr at a rated temperature, possibly even with a rated "tank" capacity. I would like to see some graphs that show how the cooling scales, as it does seem weird that the 1/4 HP does 3010, the 1/2 does 4020, and the 1 does 10050. These ratings must take into account the recommended tank size and possibly the minimum set point, in which case I wonder how they'd rate at, say, 30 litres?


----------



## J7SC

@Muut 😀
---

I started to really get into the GAME (rather than benchmark) of Unigine's Superposition. Just like with their 'Valley' benchmark where you can roam and float through the forests and up the mountains , it is actually a blast, even w/o RTX, DLSS3 et al  ...key commands when in 'game' mode are at the bottom right of the screen.


----------



## Muut

lol didn't even know it was possible to play in superposition xD


----------



## J7SC

Muut said:


> lol didn't even know it was possible to play in superposition xD


...yeah, great fun. Same w/ Unigine Valley's game mode for roaming and flying and floating - heats up the GPU more than you think though when in game mode.


----------



## Gandyman

dk_mic said:


> This is FC6 capped to 140 FPS via NVCP, gsync on, monitor overlay on
> 
> View attachment 2589393
> 
> 
> I think it works as supposed to?


I don't know whether this is good news or bad news for me...










With the fps cap set to 116 I'm dropping all over the place; my monitor counter shows dips as low as 106 in the 2 minutes I played to take this screenie.

Any idea what could be wrong? A 13th-gen CPU in a Z690 mobo? I'm so far out of ideas.

Z790 Hero boards are 1100 bucks down under... very expensive experiment -_-


----------



## yzonker

Nico67 said:


> Nice, it probably is 3010 BTU/hr at a rated temperature, possibly even with a rated "tank" capacity. I would like to see some graphs that show how the cooling scales, as it does seem weird that the 1/4 HP does 3010, the 1/2 does 4020, and the 1 does 10050. These ratings must take into account the recommended tank size and possibly the minimum set point, in which case I wonder how they'd rate at, say, 30 litres?


I don't understand their ratings either, but if you look at the rated current draw of each unit, it scales directly with the BTU rating rather than the HP rating. 1/2 HP unit only draws a little more current than the 1/4 HP unit. 1 HP is much higher. 

Also, at one time I went to the trouble of back calculating the wattage using the temperature vs time graphs in the manual and it works out pretty close. 

And yes that rating is for a temp at or above ambient.
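For anyone checking chiller specs, the BTU/hr-to-watts conversion used in this exchange is just a constant factor (1 BTU/hr ≈ 0.29307 W):

```python
# Convert a chiller's cooling capacity from BTU/hr to watts.
BTU_PER_HR_TO_W = 0.29307107  # standard conversion factor

def btu_hr_to_watts(btu_hr: float) -> float:
    return btu_hr * BTU_PER_HR_TO_W

# The 1/4 HP unit's rating from the posts above:
print(round(btu_hr_to_watts(3010)))  # ~882 W, matching the figure quoted
```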


----------



## yzonker

Lmah2x said:


> I found the spot downtown you were talking about. My fps dropped to 90 with DLSS Performance, max settings, psycho RT, max crowd density. GPU usage was about 84%. But one thing you didn't mention: if you turn off RT, the bottleneck is still there. So I don't think it's RT-related; it's likely just an optimization issue.


Can you show where you are talking about? I drove around what I thought might be the area you are referring to with DLSS set to high performance. That was the most power I've seen my 13900k pull in a game. Nearly 200w and it's running bone stock (probably could undervolt it some but I haven't bothered).


----------



## doom26464

I just got my Zotac 4090 up and running and took it through a suite of games at stock speeds. On my 3440x1440 144Hz monitor this thing is overkill, smashing through the ceiling and then some. Compared to my 3080 it's a complete waste at that resolution, as I really only needed another 10-20% to max that monitor out.

On my 4K 120Hz OLED display, though, is where this thing shines. An unreal treat of a GPU; this is what I've been looking for for a while.

I haven't started overclocking yet, will get to that tomorrow, but looking at the performance gains I'm not expecting much. Also, no idea why everyone is so into flashing BIOSes for a higher PL% when these things smash into the voltage limit so easily.

Maybe for benching, but benchmarking is boring. Games also don't seem to scale too well with core; this generation seems better focused on VRAM tweaking.


----------



## J7SC

@Nizzen ...some more horsing around... Not subbed of course (though it would be #1 at HoF), but can you beat this 'lucky run'  in Speedway ? I'm still working on that 31 K Port Royal score of yours...

...there are some 'memory holes' in VRAM worth finding, but +1770 was a bit optimistic.









---

In case anyone has tried out the 'Game' section of Unigine Superposition, those three levers are worth playing with, especially if you turned the lights out before by the door


----------



## dk_mic

Gandyman said:


> I don't know whether this is good news or bad news for me...
> 
> View attachment 2589521
> 
> 
> With the fps cap set to 116 I'm dropping all over the place; my monitor counter shows dips as low as 106 in the 2 minutes I played to take this screenie.
> 
> Any idea what could be wrong? A 13th-gen CPU in a Z690 mobo? I'm so far out of ideas.
> 
> Z790 Hero boards are 1100 bucks down under... very expensive experiment -_-


I know you are only at 70% GPU there, but what happens if you set all graphics settings to low, and then the resolution to 720p?
Are you sure you don't have several frame limiters active? Do you get dips even with all monitoring software disabled?
I think it's clearly a software issue if it can hold higher framerates without problems and it's just the limiter making it stutter.
Can you actually see and feel the stutters in game, or is it just the counter that fluctuates?


----------



## GRABibus

MrTOOSHORT said:


> 4090 on air or water = same


This is why I keep it on air


----------



## chibi

While waiting for 4090 FE, I see the MSI Trio available. Is this card any good? I see it only has the 3 adapter plug and 450w bios. Can this be increased for max performance? Assume card will be water cooled eventually if a block is made available.
Otherwise skip and wait for FE?


----------



## rahkmae

> Yes it is running PCIe x16 4.0. Funny story with that: my first 4090 Strix was limited to x8 4.0. Sent it back to Asus; they confirmed the fault and sent me a replacement.

How did you determine it was limited to x8 4.0?


----------



## GAN77

Lots of talk about memory and strange behavior below 50 degrees. Some do not see the effects of temperature.
RTX 4090 uses memory chip D8BZC, which decodes to MT61K512M32KPA-*21 (21 Gbps)*
RTX 4080 uses memory chip D8BZF, which decodes to MT61K512M32KPA-*24 (24 Gbps)*

Some reviews suggest that a relabeled D8BZF may be used on the 4090.

Similar to what we saw with the Asus ROG Strix, the GDDR6X memory shows the expected "Micron DB8ZC" label, but at least on the sample we received, this appears to be down-binned 24 Gbps memory. Three of the 4090 cards we've tested top out at 23–23.6 Gbps on the memory overclock, but the Colorful and Asus cards were able to reach 25 Gbps with relative ease.
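To relate those Gbps figures to the Afterburner offsets people quote in this thread, here's a rough sketch. Both the ~10,501 MHz stock memory clock and the offset-applies-1:1 / effective-rate-is-twice-the-clock mapping are assumptions about how the tools report GDDR6X, not vendor-confirmed figures:

```python
# Rough mapping from an Afterburner memory offset (MHz) to an effective
# GDDR6X data rate on a 4090. Assumptions: Afterburner reports a stock
# memory clock of ~10501 MHz, the offset adds 1:1 to that clock, and the
# effective rate is twice the clock (PAM4 moves two bits per cycle).
STOCK_MEM_CLOCK_MHZ = 10501  # assumed Afterburner-reported stock clock

def effective_gbps(offset_mhz: float) -> float:
    return 2 * (STOCK_MEM_CLOCK_MHZ + offset_mhz) / 1000

print(round(effective_gbps(0), 1))     # ~21.0 Gbps stock
print(round(effective_gbps(1500), 1))  # ~24.0 Gbps at a +1500 offset
```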


----------



## alasdairvfr

akgis said:


> My average FPS is destroyed because the benchmark takes place in the city. The game is on an NVMe PCIe 4.0 SSD.
> 
> If you would be kind enough to test (if you have the game installed; the resolution doesn't matter): drive around the city a bit, try to go to the castle house, then work your way down. You should find places where FPS tanks but GPU/CPU usage drops. If you don't, it's still something on my PC I can't find.
> 
> I can't really play the game, because anything that goes through the city, free roam or tracks, can cause very bad input latency.












This is me benching it DLSS off but DLAA on. In the city scenes it dipped to the lower 100s so if you aren't getting around that in the benchmark with similar resolution and settings you might have an issue elsewhere. Without DLSS most people will be CPU bound at 4k, turning it on I'm usually hitting 144fps with minor dips here and there. I don't really have time to do a lot of driving around now but the benchmark mode at least is a good way to compare and see where you land. The stutter count I'm reading online sometimes goes through the roof when the game is installed on an SSD (counterintuitive) so that could be your culprit. There were some posts on this thread ~page 241-245 of ppl comparing different CPUs. Overclocking GPU for me was maybe 5% difference so you shouldn't be too far off.


----------



## tubs2x4

For CAD buyers, bestbuy.ca has the MSI 4090 Gaming X Trio available for back order on Jan 9, if people are still looking for a 4090.


----------



## AvengedRobix

doom26464 said:


> I just got my Zotac 4090 up and running and took it through a suite of games at stock speeds. On my 3440x1440 144Hz monitor this thing is overkill, smashing through the ceiling and then some. Compared to my 3080 it's a complete waste at that resolution, as I really only needed another 10-20% to max that monitor out.
> 
> On my 4K 120Hz OLED display, though, is where this thing shines. An unreal treat of a GPU; this is what I've been looking for for a while.
> 
> I haven't started overclocking yet, will get to that tomorrow, but looking at the performance gains I'm not expecting much. Also, no idea why everyone is so into flashing BIOSes for a higher PL% when these things smash into the voltage limit so easily.
> 
> Maybe for benching, but benchmarking is boring. Games also don't seem to scale too well with core; this generation seems better focused on VRAM tweaking.


On the Zotac I keep the original BIOS with an undervolt for gaming, and on the quiet BIOS I've flashed the Galax 666W BIOS.


----------



## chibi

tubs2x4 said:


> For CAD people bestbuy.ca has Msi 4090 gaming x trio for back order available on Jan 9 if people still looking for a 4090.


Where does it show the ETA of Jan 9th? I don't see that when I view it on the BB app.
Anyway, is this a good card? It seems power-limited to 450w with the three pigtails. Can I get the full 600w if I use the Corsair 12VHPWR cable and the HOF bios?


----------



## doom26464

AvengedRobix said:


> On the Zotac I keep the original BIOS with an undervolt for gaming, and on the quiet BIOS I've flashed the Galax 666W BIOS.


Do you even see a difference in games with the 600W+ bios?

I don't understand the point of all the huge power limits when this thing is smashing into VREL at sub-400 watts.

Maybe it helps benchmarking, but even then, if you're limited by voltage, what does it matter?

Though maybe it helps boost the core clock somewhat, from what I'm seeing reading through here. Though I see minimal in-game testing, just benchmarking on and on.


----------



## yzonker

doom26464 said:


> Do you even see a difference in games with the 600W+ bios?
> 
> I don't understand the point of all the huge power limits when this thing is smashing into VREL at sub-400 watts.
> 
> Maybe it helps benchmarking, but even then, if you're limited by voltage, what does it matter?
> 
> Though maybe it helps boost the core clock somewhat, from what I'm seeing reading through here. Though I see minimal in-game testing, just benchmarking on and on.


The Galax bios is a 1-2% gain in anything that responds to higher core clocks. It increases effective clocks. Really not worth it for games IMO unless you just want that last bit of performance.


----------



## alasdairvfr

doom26464 said:


> Do you even see a difference in games with the 600W+ bios?
> 
> I don't understand the point of all the huge power limits when this thing is smashing into VREL at sub-400 watts.
> 
> Maybe it helps benchmarking, but even then, if you're limited by voltage, what does it matter?
> 
> Though maybe it helps boost the core clock somewhat, from what I'm seeing reading through here. Though I see minimal in-game testing, just benchmarking on and on.


For benchmarking it certainly makes a significant difference: PR goes from like 26k to 28-29k for most people, so roughly 10 percent.

In my anecdotal experience, actual game performance gains are 3-5% on a heavy-but-stable OC. You can still run a decent OC with a slight undervolt and stay within 450w. For me personally it's worth overclocking and giving it the power envelope to do so, but between 450-500w the perf gains drop off a cliff, and 500-600w even more so. The last 150w would probably be worth 2-3%.

Where I live it's cold AF in the winter and I use a lot less electric heating by closing the door to my office and not running many of the baseboard heaters throughout my basement.


----------



## Jimbodiah

Anyone know how to save the gpu OC settings so it will use them after a reboot? Using Asus AI Suite 3 at the moment.


----------



## AvengedRobix

doom26464 said:


> Do you even see a difference in games with the 600W+ bios?
> 
> I don't understand the point of all the huge power limits when this thing is smashing into VREL at sub-400 watts.
> 
> Maybe it helps benchmarking, but even then, if you're limited by voltage, what does it matter?
> 
> Though maybe it helps boost the core clock somewhat, from what I'm seeing reading through here. Though I see minimal in-game testing, just benchmarking on and on.


In games I haven't tried the 600w. I use the stock BIOS at 80% power and 0.995V for 2900MHz. In benchmarks with the Galax BIOS, yes, you are voltage-limited on the core clock, but you can go higher with the VRAM; you have more wattage headroom for it. If you pick 1.10V, you easily hit 600w.


----------



## yzonker

Jimbodiah said:


> Anyone know how to save the gpu OC settings so it will use them after a reboot? Using Asus AI Suite 3 at the moment.


Afterburner does that. Haven't used the Asus software.


----------



## Betroz

Jimbodiah said:


> Anyone know how to save the gpu OC settings so it will use them after a reboot? Using Asus AI Suite 3 at the moment.


Don't use that Asus bloatware, as it hurts minimum fps in games. MSI Afterburner is better.


----------



## inedenimadam

How do the waterblocks stack up this generation? I see Bykski, Phanteks, and EK have made blocks for the MSI cards. Which one should I buy, and should I get the active backplate too?

Opinions please.


----------



## Jimbodiah

Thanks!


----------



## Nizzen

inedenimadam said:


> How do the waterblocks stack up this generation? I see Bykski, Phanteks, and EK have made blocks for the MSI cards. Which one should I buy, and should I get the active backplate too?
> 
> Opinions please.


Sad truth: waterblocks don't help much on the 4090. It's pretty much the same performance as air cooling; you gain very little, if anything at all.


----------



## J7SC

GAN77 said:


> Lots of talk about memory and strange behavior below 50 degrees. Some do not see the effects of temperature.
> RTX 4090 uses memory chip D8BZC, which decodes to MT61K512M32KPA-*21 (21 Gbps)*
> RTX 4080 uses memory chip D8BZF, which decodes to MT61K512M32KPA-*24 (24 Gbps)*
> 
> Some reviews suggest that a relabeled D8BZF may be used on the 4090.
> 
> Similar to what we saw with the Asus ROG Strix, the GDDR6X memory shows the expected "Micron DB8ZC" label, but at least on the sample we received, this appears to be down-binned 24 Gbps memory. Three of the 4090 cards we've tested top out at 23–23.6 Gbps on the memory overclock, but the Colorful and Asus cards were able to reach 25 Gbps with relative ease.


....it would be nice to find some *up*-binned 24 Gbps GDDR6X and mount it on a 4090 with slow VRAM (not that difficult for a professional like Krisfix.de).

While a lot more complicated than switching VRAM chips, those who wish for a 4090 Ti *now* might also be tempted to get the core chip from an RTX 6000 Ada (18,176 CUDA cores) and 'drop it in' along with some relevant VRM strengthening, as it is pin-compatible... mind you, where to get such a chip on its own, and at what cost, is one issue; the other would be a custom vbios...



Jimbodiah said:


> Anyone know how to save the gpu OC settings so it will use them after a reboot? Using Asus AI Suite 3 at the moment.


...there is an option in GPU TweakIt (available as a stand alone) to lock settings in on boot-up. While MSI AB is generally superior, there used to be 'some special' versions of GPU TweakIt that also allowed for additional adjustments on Asus cards, such as VRAM voltage....but not sure re. this 4090 gen.



inedenimadam said:


> How do the waterblocks stack up this generation? I see Bykski, Phanteks, and EK have made blocks for the MSI cards. Which one should I buy, and should I get the active backplate too?
> 
> Opinions please.


There are some nice and pricey water-blocks out there, but at US$122 or so with backplate, the Bykski was hard to beat for my Giga-G-OC. So far, it has been performing flawlessly. This is the second Bykski block I have; the other one sits on a 6900XT work machine and has been performing great for 1.5+ years now... pristine-looking micro-fins and all the other parts you can see through the acrylic.

Edit: regarding whether water-blocks are needed for a 4090 or not... leaving aside some performance gains where the core plays a role, a blocked card is a.) substantially quieter than an air-cooled card and especially b.) fits much better into build projects once stripped of its giant air-cooler (re. length and depth)... so it makes sense for those who have an open loop anyway.


----------



## inedenimadam

Nizzen said:


> Sad truth: waterblocks don't help much on the 4090. It's pretty much the same performance as air cooling; you gain very little, if anything at all.


I'm aware; I've paid enough attention here to see that. I still wouldn't mind smaller and quieter. If I happen to get a few extra MHz, great; if not, I'm OK with just smaller and quieter. Running an external Mora 3.


----------



## simonabamber

What’s everyone’s everyday overclock on this card? I’ve run it pushed to the max, but that was a bunch of extra heat for not many FPS. Where’s the sweet spot?


----------



## SilenMar

doom26464 said:


> Do you even see a difference in games with the 600W+ bios?
> 
> I don't understand the point of all the huge power limits when this thing is smashing into VREL at sub-400 watts.
> 
> Maybe it helps benchmarking, but even then, if you're limited by voltage, what does it matter?
> 
> Though maybe it helps boost the core clock somewhat, from what I'm seeing reading through here. Though I see minimal in-game testing, just benchmarking on and on.


You can see the difference in certain games when it is just limited by power. 









The bigger point is that a card loaded at 600W, vapor-chamber-cooled at 60C everywhere with 0dB fans, simply holds more value than a card loaded at 500W with turbo fans at 80C.

The design and component quality for 600W are better than for 500W. After the next-gen GPUs come out, the old cards with higher value can be sold at a higher price as well.


----------



## Aurosonic

Got my Aorus Extreme Waterforce today. Mediocre chip and memory; it can pass Port Royal at 3030/+1500 with a 28400 score. Below is a bugged run on the Galax bios:


----------



## tubs2x4

chibi said:


> where does it show the eta of Jan 9th? I don’t see that when I view it on the BB app.
> Anyways, is this a good card? Seems power limited to 450w with the three pigtails. Can I get the full 600w if I use the Corsair 12vhpwr cable and hof bios?


I put an order in on it last night, and when I looked this morning it still showed as available for back order. But looking now, it's out of stock again.


----------



## chibi

tubs2x4 said:


> I put an order in last night on it and I looked this am and still showed available for back order. But looking now its out of stock again.


I see that now, I had the card in my cart since yesterday evening but couldn't commit. Going to hold out and try to get an FE card.


----------



## yzonker

inedenimadam said:


> am aware. paid enough attention here to see that. I still wouldn't mind smaller and quieter. If I happen to get a few extra mhz...great, if not, I am ok with just smaller and quieter. running an external mora 3.


You still get 1 bin (15mhz) per 5-6C on the core, it just doesn't add as much performance as it did in the past. For example, dropping my temps 10-15C with my chiller will still gain 100-200pts in PR. 

Someone reported that it worked well to leave the backing on one side of the pads to increase VRAM temps to keep from losing VRAM OC. I would do that if I were re-mounting. Or some form of partially compromised cooling for the mem to elevate temps closer to what you get with an air cooler.

But my advice is to get the cheap Bykski as @J7SC recommended. It seems to have as good or better delta than the others except Optimus which I think is a bit better but $$$$$. I have the Bykski on my TUF.
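The one-bin-per-few-degrees rule of thumb above works out to something like the following; the 5-6C-per-bin figure is yzonker's estimate from observation, not a spec:

```python
# Rough estimate of boost-clock gain from a temperature drop, using the
# "one 15 MHz bin per ~5-6 C" rule of thumb quoted above.
def clock_gain_mhz(temp_drop_c: float, c_per_bin: float = 5.5,
                   bin_mhz: int = 15) -> int:
    # Whole bins only: the boost algorithm steps in discrete 15 MHz bins.
    return int(temp_drop_c // c_per_bin) * bin_mhz

print(clock_gain_mhz(12))  # a ~12 C drop -> 2 bins -> ~30 MHz
```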


----------



## tubs2x4

chibi said:


> I see that now, I had the card in my cart since yesterday evening but couldn't commit. Going to hold out and try to get an FE card.


Well, Jan 9 is a ways away yet. If it hasn’t shipped you can always cancel the order. It’s not like they aren’t going to be able to sell that card anyway.


----------



## inedenimadam

simonabamber said:


> What’s everyone’s everyday overclock on this card? I’ve run it pushed to the max but it was a bunch extra heat for not many FPS, where’s the sweet spot?


3000 at 1.05, +1500 VRAM. I should probably dial it back, but that 3000 feels so right.



yzonker said:


> But my advice is to get the cheap Bykski as @J7SC recommended.


You guys got a retailer preference in the US for Bykski? I only see the full-coverage + active backplate block on the Bykski website. Amazon has just the normal block... but it's $179.


----------



## Aurosonic

inedenimadam said:


> 3000 at 1.05, +1500 VRAM. I should probably dial it back, but that 3000 feels so right.


Try to pass 5 runs of Metro Exodus with those settings. If it passes, you're good to go in almost every game.


----------



## J7SC

yzonker said:


> You still get 1 bin (15mhz) per 5-6C on the core, it just doesn't add as much performance as it did in the past. For example, dropping my temps 10-15C with my chiller will still gain 100-200pts in PR.
> 
> Someone reported that it worked well to leave the backing on one side of the pads to increase VRAM temps to keep from losing VRAM OC. I would do that if I were re-mounting. Or some form of partially compromised cooling for the mem to elevate temps closer to what you get with an air cooler.
> 
> But my advice is to get the cheap Bykski as @J7SC recommended. It seems to have as good or better delta than the others except Optimus which I think is a bit better but $$$$$. I have the Bykski on my TUF.


...apart from the (smallish) core MHz gain as well as size, weight and noise, there are also cards which benefit if there was a factory mounting issue, like mine. On air, the core would always go very high (over 3.2GHz) but with Hotspot temps well past the 90s... that in turn seemed to affect the GPU's internal IMC. Since water-blocking, my VRAM has gained as much as 120 MHz, ECC included, depending on VRAM temps, which are now almost too cold given the extra heatsink, thermal putty on the VRAM etc. But Hotspot rarely even reaches 60C now.

@inedenimadam ...I got my Bykski block at FormulaMod


----------



## kryptonfly

GRABibus said:


> Please, beat me in timespy graphics score French colleague…I am on air…😜
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3DMark Time Spy Graphics Score Hall of Fame
> 
> 
> The 3DMark.com Overclocking Hall of Fame is the official home of 3DMark world record scores.
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> enable rebar and there will be 2 French guys in top 10 👍


Your wish is my command 
REbar is forced; I lose around 1000 pts on the CPU. Sometimes the voltage falls to 1.095 V or less and it crashes, depending on the temp, so I need to warm the card up a little before it will pass. I can't see more than 550 W on the GPU for now; that's where watercooling is useful.

I scored 39 227 in Time Spy


















I've just seen I'm now 7th in graphics score


----------



## GRABibus

kryptonfly said:


> Your wish is my command
> REbar is forced, I lose around 1000pts on the cpu. Sometimes the voltage falls at 1.095v or less and it crashes, depends of the temp, I need to warm it up a little before to pass it. I can't see more than 550W on the gpu for now, it's where watercooling is useful.
> 
> I scored 39 227 in Time Spy
> 
> View attachment 2589670
> 
> 
> View attachment 2589671
> 
> 
> I've just seen I'm now 7th in graphics score


The Giga OCs we received in France are well binned 😜


----------



## yzonker

Still only one French person in the top 10. LOL


----------



## jootn2kx

I'd advise being careful with memory overclocks above +1500 MHz in Afterburner.
Not sure what triggered it, but mine was at +1700, stable in all games and benchmarks for 2 months without a single crash.
Then suddenly, while not gaming - just scrolling Facebook and browsing - I saw weird artifacts across the whole screen and the PC crashed and rebooted.
Given the type of artifacts, the screen was going pretty crazy, and I immediately suspected the VRAM.

So I set my memory clock back to +1500; maybe my previous profile wasn't 100% stable after all.
The weird thing is I never had issues in games or benchmarks, only while browsing in Windows, totally at random.
I also completely removed the drivers with DDU just to be sure.
No issues since, so let's hope for the best  I'm not risking anything above +1500 anymore.

I'm still on the Galax 666 BIOS - maybe it's doing something to the VRAM after all?
I'm trying to replicate the issue and will hammer the card with several benchmarks over the next few days.
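For anyone hunting their own ceiling, a halving search over the offset range gets there in far fewer long stress runs than stepping +100 at a time. A minimal sketch; `is_stable` here is a placeholder for whatever stress loop you trust (hours of games, benchmarks), not a real API:

```python
def max_stable_offset(is_stable, lo=0, hi=2000, step=50):
    """Bisect for the largest VRAM offset (MHz) that still passes
    is_stable(). `is_stable` is a stand-in for whatever stress loop
    you trust -- not a real API."""
    best = lo
    while lo <= hi:
        mid = (lo + hi) // (2 * step) * step  # midpoint snapped to the step grid
        if is_stable(mid):
            best = mid
            lo = mid + step
        else:
            hi = mid - step
    return best
```

Over a 0-2000 range in 50 MHz steps this needs about six test runs instead of dozens.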


----------



## kryptonfly

GRABibus said:


> The Gigas OC we received in France are well Binned 😜


Mine came from Spain (PcComponentes) at €2035; I could have done worse, apparently. I pre-ordered a Suprim Liquid X on 12 October and waited until 3 December, and when I saw this Giga I jumped on it. The Suprim became available on 8 December, just 5 days later... if I had waited 5 more days I would have a more expensive Suprim Liquid that's probably a little worse at VRAM OC. So I was lucky, let's say - this Giga was made for me


----------



## sym30l1c

I got my 4090 FE a few days ago and finally got to install it and test it yesterday. Really a beast, although I feel like I've lost the silicon lottery: stock boost is "only" 2685MHz (very rarely it gets to 2700MHz). I tried a quick and sloppy overclock and at least I got around 2950MHz fairly easily.

I know...a real first world problem...

Plan is to watercool it, so hopefully I can squeeze a little more performance out of it.


----------



## HeLeX63

Hey guys. Just purchased an MSI RTX 4090 Suprim X along with an Alphacool full-cover waterblock (coming from a 6900 XT Red Devil); they'll arrive in a week or so. What are some typical core and memory overclocks?

What's considered below-average or above-average silicon quality in terms of overclocking? Will I notice any significant boost-clock increase going from air to water?

Thanks


----------



## kryptonfly

yzonker said:


> Still only one French person in the top 10. LOL


3DMark Time Spy Graphics Score Hall of Fame
The first is bugged, I think, with 3060 MHz on the GPU; I have doubts about the 3rd, maybe the 6th too. The others are legit. And now on HWBot they cancelled my PR and SW submissions because we need to enable ECC now, even for SW...


----------



## chibi

MSI Suprim 4090 available at Bestbuy Canada for backorder. A few Aorus models at Canada Computers for us Canadian folk.


----------



## coelacanth

Gandyman said:


> Thanks Ill give it a go. frame cap with nvcp or rtss?


NVCP


----------



## GAN77

Has anyone seen reviews or PCB shots of the MSI RTX 4090 VENTUS 3X?


----------



## yzonker

kryptonfly said:


> 3DMark Time Spy Graphics Score Hall of Fame
> The first is bugged I think with 3060mhz on gpu, I have a doubt about the 3rd, maybe the 6th too. The others are legit. And now on HWBot they cancelled my submits of PR and SW because we need to enable ECC now, even on SW...


Oh I know, but I just had to chuckle a bit that you bumped Grabibus out.


----------



## Sheyster

GRABibus said:


> This is why I keep it on air


Same here and it's staying that way. Will be an easy upgrade to 4090Ti..


----------



## GRABibus

yzonker said:


> Oh I know, but I just had to chuckle a bit that you bumped Grabibus out.


You hate me, right ?


----------



## kryptonfly

Sheyster said:


> Same here and it's staying that way. Will be an easy upgrade to 4090Ti..


With 53°C at 250 W, one PCIe slot obstructed, the PCH at 74°C (56°C usually), and my SN850X NVMe at 44°C while still unallocated (no reads or writes at all), I had no choice but to watercool it. Now I can use my PCIe sound card, the PCH is at ~56°C, and the SN850X is at 34°C.


----------



## gfunkernaught

Could someone here with a 4k TV/Monitor run a quick test? I'm curious to see performance difference in Quake 2 RTX without resolution scaling at 4k. Also, enable DSR 4x in the NVCP and try it again this time at 8k.


----------



## Nico67

yzonker said:


> I don't understand their ratings either, but if you look at the rated current draw of each unit, it scales directly with the BTU rating rather than the HP rating. 1/2 HP unit only draws a little more current than the 1/4 HP unit. 1 HP is much higher.
> 
> Also, at one time I went to the trouble of back calculating the wattage using the temperature vs time graphs in the manual and it works out pretty close.
> 
> And yes that rating is for a temp at or above ambient.


Yeah, I did have a look around and found some specs where Hailea talks about the rating being temperature pull-down: they assume a 30°C ambient and a 20-hour pull-down from 28°C to 18°C at a rated volume. The 1/4 HP can do 300 L and the 1/2 HP 500 L. Again, it's a bit hard to say what's useful info, but I think this relates more to BTU/hr. A watercooling scenario is quite different: yes, you have an ambient temp and a pull-down, but an aquarium isn't directly dumping 600-700 W of heat into the tank.
I also think you're right that the rated watts or amps are how much power the unit draws, but that's usually comparable to how much heat it can remove.

The reason I use one is that I live in QLD, Australia - it's currently summer with 27-35°C+ days. Even with aircon I'm getting 76°C on my Strix. Hopefully I'll get a delivery soon so I can lap this EK block; it has a pretty average GPU pad finish.
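For sizing a chiller against a watercooling load rather than an aquarium, the BTU/hr rating converts straight to watts. A quick sketch; the 600-700 W figures are just the loop heat mentioned above:

```python
BTU_HR_PER_WATT = 3.412142  # 1 W of continuous heat = ~3.41 BTU/hr

def watts_to_btu_hr(watts):
    """Chiller BTU/hr rating needed just to hold the loop at ambient
    under a continuous heat load; pull-down headroom comes on top."""
    return watts * BTU_HR_PER_WATT

# A loop dumping 600-700 W of GPU/CPU heat into the water:
for load in (600, 700):
    print(f"{load} W -> {watts_to_btu_hr(load):.0f} BTU/hr")
```

So a 600-700 W loop needs roughly 2000-2400 BTU/hr of continuous capacity before any pull-down headroom, which is why the draw-current rating tracks the BTU figure.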


----------



## tubs2x4

chibi said:


> MSI Suprim 4090 available at Bestbuy Canada for backorder. A few Aorus models at Canada Computers for us Canadian folk.


I see the 4080 reference model is available now as well.


----------



## 8472

Best Buy dropped a lot of cards today, mainly FEs and Gigabytes. Hopefully Asus and MSI will have drops soon as well.


----------



## mirkendargen

inedenimadam said:


> you guys got a retailer preference in the US for byski? I only see a full coverage+ active backplate on the byski website. Amazon has just the normal block...but its $179


Aliexpress vendor of your choice.


----------



## J7SC

Sheyster said:


> Same here and it's staying that way. Will be an easy upgrade to 4090Ti..


Thinking ahead already 😂 ? I am sure 4090 Ti will be tempting, but the jump from previous gens to 4090 will be a very hard act to follow. To wit, _two_ w-cooled 2080 Tis in NVLink got me to about 21k in Port Royal (spoiler) while my single 3090 Strix manages about 16k with the latest drivers...all the while, my 4090 gets me not that far from 30k (discounting my >30k 'lucky' runs).

As to water-cooling, I love it as I have sensitive hearing and am surrounded by computers...


Spoiler


----------



## yzonker

J7SC said:


> Thinking ahead already 😂 ? I am sure 4090 Ti will be tempting, but the jump from previous gens to 4090 will be a very hard act to follow. To wit, _two_ w-cooled 2080 Tis in NVLink got me to about 21k in Port Royal (spoiler) while my single 3090 Strix manages about 16k with the latest drivers...all the while, my 4090 gets me not that far from 30k (discounting my >30k 'lucky' runs).
> 
> As to water-cooling, I love it as I have sensitive hearing and am surrounded by computers...
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2589739


The only way the 4090 Ti is going to be interesting is if Nvidia either finds significantly faster VRAM (beyond 24 Gbps) or increases the voltage limit. I suspect neither will happen. Otherwise it'll just be a marginal jump in performance. Maybe 10%.


----------



## Sheyster

J7SC said:


> Thinking ahead already 😂 ? I am sure 4090 Ti will be tempting, but the jump from previous gens to 4090 will be a very hard act to follow.


Indeed.. I'm actually considering a rebuild sooner than later. 

Z790 Apex
13900KS
8000 DDR5
4090Ti


----------



## J7SC

Sheyster said:


> Indeed.. I'm actually considering a rebuild sooner than later.
> 
> Z790 Apex
> 13900KS
> 8000 DDR5
> 4090Ti


Z790 Apex/13900KS is tempting, but it's a bit late for me to get into LGA1700. CES will (hopefully) show what's up with AM5 X3D...else I'll have a look at the upcoming Intel and AMD new-gen HEDT.


----------



## mouacyk

yzonker said:


> The only way the 4090 Ti is going to be interesting is if Nvidia either finds significantly faster VRAM (beyond 24Gbps) and increases the voltage limit. I suspect neither will happen. Otherwise it'll just be a marginal jump in performance. Maybe 10%.


10% for 2x cost, I'll take it


----------



## doom26464

Seem to have the core dialed in after playing with it all night. Zotac Trinity OC:

PL: +110%
Voltage slider: +100
Core: +230

Boosts up to about 3030-3045 max, tested through multiple games. Anything beyond that and I get crashing. I can bench at around 3070 MHz, but it's not stable for games and I'm not worried about stable benchmarking.

That said, for a near-300 MHz jump over out-of-the-box clocks, the gain in the Cyberpunk benchmark was like 1.5 fps lol. Weird why scaling with core clock is so bad.

I'll play around with VRAM tomorrow or the day after, once I've run a few more hours of games to make sure the core is 100% stable.


----------



## lmfodor

dk_mic said:


> I am not sure what is going on with your fans, I would expect they run at the same %, varying a bit in RPM. Don't think the GB-G-OC has two separate fan curves. As long as they spin similarily you should be fine. You could also set fan curves in afterburner or with the program "fancontrol" manually and see how the fans react.
> 
> You will be fine with the adapter and that bend. Just make sure it sits properly and is fully inserted.
> 
> Using curve works great, just make sure you are stable at the given voltage/frequency. This point of stability can be different from load to load (i.e. might be fine in regular raster but crash in RTX).
> I am currently using 0.875 @ 2535 MHz. This leaves performance at the table, but still pushes more than enough frames for my games right now.
> 
> Galax BIOS has the best benchmark scores, but you won't notice much in everyday gaming.


Hi @dk_mic! Thanks for your reply.

As for the fans spinning at different speeds, I haven't seen this behavior on any GPU I've owned, nor in any forum posts. It really doesn't worry me too much because the difference is about 5%. I should try booting in Safe Mode to see whether it's a software issue or really the fan. I wouldn't want to exchange the GPU just because of this; if a fan fails, I guess it would be easier to replace it than to RMA the card - at least in my country, where we only have importers. In the US the whole process is very easy.

Regarding the undervolt: on my old 3090 I had achieved a good curve, but for the 4090 I saw der8auer and several others talking about directly limiting the power limit to 80% or 70%, although I notice the frequency drops a lot that way. I thought the 4090 worked well at around 0.9 V and above to hold factory frequencies (~2500 MHz). I'm going to try your configuration. Right now I play MW2 at 1440p, for example, but soon I'll move to a 4K monitor to make better use of the card.

Thanks again!
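A rough way to see why a tuned V/F curve beats a blunt power-limit cut: dynamic power scales roughly with f·V², so dropping voltage buys far more than dropping frequency. A sketch; the 2750 MHz / 1.05 V reference point is an assumption for a stock-ish card, not a measurement:

```python
def relative_power(f_mhz, volts, f_ref=2750.0, v_ref=1.05):
    """First-order CMOS dynamic-power estimate, P ~ f * V^2, normalized
    to an assumed stock-ish 2750 MHz / 1.05 V point. Leakage and VRAM
    power are ignored, so treat the result as a rough planning number."""
    return (f_mhz / f_ref) * (volts / v_ref) ** 2

# dk_mic's 0.875 V @ 2535 MHz curve point vs. the assumed stock point:
print(f"{relative_power(2535, 0.875):.0%}")  # prints 64%
```

Roughly a third of the power gone for an 8% clock cut, which matches why these undervolts feel "free" in games.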


----------



## man from atlantis

man from atlantis said:


> So I just ordered a Palit GameRock OC, waiting the mailman hit the door


So I grabbed my card yesterday. My very brief experience so far:
- Very easy to fit the 16-pin adapter. I heard a very clear click without applying any excessive force, and it fit like a glove. I jiggled it just to confirm and it wouldn't come out. I now wish I hadn't ordered the CableMod adapter.
- No coil whine whatsoever.
- The card doesn't seem to put much load on the PCIe slot - 20 W max so far at a 450 W power limit.
- The 16-pin adapter voltage drops from 12.20 V to 12.00 V at 450 W on a circa-2012 Seasonic SS-1000XP PSU. No OCP trips; I didn't have any OCP problems with a 440 W RTX 3080 either, FYI.



Spoiler
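Those droop numbers imply a perfectly healthy cable: Ohm's law on the figures in the post gives the effective resistance and the heat dissipated in the adapter. A back-of-the-envelope sketch, treating the 12 V feed as one lumped conductor (a simplification):

```python
# Voltage droop on the 16-pin adapter from the post above:
# 12.20 V unloaded -> 12.00 V at 450 W.
P, V_loaded, droop = 450.0, 12.00, 0.20
I = P / V_loaded    # total current on the 12 V rail
R = droop / I       # effective cable + connector series resistance
loss = I ** 2 * R   # power burned in cable and contacts
print(f"I = {I:.1f} A, R = {R * 1000:.1f} mOhm, loss = {loss:.1f} W")
```

That works out to ~37.5 A through ~5.3 mΩ, i.e. about 7.5 W of heat spread over the whole cable and connector - nothing alarming at 450 W.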


----------



## elbramso

yzonker said:


> The only way the 4090 Ti is going to be interesting is if Nvidia either finds significantly faster VRAM (beyond 24Gbps) and increases the voltage limit. I suspect neither will happen. Otherwise it'll just be a marginal jump in performance. Maybe 10%.


Like it was with the 3090 Ti - really hard to justify. Besides, if the 4090 Ti drops, it'll only be a few months before the 5090 is up.

Btw, I doubt more voltage will fix the VRAM issue - I think voltage is causing it... all the VRAM performs worse when cold, which may be down to an overcurrent protection.


----------



## elbramso

kryptonfly said:


> Your wish is my command
> REbar is forced, I lose around 1000pts on the cpu. Sometimes the voltage falls at 1.095v or less and it crashes, depends of the temp, I need to warm it up a little before to pass it. I can't see more than 550W on the gpu for now, it's where watercooling is useful.
> 
> I scored 39 227 in Time Spy
> 
> View attachment 2589670
> 
> 
> View attachment 2589671
> 
> 
> I've just seen I'm now 7th in graphics score


Almost impossible to beat these french Giga cards^^
I'm happy you don't believe in real cpu's 🤣








I scored 39 720 in Time Spy


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## 681933

I'll join this club soon, getting a MSI Suprim X 4090 after being super disappointed with the RX 7000 generation.

One thing I want to ask, do Seasonic Prime units (I have a 750W Prime PX one) still have intermittent shutdown due to 12V sensing issues? Heard it was an issue on 3080s and 3090s, not sure if it still applies to the 4090.


----------



## Blameless

Several RTX 4090 models came in stock around the same time yesterday afternoon on Best Buy, tried buying anything that was 1800 bucks or less. After failing to get any of my preferred models, I did manage to purchase a Gigabyte Windfarce.

I am aware that this card has what can be considered the second worst PCB of all RTX 4090s. I am aware of Gigabyte's reputation with regard to QC and support. I won't even be able to fit this card in any of the cases I own; had to buy a new one and will have to rebuild one of my systems around this card.

I'm still excited. A slightly suboptimal card is considerably better than no card at all. Having only fourteen vcore phases with budget PWM controllers and 50A power stages doesn't really bother me much; 700 A at 125°C is still overkill for my intended use case (gaming, on the stock cooler). I've also had recent success with a Gigabyte RMA, as archaic as their system is...not that that will stop me from voiding the warranty on this card as soon as I make sure it's not DOA.

Planning on testing it when I can pick it up (probably not till after Christmas), then stripping it down, replacing all the TIM (I still have nearly a kilogram of TG-PP-10), and flashing it to the GALAX BIOS. I won't use the full power limit; I'm aiming for the most performance I can squeeze out of ~450 W. The Gigabyte cards also have all four memory VRM phases intact, and seem to be doing pretty well with GDDR6X OCing.

Anyway, I'll let you guys know how things turn out.


----------



## Roacoe717

I encountered an error when trying to flash my Liquid X again. The only difference is that I applied NVIDIA's firmware update a few days ago to prevent a black screen during boot. Has anyone else run into this? I haven't tried other BIOSes yet, just the Galax.


----------



## yzonker

Speaking of, 









NVIDIA Could Give TITAN RTX Another Swing as Maxed-Out AD102 in an Unabashed 4-slot Monstrosity


A report by Moore's Law is Dead claims that NVIDIA is preparing to launch a new TITAN RTX halo product, based on a maxed-out 4 nm "AD102" silicon. Where does this put the RTX 4090 Ti? Somewhere in between the RTX 4090 and the TITAN RTX Ada, as NVIDIA gave itself plenty of segmentation headroom...




www.techpowerup.com


----------



## sym30l1c

4090 FE here: I tried running the Afterburner scanner and got this:

13:09:56 Connected to MSI Afterburner control interface v2.3
13:10:17 GPU1 : VEN_10DE&DEV_2684&SUBSYS_165B10DE&REV_A1&BUS_9&DEV_0&FN_0
13:10:17 Start scanning, please wait a few minutes
13:48:15 Scan succeeded, average core overclock is 165MHz, memory overclock is 200MHz
13:48:15 Dominant limiter
13:48:15 Voltage
13:48:15 Results are considered unstable
13:48:16 Overclocked curve exported to MSI Afterburner

Is this reliable? I have already tested the card in several games at +220 core and +1000 memory without any issues.
I've always used Precision X1, so I'm not sure whether to trust Afterburner or whether there's a problem with the card.


----------



## MikeS3000

sym30l1c said:


> 4090 FE here: I tried running the Afterburner scanner and got this:
> 
> 13:09:56 Connected to MSI Afterburner control interface v2.3
> 13:10:17 GPU1 : VEN_10DE&DEV_2684&SUBSYS_165B10DE&REV_A1&BUS_9&DEV_0&FN_0
> 13:10:17 Start scanning, please wait a few minutes
> 13:48:15 Scan succeeded, average core overclock is 165MHz, memory overclock is 200MHz
> 13:48:15 Dominant limiter
> 13:48:15 Voltage
> 13:48:15 Results are considered unstable
> 13:48:16 Overclocked curve exported to MSI Afterburner
> 
> Is this reliable? I have a already tested the card in several games with +220 core and +1000 memory without any issues.
> I've always used Precison X1, so I'm not sure whether to trust Afterburner or whether there is a probblem with the card.


In my opinion Afterburner's Auto OC works fine for finding a guaranteed-stable overclock. It looks like it backs off at least 30-45 MHz from the absolute max to ensure stability. I would do your own memory overclock, though, as +200 looks like the max it's ever willing to try; I run +1600 with no problem and it told me +200.


----------



## sym30l1c

MikeS3000 said:


> In my opinion Afterburner Auto OC works fine to find a guaranteed stable overclock. It looks like it bumps down at least 30 to 45 mhz from the absolute max to assure stability. I would do you own memory overclock as it looks like +200 is the max it is ever willing to go. I run +1600 no problem and it told me +200.


Thanks for the reply! What surprised me is that it says results are unstable after its own scanning.
I'll try more manual overclocking and more tests.


----------



## MikeS3000

sym30l1c said:


> Thanks for the reply! What surprised me is that it says results are unstable after its own scanning.
> I'll try more manual overclocking and more tests.


I think it's a bug that it says it's unstable, because mine said that too.


----------



## sym30l1c

MikeS3000 said:


> I think that it's a bug that is says it is unstable because mine said that too.


Got it, thanks!


----------



## doom26464

What is bottlenecking the core on these cards so much? The more tests I run, the more it seems like higher core clocks have a negligible effect.

Second thing: I'm ready to start playing with VRAM overclocks now. From my reading, VRAM actually gives the biggest performance gains in games anyway (guessing this is my bottleneck and the answer to my first question).

What are people using to test VRAM stability without error correction silently eating the gains? I see results everywhere from +1000 to +2000 and everything in between. The typical average seems to be +1400 to +1700.


----------



## man from atlantis

doom26464 said:


> What is bottleneck the core on these cards so much?? More and more test I run it seems like higher core clocks has meaningless effect.
> 
> Second thing I'm ready to start playing with vram overclocks now. Seems from reading that vram actually has the biggest performance gains in games anyways(guessing this is my bottleneck and answer to My first question)
> 
> What are people using to test vram stablity without error correction negative results? I see results from people everything from +1000 to +2000 and everything in between. Seems like the more normal average is +1400 to +1700.


I think increasing the VRAM clocks until 3dmark score starts decreasing is good enough.
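That heuristic is easy to mechanize: run the benchmark at increasing offsets and keep the offset with the best score. A toy sketch with plausible (offset, score) pairs; the last data point is invented to illustrate the roll-off once error correction kicks in:

```python
def best_offset(results):
    """Given (offset_mhz, score) pairs from successive benchmark runs,
    return the offset with the best score -- past it, error correction
    starts eating the gains even though nothing crashes."""
    return max(results, key=lambda r: r[1])[0]

# Plausible Port Royal runs; the 2200 point is invented for illustration:
runs = [(1000, 25562), (1400, 25767), (1700, 25896),
        (2000, 25941), (2200, 25800)]
print(best_offset(runs))  # prints 2000
```

Run each offset two or three times and average before trusting the comparison - Port Royal has enough run-to-run variance to mask a 0.2% step.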


----------



## Lmah2x

doom26464 said:


> What is bottleneck the core on these cards so much?? More and more test I run it seems like higher core clocks has meaningless effect.
> 
> Second thing I'm ready to start playing with vram overclocks now. Seems from reading that vram actually has the biggest performance gains in games anyways(guessing this is my bottleneck and answer to My first question)
> 
> What are people using to test vram stablity without error correction negative results? I see results from people everything from +1000 to +2000 and everything in between. Seems like the more normal average is +1400 to +1700.


Core clocks past 2500 MHz honestly don't add much performance. It's maybe a 3% gain from 2500 MHz to 3000 MHz at the default power target. You see slightly more at a 133% power target, maybe 4-6%. Can't say it's worth 600 W for daily use; it's only really beneficial for benchmarking.

I only use the 133% power target on my undervolts.
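Put as perf-per-watt, the trade looks even worse. A sketch treating the ~5% at 133% PT above as a ballpark; the 450 W and 600 W figures are assumptions for the stock and raised limits, not measurements:

```python
# Rough perf-per-watt check: ~+5% performance for a 133% power target.
base_perf, base_power = 1.00, 450.0  # assumed stock limit
oc_perf, oc_power = 1.05, 600.0      # assumed 133% limit, +5% perf
eff_ratio = (oc_perf / oc_power) / (base_perf / base_power)
print(f"efficiency change: {eff_ratio - 1:+.0%}")  # prints -21%
```

Roughly a fifth of the efficiency gone for a few percent of frames, which is why most people keep the 133% target for benchmarks only.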


----------



## 681933

Crylune said:


> I'll join this club soon, getting a MSI Suprim X 4090 after being super disappointed with the RX 7000 generation.
> 
> One thing I want to ask, do Seasonic Prime units (I have a 750W Prime PX one) still have intermittent shutdown due to 12V sensing issues? Heard it was an issue on 3080s and 3090s, not sure if it still applies to the 4090.


Does... Nobody know?


----------



## chibi

How old is your PSU? Just submit an rma claim to get a new one. State you’re hitting ocp due to newer gen gpu.


----------



## J7SC

yzonker said:


> Speaking of,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA Could Give TITAN RTX Another Swing as Maxed-Out AD102 in an Unabashed 4-slot Monstrosity
> 
> 
> A report by Moore's Law is Dead claims that NVIDIA is preparing to launch a new TITAN RTX halo product, based on a maxed-out 4 nm "AD102" silicon. Where does this put the RTX 4090 Ti? Somewhere in between the RTX 4090 and the TITAN RTX Ada, as NVIDIA gave itself plenty of segmentation headroom...
> 
> 
> 
> 
> www.techpowerup.com


...marketed only as NVidia FE and not AIC partners ? For $2,500 + ? a.) Nuts... b.) Shades of EVGA



Blameless said:


> Several RTX 4090 models came in stock around the same time yesterday afternoon on Best Buy, tried buying anything that was 1800 bucks or less. After failing to get any of my preferred models, I did manage to purchase a Gigabyte Windfarce.
> 
> I am aware that this card has what can be considered the second worst PCB of all RTX 4090s. I am aware of Gigabyte's reputation with regard to QC and support. I won't even be able to fit this card in any of the cases I own; had to buy a new one and will have to rebuild one of my systems around this card.
> 
> I'm still excited. A slightly suboptimal card is considerably better than no card at all. Having only fourteen vcore phases with budget PWM controllers and 50A power stages doesn't really bother me that much; 700A at 125C is still overkill for my intended use case (gaming, on the stock cooler). I've also had success recently with a Gigabyte RMA, as archaic there system is...not that I'm not going to void the warranty on this card as soon as I make sure it's not DOA.
> 
> Planning on testing it when I can pick it up (probably not till after Christmas), then stripping it down, replacing all the TIM (I still have nearly a kilogram of TG-PP-10), and flashing it to the GALAX BIOS. Won't use the full power limit, but I'm aiming for the most performance I squeeze out of ~450w. The Gigabyte cards also have all four memory VRM phases intact, and seem to be doing pretty well with GDDR6X OCing.
> 
> Anyway, I'll let you guys know how things turn out.


Congrats !
As has been established here plenty of times, the AD102 4090s are already pretty much maxed out of the box. This won't stop folks like myself from trying to squeeze more out of them (hopefully just for fun), but baseline gaming performance is already off the charts at stock, with little to be gained via OC. I got the Giga-G-OC (US$1,619 at the time) back in mid-October as it was the only card available locally (thanks again @mirkendargen ) and even online. The fact that it is a great clocker is nice but secondary - I wasn't ready yet to update my gamer system from a highly-tuned 5950X, and in apps like FS2020 and CP 2077 (where I spend most of my gaming time), DLSS 3, Frame Insertion and NV Reflex have practically paid for the card already.

Keep the core cool and the VRAM warm - and enjoy !


----------



## jonnyzeraa

Hello guys, my 4090 AMP Extreme will be delivered tomorrow. Since this topic is close to 400 pages and I currently don't have much free time to read each page, can anyone point me to a guide or tips for extracting the maximum performance from this card? Or is the best way to go an undervolt or something? Thanks!!


----------



## 8472

yzonker said:


> Speaking of,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> NVIDIA Could Give TITAN RTX Another Swing as Maxed-Out AD102 in an Unabashed 4-slot Monstrosity
> 
> 
> A report by Moore's Law is Dead claims that NVIDIA is preparing to launch a new TITAN RTX halo product, based on a maxed-out 4 nm "AD102" silicon. Where does this put the RTX 4090 Ti? Somewhere in between the RTX 4090 and the TITAN RTX Ada, as NVIDIA gave itself plenty of segmentation headroom...
> 
> 
> 
> 
> www.techpowerup.com


They won't release that until 4090s stop selling out. Even then it'll probably only be available at best buy, so good luck getting one.


----------



## man from atlantis

even pumping +2000MHz on VRAM still increases port royal score on my card.


| CCLK | MCLK bump | Port Royal | Perf |
|------|-----------|------------|------|
| 2700 | +0    | 25100 | 100% |
| 2700 | +500  | 25286 | 101% |
| 2700 | +750  | 25460 | 101% |
| 2700 | +1000 | 25562 | 102% |
| 2700 | +1100 | 25562 | 102% |
| 2700 | +1200 | 25704 | 102% |
| 2700 | +1300 | 25701 | 102% |
| 2700 | +1400 | 25767 | 103% |
| 2700 | +1500 | 25785 | 103% |
| 2700 | +1600 | 25837 | 103% |
| 2700 | +1700 | 25896 | 103% |
| 2700 | +1800 | 25878 | 103% |
| 2700 | +1900 | 25919 | 103% |
| 2700 | +2000 | 25941 | 103% |
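The Perf column can be double-checked from the raw scores; a quick sketch over a subset of the runs, with the +0 MHz run as the baseline:

```python
# Recomputing relative performance from the raw Port Royal scores:
scores = {0: 25100, 500: 25286, 1000: 25562, 1500: 25785, 2000: 25941}
base = scores[0]
for offset, score in scores.items():
    print(f"+{offset:>4} MHz: {score} ({score / base:.1%})")
```

Even at +2000 the total gain is only ~3.4%, so the curve is flattening but still hasn't turned over.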


----------



## Lmah2x

jonnyzeraa said:


> Hello guys, my 4090 AMP Extreme will be delivered tomorrow, since this topic have close to 400 pages and currently i don't have much free time for read each page, is anyone can tell me a guide or tips for extract the maximum performance this card can give to me? or the best way to go is made a undervolt or something? thanks!!


Overclocking the VRAM offers the most performance. Then I would say raising the power target. Last would be the GPU core.

Raising the power target on an undervolt also helps on these cards.


----------



## Krzych04650

Frame Gen is such a game changer in Witcher 3 next-gen; it doubles performance in CPU-bound scenarios.









In GPU-bound scenarios it's more like 35-50% though, so a lot less.

Also, ReBAR has huge gains for me.









The game still runs like a disaster though; it feels like it's running on an emulator. Which it actually kind of is, since it's using DX11-to-DX12 translation instead of native DX12, hence the massive CPU issues.
But it's not even raw performance that's the problem: frametimes are terrible, it takes ages for everything to appear after a loading screen, the game needs about 3 seconds to recover after you quit the map or journal or any other screen, and the frametime graph then looks like a damn step drill.









I don't think this game can be fixed; it is so fundamentally broken that it will always run like that.


----------



## 681933

chibi said:


> How old is your PSU? Just submit an rma claim to get a new one. State you’re hitting ocp due to newer gen gpu.


Replying to me? I don't have the 4090 yet so I don't know, the PSU is merely 1 year old, it is a Seasonic Prime PX-750. I want to know if the 12V sensing shutdown issues are still present in RTX 40 GPUs with the new adapter.


----------



## doom26464

Krzych04650 said:


> Frame Gen is such a game changer in Witcher 3 Next-gen, it doubles performance in CPU bound scenarios.
> View attachment 2589849
> 
> 
> In GPU bound scenarios it more like 35-50% though, so a lot less.
> 
> Also ReBar has huge gains for me
> View attachment 2589850
> 
> 
> Game still runs like a disaster though, it feels like it is running on an emulator. Which it actually kind of is as it is using DX11 to DX12 translation instead of actual DX12, hence why such a massive CPU issues.
> But is not even raw performance that is the problem, frametimes are terrible, it takes ages for everything to appear after loading screen, game needs like 3 seconds to recover after you quit the map or journal or any other screen and the framtime then looks like a damn step drill.
> View attachment 2589851
> 
> 
> I don't think this game can be fixed; it is so fundamentally broken that it will always run like that.


I noticed when playing around with it that even on my 4090 I'm getting ~90 fps but experiencing some heavy stutters when running around. It still feels semi-broken, which sucks, as the original Witcher 3 ran like butter after patch 1.6 (I think?).

Also, it's not constant stutter; it runs smooth for about 30 seconds, then out of the blue there's one big-ass stutter, then it's back to smooth.


----------



## ttnuagmada

man from atlantis said:


> even pumping +2000MHz on VRAM still increases port royal score on my card.
> 
> 
> | CCLK | MCLK bump | Port Royal | Perf |
> |------|-----------|------------|------|
> | 2700 | +0    | 25100 | 100% |
> | 2700 | +500  | 25286 | 101% |
> | 2700 | +750  | 25460 | 101% |
> | 2700 | +1000 | 25562 | 102% |
> | 2700 | +1100 | 25562 | 102% |
> | 2700 | +1200 | 25704 | 102% |
> | 2700 | +1300 | 25701 | 102% |
> | 2700 | +1400 | 25767 | 103% |
> | 2700 | +1500 | 25785 | 103% |
> | 2700 | +1600 | 25837 | 103% |
> | 2700 | +1700 | 25896 | 103% |
> | 2700 | +1800 | 25878 | 103% |
> | 2700 | +1900 | 25919 | 103% |
> | 2700 | +2000 | 25941 | 103% |


While the added cache mostly makes up for not increasing the bandwidth at all, there was 0 chance it was going to completely mask it. These things are definitely bandwidth starved to a degree.


----------



## mirkendargen

doom26464 said:


> I noticed when playing around with it even on my 4090 I'm getting like 90ish fps but experience some heavy stutters when running around. It feels still semi broken. Sucks as the original witcher 3 ran like butter after patch 1.6(I think?)
> 
> Also it's not like constant stutters, more like it will run smooth for about 30seconds then out of the blue feels like one big ass stutters then back to smooth.


The patch for it the other day fixed that for me.


----------



## J7SC

...last bit of fun before the Christmas-festivities home-office clean-up 🤪 ... the 4090 left just a bit on the table re. clocks, but there's always post-cleanup fun to be had.

I also added (older) results for the 6900XT and 2x 2080 Ti below (normally systems used for work). Time to re-run those before the clean-up after New Years...


----------



## SilenMar

pat182 said:


> i have the PG32UQX, so almost the same gaming HDR experience. I wouldn't trade a little local-dimming bloom with 1800-nit peaks for a 300-nit OLED experience lol
> kinda miss my PG27UQ too, it was a good monitor with so much PPI, 32 inch is nice tho


Windows Auto HDR + 4090 in any game such as Mirror's Edge at 4K 144Hz is a treat on the PG32UQX. There are 1,600-nit highlights everywhere to showcase more realistic contrast.

There is no other way to create high brightness and a wide color gamut when a display can only render a 400-nit sun as brown as a ping-pong ball. A display cannot show the same contrast with a 400-nit sun against a 0.1-nit shadow (4000:1) as with a 2000-nit sun against the same shadow (20000:1). It won't show as much color either: the eye can't resolve all 2^10 = 1024 shades of 10-bit color when they are cramped under 400 nits instead of 1000 nits. This is why Dolby blocked its tone mapping and license on sub-HDR displays like the C1 in Netflix; it would just look worse than plain HDR10 static-clipping content, and can even look worse than 400-nit SDR. Chances are some people who work on OLED don't understand HDR: their way of interpreting contrast is limited, and color is ignored as well. What OLED can show is still basically SDR plus a few dots on the iceberg of HDR.

A true HDR display needs at least 1,000 nits with no ABL, full EOTF tracking, 85% Rec.2020, and 1,000,000:1 contrast. Not many displays can meet those requirements, and not many Dolby Vision displays are out there. An OLED can meet only one of them, and only on the condition of a near-0-nit black background, which is just relative contrast where the image can carry very limited color.
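The contrast figures in that comparison work out as follows (illustrative arithmetic only; the nit values are the ones used above, and the helper name is mine):

```python
# Contrast ratio is peak luminance divided by black luminance, so raising
# the highlight against the same shadow multiplies the ratio directly.
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    return peak_nits / black_nits

print(round(contrast_ratio(400, 0.1)))   # 4000:1  (400-nit sun vs 0.1-nit shadow)
print(round(contrast_ratio(2000, 0.1)))  # 20000:1 (2000-nit sun, same shadow)
```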


----------



## SilenMar

Krzych04650 said:


> Frame Gen is such a game changer in Witcher 3 Next-gen, it doubles performance in CPU bound scenarios.


Frame Generation/DLSS 3 only increases the frame rate; the input lag is massive.
Total latency in Witcher 3 patch 4.00 is around 90ms, the same as running at 40fps, while the old 1.32 build has just 6.6ms rendering latency at 4K 144Hz.

The stutter issue can be solved by G-Sync.

Overall there is no problem playing either version; it's just that the next-gen update is not meant for current-gen hardware. What I care about is the story. Graphics can be outdated, the combat system can be outdated, but the story won't be. They should've put more effort into the quests.

I do like the asymmetric view in 4.00 though.
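For reference, the frame-rate/frame-time conversions behind comparisons like this (a sketch; only the 144Hz and 40fps figures come from the post, and end-to-end latency normally stacks several frame times, not one):

```python
# Convert between frame rate and per-frame render time, the arithmetic
# behind "90 ms total latency feels like 40 fps" style comparisons.
def frametime_ms(fps: float) -> float:
    """Milliseconds needed to render one frame at the given frame rate."""
    return 1000.0 / fps

def fps_for_frametime(ms: float) -> float:
    """Frame rate whose per-frame render time equals the given time."""
    return 1000.0 / ms

print(frametime_ms(144))  # ~6.9 ms per frame at 144 fps
print(frametime_ms(40))   # 25.0 ms per frame at 40 fps
```

Total input-to-photon latency chains several of these frame times together, which is why a 90ms chain can coexist with a much higher displayed frame rate.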


----------



## mirkendargen

SilenMar said:


> Windows Auto HDR + 4090 in any games such Mirror's edge at 4K 144Hz is a treat


AutoHDR is indeed awesome and is the best reason to go to Windows 11. Make sure to run the HDR Calibration app from the Microsoft Store, because AutoHDR uses the config it produces. In my experience cel-shaded games look the best with it: Persona 5, etc. It also does HDR in Cyberpunk better than the broke-ass in-engine implementation.


----------



## yzonker

mirkendargen said:


> AutoHDR is indeed awesome and is the best reason to go to Windows 11. Make sure to run the HDR calibration app from the Microsoft store because the config from it is what AutoHDR uses. My experience is cell shaded games look the best with it. Persona 5, etc. It also does HDR in Cyberpunk better than the broke ass in-engine implementation.


Thanks for the tip. Did not know about the calibration app.


----------



## StreaMRoLLeR

KedarWolf said:


> Negative LOD BIAS tweaks are cheats and every benchmark should invalidate them.


don't tell me PR is plagued with the -3 LOD bias cheat as well


----------



## yzonker

StreaMRoLLeR said:


> dont tell me PR is plagued with -3 lod bias cheat aswell


No, 3DMark flags it and invalidates the result.


----------



## StreaMRoLLeR

mirkendargen said:


> AutoHDR is indeed awesome and is the best reason to go to Windows 11. Make sure to run the HDR calibration app from the Microsoft store because the config from it is what AutoHDR uses. My experience is cell shaded games look the best with it. Persona 5, etc. It also does HDR in Cyberpunk better than the broke ass in-engine implementation.


+1

The in-game calibration tool has matched perfect black level and clip point for my G1 OLED since day-1 release. BUT for Auto HDR I think the game also must support it. For example, Dying Light 2 has an Auto HDR "tag" that activates in the menu and notifies you it's working. Some games don't have this tag, hence worse results than SDR.


----------



## HeLeX63

Hey guys. I purchased the MSI RTX 4090 Suprim X along with an Alphacool full-cover waterblock, to be connected to a loop with 560mm and 480mm radiators cooling a monoblock and the GPU itself. I saw that the Suprim X has a BIOS power limit of 520W. Since I am watercooling (which gains me some efficiency, and I drop the fans' power draw as well), I'd say I have the equivalent of roughly 540W. Is the 60W shortfall to 600W detrimental???

Do I need that 60 watts or so to OC?


----------



## SilenMar

mirkendargen said:


> AutoHDR is indeed awesome and is the best reason to go to Windows 11. Make sure to run the HDR calibration app from the Microsoft store because the config from it is what AutoHDR uses. My experience is cell shaded games look the best with it. Persona 5, etc. It also does HDR in Cyberpunk better than the broke ass in-engine implementation.


I don't need to use that tool, as it is still limited; the config only reads from firmware. There is no need for Windows HDR Calibration if the monitor firmware already tracks EOTF accurately; it will look the same after calibration. Only a monitor that alters EOTF needs such tools. The actual HDR settings such as Maximum Content Light Level are locked, and on other monitors there is a chance of calibrating a fake HDR 1000 down into an actual HDR 400.

Windows Auto HDR corresponds to the dynamic range of the monitor. If a monitor is capable of HDR 1400, SDR will be scaled to HDR 1400 with higher contrast and a wider gamut. This means if the monitor is good at HDR, Auto HDR will look good; if the monitor is bad at HDR, Auto HDR will look bad. The content itself doesn't matter much.

And the HDR in Cyberpunk is pretty good: highlights get over 2000 nits, low light is near 0 nits, and it's easily 800+ nits APL in daylight.


----------



## alasdairvfr

mirkendargen said:


> The patch for it the other day fixed that for me.


The patch somewhat resolved it for me but new scenes take a few seconds of stuttering before it smooths out. Improved from non-stop stuttering for sure, but still not acceptable since it ran silky smooth before. I was looking at this remaster as an excuse to replay the game, maybe I'll play it on the previous version as it still looked fantastic and ran so smoothly. The improvements don't justify the jank for me.

I'll wait a few months before calling it a day with the remaster.


----------



## Nico67

Finally lapped the EK Strix block and got it all up and working. The original Strix mount was pretty bad, and my RAM pretty much won't do better than +1000, but the waterblock mount is much better and the cooling is giving me 120 more MHz at much less power draw and temp.

I'll add some more pics, but here's the cooling now,










21C water temp and 38C max, for a 17C delta at 496W, with about a 7C hotspot delta.

original cooler/card pics showing a tiny dry contact patch in the centre


















bad pic of lapped ek block 










EK block/card contact is much better; tightened the left screw a bit more, so it should be even better now.


















card ready, pasted, with narrow pre-squished pads










system, for interest's sake


----------



## inedenimadam

SilenMar said:


> Frame Generator/DLSS 3.0 only increases the frame rate. The input lag is massive.
> The total latency in Witcher 4.00 is around 90ms the same as running at 40fps while the old 1.32 Witcher has just 6.6ms rendering latency at 4K 144Hz.
> 
> Stutter issue can be solved by G-sync.


The input lag is noticeable, but I'm still using frame generation anyway. Single-player with a controller, so I can get used to the delay. This implementation seems better than it was in A Plague Tale; I am not getting noticeable artifacts this time at least.


----------



## J7SC

inedenimadam said:


> The input lag is noticeable, but im still using frame generation anyway. Single player w/controller, so I can get used to the delay. This implementation seems better than it was in plague tale. I am not getting noticeable artifacts this time at least.


FYI... in FS 2020 they recently added not only DLSS 3 and frame generation, but now also NV Reflex - it really works


----------



## mirkendargen

inedenimadam said:


> The input lag is noticeable, but im still using frame generation anyway. Single player w/controller, so I can get used to the delay. This implementation seems better than it was in plague tale. I am not getting noticeable artifacts this time at least.


You can see ever so slight artifacts around the text of subtitles while also moving, but you have to really look for it to notice it. Doesn't detract from the game at all.


----------



## mouacyk

Nico67 said:


> Finally, Lapped the EK Strix block and got it all up and working. The original Strix mount was pretty bad, and my ram pretty much won't do better than +1000, but the waterblock mount is much better and the cooling is giving me 120 more mhz at muchless power draw and temp.
> 
> I'll add some more pics, but here's the cooling now,
> 
> View attachment 2589946
> 
> 
> 21c watertemp and 38c max for a17c delta at 496w with about 7c hotspot delta.
> 
> orginal cooler/ card pics showing tiny dry contact in centre
> 
> View attachment 2589954
> 
> View attachment 2589955
> 
> 
> bad pic of lapped ek block
> 
> View attachment 2589958
> 
> 
> ek block / card contact is much better, tightened to left screw a bit more so should be even better now.
> 
> View attachment 2589959
> 
> View attachment 2589960
> 
> 
> card ready pasted with narrow pre-squished pads
> 
> View attachment 2589963
> 
> 
> system for interests sake
> 
> View attachment 2589964


That's hardcore. Why not liquid metal it also, for additional cooling and clocks?


----------



## motivman

So I got a water chiller, but it seems like it's only good for short benchmark runs. I set it to 55F and ran the Port Royal stress test; after 20 runs, water temperature got to 89F, lol. That is about an 8C delta-T above my ambient temps. I get the same performance from my Mora 360 LT with fans running at 1000 rpm. Maybe I need one of those 1/2 HP water chillers??


----------



## KingEngineRevUp

motivman said:


> So I got waterchiller, but seems like its only good for short benchmark runs. I set it to 55F and ran Port Royal Stress test, after 20 runs, Water temperature got to 89F, lol. This is about 8C delta T from my ambient temps. I get the same performance from my Mora 360 LT with fans running at 1000 rpm. Maybe I need one of those 1/2HP waterchillers??


If the COP is 2, then it removes 2 watts of heat for every 1 watt of electricity spent. You need a 300-400W aquarium chiller, I believe.
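A back-of-envelope version of that sizing logic (a sketch with assumed numbers: the COP of 2 is the example figure above, and the 560W GPU / 90W CPU split is borrowed from the loads mentioned in this thread, not a measurement):

```python
# A chiller only holds its setpoint if its cooling capacity at least
# matches the heat dumped into the loop; COP ties capacity to power draw.
def required_input_watts(heat_load_w: float, cop: float) -> float:
    """Electrical power the chiller must draw to remove heat_load_w."""
    return heat_load_w / cop

loop_heat = 560 + 90  # ~560 W GPU plus ~90 W of CPU heat in the loop
print(required_input_watts(loop_heat, cop=2.0))  # 325.0 W at the wall
```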


----------



## motivman

KingEngineRevUp said:


> If the COP is 2, then it removes 2 watts of heat for 1 watt spent. You need a 300-400W aquarium chiller I believe.


i got one of those chinese-made 1/3 HP chillers... it can't hold temps at all, haha.









Amazon.com : NA Aquarium Chiller 79Gal 1/3 HP Water Chiller for Hydroponics System Low Noise for Fish Tank Axolotl Coral Reef Tank 300L : Pet Supplies





www.amazon.com


----------



## Mad Pistol

First serious attempt at benchmarking and... this Gigabyte Windforce actually got some balls.

+100 voltage
+225 Core
+1500 Mem
106% power (max)
85% fan speed



















Unfortunately, I think my B450 PCIe 3.0 interface is holding me back now... damn.


----------



## J7SC

I think I found the core's top speed (via MSI AB). Speaking of MSI AB, I also tried out the Galax HOF AI software; not bad! I just wish the VRAM voltage controller worked on my card, which has 4 phases for the VRAM but a different controller IC. One of these days I'll find voltage control for the VRAM (I hope). Or do a hardware mod?


----------



## Krzych04650

schoolofmonkey said:


> Which one you using, because I gave up on my LG C2 42" due to the horrible VRR flicker in low light scenes.


Both WOLED and QD-OLED have this issue inherently, no way around it. Also it isn't just low light scenes, it is most visible in low light scenes but average scenes usually have darker parts too. People are in denial about it since OLED basically became a cult, but it is a fact and it sucks because it makes all current OLEDs unsuitable for gaming. And the worst part about it is that this issue has been known and acknowledged by manufacturers for like 5 years, and yet new iterations of WOLED keep coming out with this issue unresolved, and brand new QD-OLED tech also came out with exact same issue, so it is really inherent to the OLED tech itself and hopes for that ever getting fixed are growing thin. 



motivman said:


> So I got waterchiller, but seems like its only good for short benchmark runs. I set it to 55F and ran Port Royal Stress test, after 20 runs, Water temperature got to 89F, lol. This is about 8C delta T from my ambient temps. I get the same performance from my Mora 360 LT with fans running at 1000 rpm. Maybe I need one of those 1/2HP waterchillers??


I was only loosely considering a chiller, but from what I've gathered back then you do need 1/2 HP for setups like this. And this kind of a chiller draws almost 400W from the wall and costs like 600 EUR, so I went with MORA instead. Just for benching though it would certainly be fun, it is like winter on demand


----------



## J7SC

Krzych04650 said:


> ...
> 
> I was only loosely considering a chiller, but from what I've gathered back then you do need 1/2 HP for setups like this. And this kind of a chiller draws almost 400W from the wall and costs like 600 EUR, so I went with MORA instead. Just for benching though it would certainly be fun, it is like winter on demand


...these 4090s are different, though, IMO, than the previous gens re. cooling vs performance. Core is mostly voltage-limited and while cooling helps with speed-step losses a little bit, VRAM needs to be warmer. With a 1320 x 63 mm w-cooling setup, I usually only see one speed-step loss. Benching it with LN2 or even DICE would be fun, though, as long as it has a VRAM heater.

There is also the question of upcoming HEDT systems from AMD and Intel; by the time all is said and done with one of those plus a 4090+, you're probably looking at over 1000W of heat energy to get rid of during OC'ed benching. That is really nothing new... we have a real cold spell (for our region) with something like -12C, so I did some benching with my four-year-old Threadripper (OC'd at 340W) and 2x 2080 Ti Gigabyte Xtr WF WB (380W each)... Superposition 8K below, but also ran Heaven on multiple cycles... that 1,100W warmed the room up quickly (no kidding) 😀


----------



## Krzych04650

J7SC said:


> ...these 4090s are different, though, IMO, than the previous gens re. cooling vs performance. Core is mostly voltage-limited and while cooling helps with speed-step losses a little bit, VRAM needs to be warmer. With a 1320 x 63 mm w-cooling setup, I usually only see one speed-step loss. Benching it with LN2 or even DICE would be fun, though, as long as it has a VRAM heater.


Yea, they are, and running water like 10C above ambient is probably better for a 4090 than 10C below ambient with a chiller, since the VRAM really doesn't like temperatures that low, and scaling with core clock is not great. It is quite bizarre how it works this time around.

Still, if you wanted a set temperature maintained on the chiller under continuous load with a 4090 in the loop, 1/4 HP cannot keep up; it will end up with worse water temps than a MORA while burning all that power.



J7SC said:


> There is also the question of upcoming HEDT systems from AMD and Intel - by the time all is said and done re. one of those and a 4090+, you're probably looking at over 1000 W of heat energy to get rid off during oc'ed benching. That is really nothing new...we have a real cold spell (for our region) with s.th. like - 12 C, so I did some benching with my four-year old Threadripper (oc at 340 W) and 2x 2080 Ti Gigabyte Xtr WF WB (380W each)... Superposition 8K below, but also ran Heaven on multiple cycles...that 1,100 W warmed the room up quickly (no kidding) 😀
> View attachment 2589991


Yea, I also had 2x 2080 Ti on X99 HEDT; I managed to get a 1120W power reading from the wall once, and the AX1600i was even showing spikes up to 1300W in its software. I had to upgrade to it from a 1200W Seasonic Prime Platinum because that one couldn't handle the transient spikes the system had even in games. And that was with the 380W BIOS on the 2080 Tis, which sometimes wasn't exactly enough for them at 1093mV; it could have been even worse. Fun times.

Putting in two 4090s with upcoming HEDT would really test this power supply


----------



## Blameless

Crylune said:


> Replying to me? I don't have the 4090 yet so I don't know, the PSU is merely 1 year old, it is a Seasonic Prime PX-750. I want to know if the 12V sensing shutdown issues are still present in RTX 40 GPUs with the new adapter.


There should be no shutdown issues on that model of PSU. The OCP and OPP margins are significant.









Seasonic PRIME Series 750 W Review


We evaluate Seasonic's Prime 750 unit, which boasts amazing performance, including 80 PLUS Titanium efficiency, a set of interesting features, and a unique look. Seasonic has an impressive Titanium entry on their hands that will make other OEMs feel uncomfortable because it raises the...




www.techpowerup.com


----------



## 8472

The corsair cables are in stock as of a few minutes ago.


----------



## SPL Tech

what you need is 10 MOARs with 5000 RPM fans. should be fine then.


----------



## mirkendargen

SPL Tech said:


> what you need is 10 MOARs with 5000 RPM fans. should be fine then.


You gotta put a MORA in the freezer, duh.


----------



## StreaMRoLLeR

J7SC said:


> I think I found the core top speed (via MSI AB). Speaking of MSI AB, I also tried out the Galax HoF AI software; not bad ! ...I just wish that VRAM voltage controller would work on my card, which has 4 phases for the VRAM but a different IC controller. One of these days, I'll find voltage control for VRAM (I hope). Or do a hardware mod ?
> View attachment 2589988


Hmm, I wonder what the memory and GPU voltage-controller ICs are on the HOF; if it's the Monolithic Power Systems MP2891, then Suprim owners should be able to take control of it.


----------



## schoolofmonkey

Krzych04650 said:


> Both WOLED and QD-OLED have this issue inherently, no way around it. Also it isn't just low light scenes, it is most visible in low light scenes but average scenes usually have darker parts too. People are in denial about it since OLED basically became a cult, but it is a fact and it sucks because it makes all current OLEDs unsuitable for gaming. And the worst part about it is that this issue has been known and acknowledged by manufacturers for like 5 years, and yet new iterations of WOLED keep coming out with this issue unresolved, and brand new QD-OLED tech also came out with exact same issue, so it is really inherent to the OLED tech itself and hopes for that ever getting fixed are growing thin.


See, I'm glad there are people like yourself who understand; on another forum I was basically told I was crazy and my C2 was "faulty", until I got them to run a VRR flicker test.

Don't get me wrong, my C2 42" is worse than the lounge room's C2 55" my wife uses for gaming (3090), but we have different brightness settings etc, and both have VRR on.

The OLEDs look fantastic, but they're just not truly suited to PC gaming with VRR. The 32" Odyssey G7 IPS has been more or less perfect, other than the stupid scan-line issue most of the newer Samsung screens suffer from; at least I can use G-Sync and not feel like I'm in a disco, and IPS still looks fine.


----------



## 681933

Blameless said:


> There should be no shutdown issues on that model of PSU. The OCP and OPP margins are significant.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Seasonic PRIME Series 750 W Review
> 
> 
> We evaluate Seasonic's Prime 750 unit, which boasts amazing performance, including 80 PLUS Titanium efficiency, a set of interesting features, and a unique look. Seasonic has an impressive Titanium entry on their hands that will make other OEMs feel uncomfortable because it raises the...
> 
> 
> 
> 
> www.techpowerup.com


I sure hope so.


----------



## yzonker

Krzych04650 said:


> Both WOLED and QD-OLED have this issue inherently, no way around it. Also it isn't just low light scenes, it is most visible in low light scenes but average scenes usually have darker parts too. People are in denial about it since OLED basically became a cult, but it is a fact and it sucks because it makes all current OLEDs unsuitable for gaming. And the worst part about it is that this issue has been known and acknowledged by manufacturers for like 5 years, and yet new iterations of WOLED keep coming out with this issue unresolved, and brand new QD-OLED tech also came out with exact same issue, so it is really inherent to the OLED tech itself and hopes for that ever getting fixed are growing thin.
> 
> 
> 
> I was only loosely considering a chiller, but from what I've gathered back then you do need 1/2 HP for setups like this. And this kind of a chiller draws almost 400W from the wall and costs like 600 EUR, so I went with MORA instead. Just for benching though it would certainly be fun, it is like winter on demand


That's completely incorrect regarding the chiller. As I've said before, my 1/4 hp chiller can hold mid teens at gaming loads in the 500-600w range. The one @motivman bought must either be defective or it's spec'ed higher than it actually performs.


----------



## StreaMRoLLeR

What is your effective clock in Dying Light 2, guys? At 1.1V and 3000-ish (4K DLSS Quality).

Mine is staying near 2860.


----------



## yzonker

Chiller test. Water temp started around 14-15C. Just stayed there for the run. If I start higher, it'll slowly creep down and still equalize somewhere near there. Gave me a "not passed" probably because I was doing a bunch of other stuff at the same time and the framerate dropped on one or more loops.









I scored 0 in Speed Way Stress Test


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com







Spoiler: Pics


----------



## Glottis

Krzych04650 said:


> Both WOLED and QD-OLED have this issue inherently, no way around it. Also it isn't just low light scenes, it is most visible in low light scenes but average scenes usually have darker parts too. People are in denial about it since OLED basically became a cult, but it is a fact and it sucks because it makes all current OLEDs unsuitable for gaming. And the worst part about it is that this issue has been known and acknowledged by manufacturers for like 5 years, and yet new iterations of WOLED keep coming out with this issue unresolved, and brand new QD-OLED tech also came out with exact same issue, so it is really inherent to the OLED tech itself and hopes for that ever getting fixed are growing thin.


Let's see.

VRR flicker
ABL auto brightness limiter
APL average picture level dimming
Black/shadow crush
Burn-in
Near black chrominance overshoot
Have to use pixel shifting (making desktop and PC usage look like crap)
Have to use logo dimming (making game UI look like crap)
Pink tint lottery
Vertical banding lottery
Have to disable desktop icons, hide taskbar, resort to other "babying" techniques.
Did I miss anything?

OLED sure is amazing. I probably would have burned 2 OLEDs by now considering how much I played Diablo II Resurrected on my TV


----------



## Krzych04650

StreaMRoLLeR said:


> What is your effective clock at dying light 2 guys ? At 1.1 and 3000 ish ( 4K dlss quality )
> 
> Mine is staying near 2860


Around 2960, although this is thanks to Galax 666W BIOS. All other BIOSes that I've tried would drop to around 2840-2880 in RT games with 3000 set clock, so this is an expected behavior for most cards with stock BIOS I think.











Glottis said:


> Let's see.
> 
> VRR flicker
> ABL auto brightness limiter
> APL average picture level dimming
> Black/shadow crush
> Burn-in
> Near black chrominance overshoot
> Have to use pixel shifting (making desktop and PC usage look like crap)
> Have to use logo dimming (making game UI look like crap)
> Pink tint lottery
> Vertical banding lottery
> Have to disable desktop icons, hide taskbar, resort to other "babying" techniques.
> Did I miss anything?
> 
> OLED sure is amazing. I probably would have burned 2 OLEDs by now considering how much I played Diablo II Resurrected on my TV


Yea that is the thing, OLED has some amazing qualities, and it does fix some long-standing LED LCD issues like low contrast and glow/bleed, and it can look absolutely astounding in some demos, but then real-world usage hits and it turns out that for every one issue that it fixes it introduces three new ones. It is great as a TV for movies and shows exclusively since almost none of the flaws matter in this case and the qualities that it has are especially desirable there, but it definitely doesn't stand the test of real-world desktop usage.


----------



## yzonker

One thing I'm noticing is that the OLED owners don't seem to feel the need to make posts to justify their purchase. Lol.


----------



## doom26464

After playing around with the VRAM OC yesterday, I got all the way up to +2000 with the slider maxed. No artifacting, and my graphics score kept going up in Time Spy. Ended with a 39218 graphics score.

Not sure if this is normal? Or maybe I got really good RAM?

Dialed it back to +1900 to be safe, but tested a bunch of games on it with no crashes, no artifacts, and no major drop in min-fps results even in the Cyberpunk benchmark.


----------



## alasdairvfr

Another Witcher 3 hotfix. Last one was an improvement but not quite there. Will check this one out later on.

Small download, high diskspace churn though!


----------



## jootn2kx

doom26464 said:


> After playing around with vram OC yesterday I got all the way up to +2000 with slider maxed. No artifacting and my graphic score kept going up in timespy. Ended with a 39218 graphic score.
> 
> Not sure if this is normal? Or maybe I got really good ram?
> 
> 
> Dialed it back to +1900 to be sure but tested a bunch of games on It and no crashes, artifacts or even in cyberpunk benchmark no major drop to min fps results.


I don't know, I wouldn't recommend going above +1500.
Mine does the same: +1900 to +2000 without artifacting in games and benchmarks and without losing performance.
But suddenly, after a month, it gave weird artifacts over the whole screen while I was randomly browsing the internet, then crashed and the PC rebooted.
Then a couple of days later the same thing happened; it was looking pretty scary.
Dialed back to +1500 to be safe and since then no more random artifacts/crashes.


----------



## lawson67

doom26464 said:


> After playing around with vram OC yesterday I got all the way up to +2000 with slider maxed. No artifacting and my graphic score kept going up in timespy. Ended with a 39218 graphic score.
> 
> Not sure if this is normal? Or maybe I got really good ram?
> 
> 
> Dialed it back to +1900 to be sure but tested a bunch of games on It and no crashes, artifacts or even in cyberpunk benchmark no major drop to min fps results.


Same on my Zotac Extreme: VRAM scales all the way to +2000MHz without any artifacts, even though past +1950MHz I get errors in the Vulkan memory test. It still scales for that extra 50MHz, however.


----------



## jootn2kx

lawson67 said:


> The same on My Zotac Extreme Vram scales all the way to 2000mhz without any artifacts even though after 1950mhz i get errors with the Vulcan memory test, however it still scales for that extra 50mhz


Interesting, which tool are you using for the memory test?


----------



## man from atlantis

doom26464 said:


> After playing around with vram OC yesterday I got all the way up to +2000 with slider maxed. No artifacting and my graphic score kept going up in timespy. Ended with a 39218 graphic score.
> 
> Not sure if this is normal? Or maybe I got really good ram?
> 
> 
> Dialed it back to +1900 to be sure but tested a bunch of games on It and no crashes, artifacts or even in cyberpunk benchmark no major drop to min fps results.


Same for me, but I got visual errors in Metro Exodus EE while it passes without a problem in CP77. I dialed back to +1500MHz; it reads nicely on the RTSS overlay as 3000MHz as well, lol.

| CCLK | MCLK | Bump | CP77 | Perf |
|------|------|------|------|------|
| 2700 | 2625 | +0    | 49,25 | 100% |
| 2700 | 2750 | +500  | 49,65 | 101% |
| 2700 | 2813 | +750  | 49,97 | 101% |
| 2700 | 2875 | +1000 | 50,13 | 102% |
| 2700 | 2900 | +1100 | 50,22 | 102% |
| 2700 | 2925 | +1200 | 50,31 | 102% |
| 2700 | 2951 | +1300 | 50,44 | 102% |
| 2700 | 2975 | +1400 | 50,64 | 103% |
| 2700 | 3000 | +1500 | 50,81 | 103% |
| 2700 | 3025 | +1600 | 50,74 | 103% |
| 2700 | 3050 | +1700 | 50,79 | 103% |
| 2700 | 3076 | +1800 | 50,84 | 103% |
| 2700 | 3101 | +1900 | 50,95 | 103% |
| 2700 | 3125 | +2000 | 51,00 | 104% |
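Incidentally, the reported MCLK tracks the offset almost exactly as base + offset/4. This is just a fit to the posted numbers (a few rows look rounded to the nearest MHz), not any documented rule, and the function name is mine:

```python
# Fit observed in the posted data: MCLK ~= 2625 + (Afterburner offset) / 4.
def mclk_from_offset(offset: float, base: float = 2625.0) -> float:
    return base + offset / 4.0

for off in (0, 500, 1000, 1500, 2000):
    print(off, mclk_from_offset(off))  # e.g. +2000 -> 3125.0 MHz
```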


----------



## yzonker

Hey @J7SC , must be nice living in that warm climate up there.  That's Fahrenheit BTW.


----------



## motivman

yzonker said:


> Chiller test. Water temp started around 14-15C. Just stayed there for the run. If I start higher, it'll slowly creep down and still equalize somewhere near there. Gave me a "not passed" probably because I was doing a bunch of other stuff at the same time and the framerate dropped on one or more loops.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 0 in Speed Way Stress Test
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Pics
> 
> 
> 
> 
> View attachment 2590021
> 
> 
> View attachment 2590022


sheesh, is my chiller garbage then? I do not get what I am doing wrong.


----------



## motivman

yzonker said:


> Chiller test. Water temp started around 14-15C. Just stayed there for the run. If I start higher, it'll slowly creep down and still equalize somewhere near there. Gave me a "not passed" probably because I was doing a bunch of other stuff at the same time and the framerate dropped on one or more loops.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 0 in Speed Way Stress Test
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Pics
> 
> 
> 
> 
> View attachment 2590021
> 
> 
> View attachment 2590022


Can you repeat the same test with your maximum stable overclock for Port Royal? 20 loops? Also, what is the actual flow rate of your system with the chiller? I noticed that if I reduce my flow rate from 300 L/h to around 125 L/h, the chiller is able to hold colder temps longer. My chiller went up 25F (55F to around 80F) over 20 loops of Port Royal; the card was outputting about 560W, not including an additional 80-100W from my CPU.


----------



## yzonker

motivman said:


> sheesh, is my chiller garbage then? I do not get what I am doing wrong.


Here's the clue: my chiller is rated at almost 5 amps, while the listing for yours says:

Rated current: 1.6A






Active Aqua Chiller with Power Boost, 1/4 HP


Active Aqua Chiller, 1/4 HP Boost




www.hydrofarm.com





No idea about flow. I don't have a flow meter.


----------



## motivman

yzonker said:


> Here's the clue. My chiller is rated at almost 5 amps.
> 
> Rated current: 1.6A
> 
> 
> 
> 
> 
> 
> Active Aqua Chiller with Power Boost, 1/4 HP
> 
> 
> Active Aqua Chiller, 1/4 HP Boost
> 
> 
> 
> 
> www.hydrofarm.com
> 
> 
> 
> 
> 
> No idea about flow. I don't have a flow meter.


not sure what this means. How does that compare to the generic chiller I bought from Amazon? I thought it was all in the HP rating. The one I purchased is rated at 1/3 hp, which is technically more powerful than your 1/4 hp chiller, but it looks like that is not the case, lol.


----------



## yzonker

motivman said:


> not sure what this means. How does that compare to the generic chiller I bought from amazon? I thought it was all in the HP rating. The one I purchased is rated at 1/3 hp, which is technically more powerful that your 1/4hp chiller, but looks like that is not the case, lol.


Well, those hp ratings seem to be only loosely tied to actual capacity. You can see that when comparing my 1/4 hp unit to the 1/2 hp model: the 1/2 hp model only really has about 33% more capacity (3010 vs 4020 BTU/hr).

I honestly don't see what else could be the issue unless you just got a defective unit.
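Side note for anyone sizing a chiller: the BTU/hr rating is the number to compare against the loop's heat load, not the hp figure. A quick sketch of that comparison in Python, using hypothetical wattages (a ~560 W GPU plus ~100 W of CPU heat, as mentioned earlier in the thread):

```python
# Back-of-envelope check: can a chiller hold a given loop heat load?
# Wattages below are hypothetical examples; the 3010 and 4020 BTU/hr
# figures are the 1/4 hp and 1/2 hp ratings discussed above.

W_PER_BTU_HR = 0.29307107  # 1 BTU/hr = 0.29307107 W

def heat_load_btu_hr(watts: float) -> float:
    """Convert an electrical heat load in watts to BTU/hr."""
    return watts / W_PER_BTU_HR

load = heat_load_btu_hr(560 + 100)  # GPU + CPU heat dumped into the loop
for name, capacity in [("1/4 hp (3010 BTU/hr)", 3010),
                       ("1/2 hp (4020 BTU/hr)", 4020)]:
    headroom = capacity - load
    print(f"{name}: load {load:.0f} BTU/hr, headroom {headroom:.0f} BTU/hr")
```

On those numbers a ~660 W loop is roughly 2250 BTU/hr, so either unit should hold steady-state; if the temp still climbs fast, the chiller is underperforming its rating.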


----------



## motivman

yzonker said:


> Well those hp ratings seem to be very loosely tied to the actual capacity. You can see that when we were comparing my 1/4 hp unit to the 1/2 hp model. The 1/2 hp model only really has about 33% more capacity (3010 vs 4020 Btu/hr).
> 
> I honestly don't see what else could be the issue unless you just got a defective unit.


I mean, the thing can chill the water down to 50F while my system is idle, but once I put any load on the PC, the water temp rises FAST


----------



## SilenMar

inedenimadam said:


> The input lag is noticeable, but im still using frame generation anyway. Single player w/controller, so I can get used to the delay. This implementation seems better than it was in plague tale. I am not getting noticeable artifacts this time at least.


The input lag and rendering latency are the same as or lower than v1.32 if v4.00 runs in DX11. 
It just loses RT, DLSS, and HDR altogether. RT and DLSS can even max out 32GB of system RAM after the VRAM is maxed out. 
This level of RT, even on Geralt's hair, is not something the current-gen 4090 can handle.


----------



## doom26464

lawson67 said:


> The same on my Zotac Extreme. VRAM scales all the way to 2000MHz without any artifacts; even though after 1950MHz I get errors with the Vulkan memory test, it still scales for that extra 50MHz


Yah this is a zotac trinity card. 

Maybe I'll just set vram at +1600 to be safe or +1700 and call it a day.


----------



## ttnuagmada

Krzych04650 said:


> Both WOLED and QD-OLED have this issue inherently, no way around it. Also it isn't just low light scenes, it is most visible in low light scenes but average scenes usually have darker parts too. People are in denial about it since OLED basically became a cult, but it is a fact and it sucks because it makes all current OLEDs unsuitable for gaming. And the worst part about it is that this issue has been known and acknowledged by manufacturers for like 5 years, and yet new iterations of WOLED keep coming out with this issue unresolved, and brand new QD-OLED tech also came out with exact same issue, so it is really inherent to the OLED tech itself and hopes for that ever getting fixed are growing thin.


None of us are in denial about it, it's just that it is very very rare and short-lived when it happens. It's also not at all inherent to OLED. My CHG70 does it a lot more than my C1 does. Meanwhile, every LCD in existence has trash contrast which is 1000000000x more distracting, especially in a dark room. 

Making claims that OLED is a cult and they're unsuitable for gaming is some ridiculous over-the-top hyperbole and it's very hard to imagine that your experience with the issue extends beyond reading complaints on the internet.


----------



## yzonker

ttnuagmada said:


> None of us are in denial about it, it's just that it is very very rare and short-lived when it happens. It's also not at all inherent to OLED. My CHG70 does it a lot more than my C1 does.
> 
> Making claims that OLED is a cult and they're unsuitable for gaming is some ridiculous over-the-top hyperbole and it's very hard to imagine that your experience with the issue extends beyond reading complaints on the internet.


I guess I am. I don't see it at all on my 55" C1. And for those old enough to remember CRTs, I was incredibly sensitive to 60hz flicker, so seems likely I would notice.


----------



## ttnuagmada

yzonker said:


> I guess I am. I don't see it at all on my 55" C1. And for those old enough to remember CRTs, I was incredibly sensitive to 60hz flicker, so seems likely I would notice.


It happens rarely enough that it's possible you've not even played a game that triggers it. The only time I ever see it is when I'm playing a game like WoW in a busy area that has a lot of FPS fluctuations. I mean, it happens some on loading screens due to FPS fluctuating, but anyone who complains about that is just looking for something to complain about. Like I said, my C1 doesn't do anything my CHG70 doesn't do worse.

It's likely very game dependent. All I know is that there are a helluva lot more of us who would never touch another LCD after owning an OLED than there are people claiming OLED is unusable for gaming.


----------



## Mad Pistol

ttnuagmada said:


> It happens rarely enough that it's possible you've not even played a game that triggers it. Only time I ever see it is when im playing a game like Wow in a busy area that has a lot of FPS fluctuations. I mean it happens some on loading screens due to FPS fluctuating, but anyone who complains about that is just looking for something to complain about. Like I said, my C1 doesn't do anything my CHG70 doesn't do worse.
> 
> It's likely very game dependent. All I know is that there are a helluva lot more of us who would never touch another LCD after owning an OLED than there are people claiming OLED is unusable for gaming.


I've tried using LCD monitors after using my LG CX as a monitor, and I just can't. They ALL look like trash by comparison. LG's C lineup of OLEDs are crazy good for entertainment purposes, both computer and console gaming. I do notice gamma shift/flashing during VRR, but it no longer bothers me. It's just a drawback of the tech, and I have accepted it. It is NOT a deal breaker for me, as the positives for image quality and pixel response outweigh it.

OLED is NOT a cult. Look at user reviews on Best Buy's website; these displays get a 4.7/5 or 4.8/5 yearly, which is unheard of for a TV. People genuinely love the product and technology.


----------



## J7SC

yzonker said:


> One thing I'm noticing is that the OLED owners don't seem to feel the need to make posts to justify their purchase. Lol.


...too busy to comment while enjoying my C1 48 OLED & VRR, especially with the DLSS3/FI/NV Reflex 



yzonker said:


> Hey @J7SC , must be nice living in that warm climate up there.  That's Fahrenheit BTW.


...now I get why you have a chiller on the 4090 - it is for warming up your system -- smart move !  

But seriously, our winter weather has been 'unusual' re. arctic outflow, to say the least. Not really a surprise with global climate change and 3x La Niñas in a row... but normally, we like to keep the regular annual snow on the local mountains (pics below from this morning), while snow on the window ledges for four days right by the Pacific (you can see an inlet in one of the pics) is highly unusual. I am about to drive/slither around in that mess. Few people here have snow tires on, and snow removal by the city has been hampered by a variety of factors, compounding the issues. The real fun begins tomorrow, with a lot more snow forecast right on one of the busiest travel days of the year. That will be followed by a warming trend and rain -- oh dear.

Anyway, nice and cozy inside - between my old TR / NVL 1,100W system and a few more RTX 4090 / Galax vbios exercises, it is all good.


----------



## lawson67

jootn2kx said:


> Interesting which tool are you using for the memory test?


The Tool is memtest_vulkan you can download it here


----------



## mirkendargen

yzonker said:


> I guess I am. I don't see it at all on my 55" C1. And for those old enough to remember CRTs, I was incredibly sensitive to 60hz flicker, so seems likely I would notice.


It happens in a very specific situation only: fluctuating refresh rate and a near-black area held steady for long enough that you have time to see it flicker. You can often force it to happen in game menus, and some menus where you can force it to happen do the same thing on LCD screens too.


----------



## Shoggoth

A couple of quick BIOS flashing questions, since it's been so long I can't actually remember when I last did it.

Is it perfectly fine to flash any of the available 4090 BIOSes onto my Gainward Phantom GS? I see the card (which I'll be picking up tomorrow) has a 500W limit and only a 111% power limit, and I thought I'd flash the Asus Strix BIOS with 600W and 133%. Primarily to get that 133% PL, not so much the wattage.

Also, do I need to boot into the BIOS I want to flash, or do I boot into the OC BIOS and simply flip the switch in order to flash over the silent BIOS?


----------



## cerealkeller

So I’m installing a water block on my 4090 MSI Trio. I have the 520W BIOS atm and so far it’s been fine, but I seem to recall people doing shunt mods having trouble with blown fuses. Can anyone tell me how much this fuse will take before it’s a risk? I think it’s a fuse; I’ll attach a pic. I’ll also add full board pics in case anyone is interested.


----------



## man from atlantis

I finally manned up and updated to the latest BIOS on my B550 board, which gets rid of the PCIe 4.0 bug. 1-2% perf increase across the board:


----------



## KingEngineRevUp

Krzych04650 said:


> Around 2960, although this is thanks to Galax 666W BIOS. All other BIOSes that I've tried would drop to around 2840-2880 in RT games with 3000 set clock, so this is an expected behavior for most cards with stock BIOS I think.
> View attachment 2590029
> 
> 
> 
> 
> Yea that is the thing, OLED has some amazing qualities, and it does fix some long-standing LED LCD issues like low contrast and glow/bleed, and it can look absolutely astounding in some demos, but then real-world usage hits and it turns out that for every one issue that it fixes it introduces three new ones. It is great as a TV for movies and shows exclusively since almost none of the flaws matter in this case and the qualities that it has are especially desirable there, but it definitely doesn't stand the test of real-world desktop usage.


He's asking for the effective clocks, you have to check HWINFO64 for that.


----------



## SilenMar

The reason to buy a 4090 is to run better graphics, so there is no reason to fall back to enjoying just the tip of the HDR iceberg without seeing better pictures.

The HDR in the updated Witcher 3 can look very good. It can be scaled all the way up to 10,000 nits, and the contrast can be maxed out.

It's a matter of time before the devs enable the slider option, which is not currently available in the game menu but is configurable in dx12user.settings in %USERPROFILE%\Documents\The Witcher 3

Under the [Visuals] section, HdrPaperWhite is the max mid tone and MaxTVBrightness is the max highlight luminance.
In a daylight scene, the mids alone can easily be over 1,000 nits, not even including the highlights. Change these two options to match the parameters of a true HDR monitor and the game will look at least 3x more impactful and more realistic than the original dim ~200-nit SDR look. It doesn't even look like the same game on different monitors.

HDR 1400
============

[Visuals]
HdrPaperWhite=1200
MaxTVBrightness=1600

============

HDR 1000
============

[Visuals]
HdrPaperWhite=1000
MaxTVBrightness=1100

============


The game's contrast can reach the same level as real-scene HDR benchmark footage. There are a lot of better images out there;
the images below are just not something a sub-HDR1400 monitor can showcase properly.
Witcher 3 Scene 1
Witcher 3 Scene 2
HDR Reference
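For the adventurous, the two-line edit above can be scripted rather than done by hand. A minimal sketch, assuming the file uses plain Key=Value lines as shown; the function only transforms text, so back up dx12user.settings before writing anything back:

```python
# Helper to retune the two Witcher 3 HDR values discussed above by
# rewriting the [Visuals] keys in dx12user.settings text. Assumes the
# simple "Key=Value" format shown in the post; pure text-in, text-out.
import re

def set_hdr_values(config_text: str, paper_white: int, max_tv: int) -> str:
    """Return config text with HdrPaperWhite/MaxTVBrightness replaced."""
    config_text = re.sub(r"(?m)^HdrPaperWhite=\d+$",
                         f"HdrPaperWhite={paper_white}", config_text)
    config_text = re.sub(r"(?m)^MaxTVBrightness=\d+$",
                         f"MaxTVBrightness={max_tv}", config_text)
    return config_text

sample = "[Visuals]\nHdrPaperWhite=1000\nMaxTVBrightness=1100\n"
print(set_hdr_values(sample, 1200, 1600))
```

Read the real file with `open(...).read()`, pass it through, and write the result back once you're happy with the diff.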


----------



## mirkendargen

SilenMar said:


> The reason to buy 4090 is to run better graphics. Then there is no reason to fall back to just enjoy an iceberg of HDR without seeing better pictures.
> 
> The HDR in the updated Witcher 3 can look very good. It can be scaled all the way up to 10,000nits. The contrast can be maxed out.
> 
> It's a matter of time for the devs to enable the slider option which is not currently available in the game menu but configurable in the dx12user.settings in %USERPROFILE%\Documents\The Witcher 3
> 
> Under the [Visual] section, HDRPaperWhite is the max mid tone. MAXTVBrightness is the max highlight luminance.
> In a daylight scene, the mid alone can be easily over 1,000nits not even including the highlight. Change these two options to match the parameters of a true HDR monitor then the game will look at least 3x more impactful and more realistic than the original 200nits dim SDR look. It doesn't even look like the same game on different monitors.
> 
> HDR 1400
> ============
> 
> [Visuals]
> HdrPaperWhite=1200
> MaxTVBrightness=1600
> 
> ============
> 
> HDR 1000
> ============
> 
> [Visuals]
> HdrPaperWhite=1000
> MaxTVBrightness=1100
> 
> ============
> 
> 
> The game contrast can reach the same level of real scene HDR benchmark footage. There are a lot of better images out there.
> Just images below are not something a sub-HDR 1,400 monitor can showcase properly.
> Witcher 3 Scene 1
> Witcher 3 Scene 2
> HDR Reference


That isn't at all what the paperwhite setting is meant to be in HDR... You want that to be your desired brightness for SDR pure white. 1200 nits for that is madness. The UI will be at peak brightness all the time and ruin any hope of contrast.

And incidentally, HDR settings are a freaking disaster of differing names/options/meanings of values. Windows *really* needs to just use the HDR calibration app settings as an HGIG profile, and games *really* need to just start using that instead of having their own random variety of settings that no one knows the meaning of.


----------



## Brads3cents

lg still does gaming better than samsung

the holy grail will be microlens with deuterium panel and i mean real evo panel which none of these 42" c2s have

LG claims this can reach 2,000 nits with color vibrance better than qd oled and without all the downsides to samsung panels
This will also be significantly better for burn in

ideally we would reach 4k 240hz but i would easily take 165, 144, or even 138 if it can check off those other 2 requirements. with heatsink its even better

I havent found a reason to upgrade my 4k 144hz ips that i bought in 2018... with 163ppi

these new technologies would finally push me over the edge

it's speculated that microlens wont be ready in time for 2023
but if that's the case, 2024 panels will take a big fat dump on all of the panels out today
LG is getting *Phosphorescent Blue* which will be a game changer and ready for 2024 panels. Add in a glossy finish and we have the end game monitor with a potential 8k ultrawide as well

for the past 10 years the blue subpixel for all oled displays (including qd oled) has been by far the weakest link. this is soon to change and all lcd's will look like trash in comparison (save for micro led)
it will completely change how bright Oleds can get and also eliminate the burn in factor for good


----------



## SilenMar

mirkendargen said:


> That isn't at all what the paperwhite setting is meant to be in HDR... You want that to be your desired brightness for SDR pure white. 1200nit for that is madness. The UI will be peak brightness all the time and ruin any hope for contrast.
> 
> And incidentally, HDR settings are a freaking disaster of differing names/options/results of values. Windows *really* needs to just use the HDR calibration app settings as an HGIG profile, and games *really *need to just start using that and not having their own random variety of settings that no one knows what mean.


I already said that an accurate HDR monitor with a fully tracked EOTF doesn't need HDR calibration, since that is just done at the software level based on readings from the firmware. Only an inaccurate monitor with an altered EOTF needs to calibrate with that tool, since it can be clipping; you then need the tool to re-tone HDR from a claimed HDR 1000 down to something more like an actual HDR 400. 

"Paperwhite" is different in each game. In some it just means UI brightness. 

In Witcher 3 it works as the max mid tone, not the average mid tone or the APL. You can see the contrast increase without lifting the lowlight.


----------



## ttnuagmada

Mad Pistol said:


> I do notice gamma shift/flashing during VRR, but it no longer bothers me. It's just a drawback of the tech, and I have accepted it. It is NOT a deal breaker for me, as the positives for image quality and pixel response outweigh it.


Exactly. What's the alternative? Super washed out, uneven blacks with backlight bleed and/or blooming/haloing etc, on top of that a lot of LCD's still flicker with VRR anyway!


----------



## mirkendargen

SilenMar said:


> I already said an accurate HDR monitor with full tracked EOTF doesn't need HDR calibration since it is just done on the software level based on the readings from firmware. Only inaccurate monitor with altered EOTF needs to calibrate with that tool since it can be clipping. So you need to use that tool to re-tone HDR from claimed HDR 1000 to the actual HDR 400 alike.


So you say I'm wrong....



SilenMar said:


> "Paperwhite" is different in each game. In some it just means UI brightness.


Then say I'm right. Got it.


----------



## SilenMar

mirkendargen said:


> So you say I'm wrong....


I stand by what I say. There is no reason to have a Dolby Vision monitor with an altered EOTF.

That Windows HDR Calibration step just reconfirms the max/min luminance and max frame-average luminance from the firmware; there is nothing special about it. Say the firmware reports a max peak luminance of 1,800 nits, a max frame-average luminance of 1,600 nits, and a min luminance of 0.002 nits. If the EOTF is tracked accurately, then the test patterns will merge exactly at those points: 1,800 nits high, 1,600 nits average, 0.002 nits low.

So the monitor looks exactly the same after calibration. However, I cannot say the same for monitors with an altered EOTF, since they are not accurate in the first place. When actual calibration is needed, an HDR monitor can be hardware calibrated with a probe, and there aren't many monitors where the HDR/EOTF tracking can be calibrated.
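For reference, the EOTF being "tracked" in HDR10 is the SMPTE ST 2084 (PQ) curve. A minimal sketch of the inverse EOTF (absolute luminance in nits to a 0-1 signal level), with constants taken from the ST 2084 definition:

```python
# SMPTE ST 2084 (PQ) inverse EOTF: absolute luminance -> encoded signal.
# Constants come from the ST 2084 specification.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_inverse_eotf(nits: float) -> float:
    """Encode absolute luminance (0..10000 nits) as a PQ signal level."""
    y = max(nits, 0.0) / 10000.0  # normalize to the 10,000 nit PQ ceiling
    yp = y ** M1
    return ((C1 + C2 * yp) / (1 + C3 * yp)) ** M2

# The luminance points mentioned above, mapped onto the PQ curve:
for nits in (0.002, 100, 1000, 1800, 10000):
    print(f"{nits:>8} nits -> PQ {pq_inverse_eotf(nits):.4f}")
```

This is why "tracking the EOTF" is well-defined: each luminance has exactly one correct signal level, so an accurate panel needs no per-unit retuning.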


----------



## motivman

yzonker said:


> Well those hp ratings seem to be very loosely tied to the actual capacity. You can see that when we were comparing my 1/4 hp unit to the 1/2 hp model. The 1/2 hp model only really has about 33% more capacity (3010 vs 4020 Btu/hr).
> 
> I honestly don't see what else could be the issue unless you just got a defective unit.


I said F it and bought this bad boy, lol. I know it will be super loud, so running this from my garage to my office, LOL. It better hold temps at 5C all year round... haha.









Amazon.com: Active Aqua AACH100HP Hydroponic Water Chiller Cooling System, 1 HP, Rated per hour:10,050 BTU, User-Friendly : Everything Else


Buy Active Aqua AACH100HP Hydroponic Water Chiller Cooling System, 1 HP, Rated per hour:10,050 BTU, User-Friendly: Everything Else - Amazon.com ✓ FREE DELIVERY possible on eligible purchases



www.amazon.com


----------



## yzonker

motivman said:


> I said F it and bought this bad boy, lol. I know it will be super loud, so running this from my garage to my office, LOL. It better hold temps at 5C all year round... haha.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Amazon.com: Active Aqua AACH100HP Hydroponic Water Chiller Cooling System, 1 HP, Rated per hour:10,050 BTU, User-Friendly : Everything Else
> 
> 
> Buy Active Aqua AACH100HP Hydroponic Water Chiller Cooling System, 1 HP, Rated per hour:10,050 BTU, User-Friendly: Everything Else - Amazon.com ✓ FREE DELIVERY possible on eligible purchases
> 
> 
> 
> www.amazon.com


Crap, it's on sale. Sweeet....


----------



## motivman

yzonker said:


> Crap, it's on sale. Sweeet....


Yessir. I had a bunch of amazon gift cards, so once I saw it go on sale, instant BUY.


----------



## StreaMRoLLeR

Glottis said:


> Let's see.
> 
> VRR flicker
> ABL auto brightness limiter
> APL average picture level dimming
> Black/shadow crush
> Burn-in
> Near black chrominance overshoot
> Have to use pixel shifting (making desktop and PC usage look like crap)
> Have to use logo dimming (making game UI look like crap)
> Pink tint lottery
> Vertical banding lottery
> Have to disable desktop icons, hide taskbar, resort to other "babying" techniques.
> Did I miss anything?
> 
> OLED sure is amazing. I probably would have burned 2 OLEDs by now considering how much I played Diablo II Resurrected on my TV


LOL 

VRR flicker = not present on my G1

ABL = not present on the G2 or A95K (will be much better when LG makes a heatsink for the C series)

Black crush = fixed in Gen 5 A9 ('22 models)

Burn-in = tolerable after the introduction of the deuterium-based WBE panel (30,000hr lifetime)

Pixel shifting = sometimes annoying

Logo dimming = never experienced

Pink tint lottery = fixed in QD-OLED

Vertical banding lottery = fixed in QD-OLED

Have to disable icons aka babysitting methods? = You are correct


----------



## J7SC

ttnuagmada said:


> Exactly. What's the alternative? Super washed out, uneven blacks with backlight bleed and/or blooming/haloing etc, on top of that a lot of LCD's still flicker with VRR anyway!


Good point. My home-office has multiple monitors in addition to the C1 OLED and the two non-dev-server systems are all interconnected for multi-monitor work (AMD 6900XT to Philips VA workstation panel, C1 OLED and IPS HDR; RTX 4090 to C1 OLED, the Philips VA and IPS HDR). They all have their quirks but the OLED is by far the most preferable, whether using AMD or NVidia for VRR. 

Somewhat related: watching the same highly-produced YT vid (i.e. by >7ensation) produces slightly different results on the same OLED when using the TV's native LG YT connection versus either GPU, including on gamma etc... so in addition to everything else such as GPU driver settings, it can come down to decompression algos and such.

FYI, there are some C1s (certain serial number sequence of panels assembled in Mexico) which actually have the EVO panel and can be switched, though LG frowns on that re. potential warranty work...


----------



## galimberti

sym30l1c said:


> 4090 FE here: I tried running the Afterburner scanner and got this:
> 
> 13:09:56 Connected to MSI Afterburner control interface v2.3
> 13:10:17 GPU1 : VEN_10DE&DEV_2684&SUBSYS_165B10DE&REV_A1&BUS_9&DEV_0&FN_0
> 13:10:17 Start scanning, please wait a few minutes
> 13:48:15 Scan succeeded, average core overclock is 165MHz, memory overclock is 200MHz
> 13:48:15 Dominant limiter
> 13:48:15 Voltage
> 13:48:15 Results are considered unstable
> 13:48:16 Overclocked curve exported to MSI Afterburner
> 
> Is this reliable? I have already tested the card in several games with +220 core and +1000 memory without any issues.
> I've always used Precision X1, so I'm not sure whether to trust Afterburner or whether there is a problem with the card.


Got the exact same results from OC Scanner, and it was indeed unstable. I could run +210 core in Cyberpunk, but even +135 would fail the Port Royal loop. Also, my stock boost clock is pretty low, 2655MHz, so even with my +120 OC I only get the average stock boost (2775MHz) 😢
At least I can run +1500 on the VRAM.


----------



## Krzych04650

KingEngineRevUp said:


> He's asking for the effective clocks, you have to check HWINFO64 for that.


???... I am talking about effective clocks, and the OSD also shows the effective clock next to the set one...


----------



## StreaMRoLLeR

This is the third time 4090 topic falls into oled topic in 2 weeks


----------



## ttnuagmada

J7SC said:


> Good point. My home-office has multiple monitors in addition to the C1 OLED and the two non-dev-server systems are all interconnected for multi-monitor work (AMD 6900XT to Philips VA workstation panel, C1 OLED and IPS HDR; RTX 4090 to C1 OLED, the Philips VA and IPS HDR). They all have their quirks but the OLED is by far the most preferable, whether using AMD or NVidia for VRR.
> 
> Somewhat related, watching the same highly-produced YT vid ( i.e. by >7ensation) produces slightly different results on the same OLED when using the native LG YT network connection compared to either GPUs, including on gamma etc...so in addition to everything else such as GPU driver settings, it can be decompression algos and such.
> 
> FYI, there are some C1s (certain serial number sequence of panels assembled in Mexico) which actually have the EVO panel and can be switched, though LG frowns on that re. potential warranty work...


I do indeed know about the C1/G1 trick

Vincent Teoh subtly shamed me anonymously in one of his videos for telling the internet about it.


----------



## SilenMar

ttnuagmada said:


> Exactly. What's the alternative? Super washed out, uneven blacks with backlight bleed and/or blooming/haloing etc, on top of that a lot of LCD's still flicker with VRR anyway!


There are a lot of alternatives for showcasing better HDR images, not just 100-nit SDR plus a few dots of highlights; that is only the minimal part of an HDR image. 
Most LCDs you see are average ones. Most of the time a good FALD LCD can even show a deeper black scene than OLED with a little tweak of the EOTF on the lowlight. The thing that stops you from seeing better images is just the cost, which will be driven down eventually.
On the other hand, OLED can be fast but is very hard to make brighter, as it is organic, and the color is limited. Anyone with a bit of sense of how semiconductors work will know that FALD -> microLED is the future, instead of just mixing doped polymers or stacking a layer of QD, which was done years ago on LCD.


----------



## J7SC

ttnuagmada said:


> I do indeed know about the C1/G1 trick
> 
> Vincent Teoh subtly shamed me anonymously in one of his videos for telling the internet about it.


...Oddly enough, I learned about it in one of Vincent Teoh's vids, so if he wanted to keep that quiet... 🤪


----------



## ttnuagmada

SilenMar said:


> There are a lot alternative to showcase better HDR images, not just 100nits SDR plus a few dots of highlights. It is just the minimal part of HDR images.
> Most LCD you see are the average ones. Most of the time a good FALD LCD can even show deeper black scene than OLED with a little tweak of EOTF on the lowlight. The thing that stops you to see better images is just the cost which will be driven down eventually.
> On the other hand OLED can be fast but very hard to get brighter as it is organic. The color is limited. Anyone with bit sense of how semiconductor works will know the FALD--->microLED is the future instead of just mixing doped polymers or stacking a layer of QD that is done years ago on LCD.


If microLED ever gets here. Plus it might have its own set of drawbacks that no one knows about yet.


----------



## ttnuagmada

J7SC said:


> ...Oddly enough, I learned about it in one of Vincent Teoh's vids, so if he wanted to keep that quiet... 🤪


That whole thing was my 15 minutes of internet fame. I was repping OCN and went and showed them all how to overclock a TV lol.


----------



## SilenMar

ttnuagmada said:


> If microLED ever gets here. Plus it might have it's own set of drawbacks that no one knows about yet.


Semiconductors are all about getting smaller. 
The potential of microLED is obvious, while OLED is already at the pixel level with all the organic problems that were unsolved at the very beginning. The only way to solve them, the hard way, is to lose the organic characteristics.


----------



## ttnuagmada

SilenMar said:


> Semiconductors are all about getting smaller.
> The potential of microLED is obvious, while OLED is already at the pixel level with all the organic problems that were unsolved at the very beginning. The only way to solve them, the hard way, is to lose the organic characteristics.


Sure, but OLED continues to make lifetime/brightness improvements incrementally. If we're still 5-10 years away from a viable way to mass manufacture microLED, OLED may reach the point of "good enough". Plus you have Samsung working on nano-rod tech as well.


----------



## doom26464

Was there an HDR option in the settings for witcher 3 next gen? I don't remember seeing it anywhere?


----------



## yzonker

doom26464 said:


> Was there an HDR option in the settings for witcher 3 next gen? I don't remember seeing it anywhere?


I think it's just on by default. I don't get the AutoHDR pop-up when it loads, suggesting that it's on.









How To Enable HDR For The Witcher 3 Next-Gen Update On PC


The Witcher 3 Next-Gen update adds plenty of ray-tracing goodness and visual polish to the game, but what about HDR on PC?




twistedvoxel.com


----------



## Nd4spdvn

ttnuagmada said:


> Plus it might have its own set of drawbacks that no one knows about yet.


True. One known drawback is that it is sample-and-hold, so it will need a high-refresh backplane on a steady development curve in order to provide crisp, high-res motion. Otherwise it may as well be an "OLED on steroids," though I expect OLED technology to keep improving over the next 10 years and still be with us. That will make things interesting, as it may very well be better at that point vs a still non-mature microLED. We shall see.


----------



## SilenMar

ttnuagmada said:


> Sure, but OLED continues to make lifetime/brightness improvements incrementally. If we're still 5-10 years away from a viable way to mass manufacture microLED, OLED may reach the point of "good enough". Plus you have Samsung working on nano-rod tech as well.


OLED is always dimmer than LCD. The brightness cannot be pushed easily, or it doesn't last. LG has used up all the combinations of polymers; it tried to solve the longevity issue like simple chemistry but failed, and that is not a good solution anyway, as the base materials need to be stable in the first place. Then it came up with WOLED, with a structural change to the subpixels, which is more of a semiconductor solution. Then there is Samsung QD-OLED. There still need to be more structural changes like these on the subpixels to reach the level of LCD.

There are miniLED, dual-layer, and microLED approaches to solve the contrast problem. With high contrast and a high color gamut, LCD-like panels always deliver better HDR images. Response-wise, there are LCDs with 1.5x the refresh rate at the same motion blur as OLED but lower input lag, which benefits more from G-Sync. 

Furthermore, part of Samsung's effort is super-charged by the Taiwanese, and Taiwan itself has AUO, which is sandbagging every 3 years with a high-end LCD display nobody else touches while working on microLED with Apple.




doom26464 said:


> Was there an HDR option in the settings for witcher 3 next gen? I don't remember seeing it anywhere?


There isn't an option in the game menu yet, but you can adjust it in the configuration file.


----------



## dr/owned

yzonker said:


> The only way the 4090 Ti is going to be interesting is if Nvidia either finds significantly faster VRAM (beyond 24Gbps) and increases the voltage limit. I suspect neither will happen. Otherwise it'll just be a marginal jump in performance. Maybe 10%.


I didn't find increased voltage really helped very much. Even like 1.175V wasn't getting me over 3100. My card is a bit of a dog and only wants to do 3050 at best.


----------



## dr/owned

motivman said:


> I said F it and bought this bad boy, lol. I know it will be super loud, so running this from my garage to my office, LOL. It better hold temps at 5C all year round... haha.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Amazon.com: Active Aqua AACH100HP Hydroponic Water Chiller Cooling System, 1 HP, Rated per hour:10,050 BTU, User-Friendly : Everything Else
> 
> 
> Buy Active Aqua AACH100HP Hydroponic Water Chiller Cooling System, 1 HP, Rated per hour:10,050 BTU, User-Friendly: Everything Else - Amazon.com ✓ FREE DELIVERY possible on eligible purchases
> 
> 
> 
> www.amazon.com


If you want a pro tip: build yourself a 5-gallon bucket reservoir and frame it in some polyiso foam board; all Home Depot stuff. You need volume to act as a thermal battery so the chiller isn't cycling constantly.
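To put a number on the "thermal battery" idea, here's a quick back-of-envelope sketch (the 600 W load and 5 C allowed rise are just illustrative assumptions, not measurements from my loop):

```python
# Back-of-envelope: how long a reservoir soaks up heat before the
# chiller has to cycle (illustrative numbers; load and allowed rise are assumptions).
WATER_HEAT_CAPACITY = 4186.0   # J/(kg*K) for liquid water
LITERS_PER_GALLON = 3.785

def buffer_minutes(gallons: float, load_watts: float, allowed_rise_c: float) -> float:
    mass_kg = gallons * LITERS_PER_GALLON  # ~1 kg per liter of water
    joules = mass_kg * WATER_HEAT_CAPACITY * allowed_rise_c
    return joules / load_watts / 60.0

# 5 gallons soaking a 600 W GPU loop, allowing a 5 C rise -> ~11 minutes
# between compressor cycles instead of near-constant starts.
print(round(buffer_minutes(5, 600, 5)))
```

Double the volume and you double the time between compressor starts, which is the whole point of the big insulated bucket.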










Got a little too cold when I was pushing 1.7V:


----------



## Super Panda

Hi, 
Does anyone have a tutorial on how exactly to flash a BIOS on an RTX 4090? (I have an RTX 4090 Phantom.)


----------



## bmagnien

Super Panda said:


> Hi,
> Does anyone have a tutorial on how exactly to flash a BIOS on an RTX 4090? (I have an RTX 4090 Phantom.)





How-To Flash RTX Video Card BIOS To A Different Series
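The guide linked above walks through it in detail; the core of the process on Windows is a handful of nvflash commands from an elevated prompt. A rough sketch from memory (the filenames are illustrative and the exact flag spellings may vary by nvflash version, so defer to the linked guide before flashing anything):

```shell
# Sketch of a typical RTX vBIOS flash with TechPowerUp's nvflash (run as admin).
# Commands are commented out on purpose: flashing the wrong ROM can brick a card.

# 1. Back up the current vBIOS before anything else:
#    nvflash64 --save original_backup.rom

# 2. Disable the EEPROM write protection:
#    nvflash64 --protectoff

# 3. Flash the new image; -6 overrides the PCI subsystem ID check,
#    which is needed when cross-flashing another vendor's BIOS:
#    nvflash64 -6 newbios.rom

# 4. Reboot, then verify clocks and power limit in GPU-Z.
```

Keep the backup ROM somewhere safe; it is the only easy way back if the new BIOS misbehaves.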


----------



## motivman

dr/owned said:


> If you want a pro tip: build yourself a 5 gallon bucket reservoir and frame it into some polyiso foam board. All home depot stuff. You need volume to act as a thermal battery so the chiller isn't cycling constantly.
> 
> View attachment 2590119
> 
> 
> Got a little too cold when I was pushing 1.7V:
> 
> View attachment 2590120


what chiller are you using?


----------



## Sheyster

schoolofmonkey said:


> See I'm glad there's people like yourself that understand, on another forum I was basically told I was crazy and my C2 was "faulty", until I got them to run VRR Flicker test.
> 
> Don't get me wrong, my C2 42" is worse the the lounge room's C2 55" my wife uses for gaming (3090), but we have different brightness settings etc, both have VRR on.
> 
> The OLED's look fantastic, but it's just not truly suited to PC gaming with VRR, the 32" Odyssey G7 IPS has been more or less perfect other than the stupid scan line issue most of the newer Samsung screens suffer from, at least I can use G-Sync and not feel like I'm in a disco, and IPS still looks fine.


Did you try simply disabling VRR (G-Sync) with the C2 42" and frame capping at 117 FPS? A 4090 will hold that FPS in almost any game, so VRR isn't a factor except for a few games.


----------



## dr/owned

motivman said:


> what chiller are you using?


Nowadays it's retired. At the time it was an HC-500, just to mess around with max overclocks. Chillers are expensive to run and don't really net that much gain, since silicon conductivity doesn't really change until you go deep sub-zero.
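One more caveat on chilled loops: the practical floor is usually the room's dew point, not the chiller's capacity, since coolant below the dew point condenses on blocks and fittings. A rough estimate using the standard Magnus approximation (the 25 C / 50% RH room is an assumed example):

```python
import math

def dew_point_c(temp_c: float, rel_humidity_pct: float) -> float:
    """Approximate dew point via the Magnus formula."""
    a, b = 17.62, 243.12  # Magnus coefficients for water, roughly -45..60 C
    gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# Typical room: 25 C at 50% RH -> dew point around 13.9 C.
# Chilling coolant below that invites condensation on the hardware.
print(round(dew_point_c(25.0, 50.0), 1))
```

So unless you insulate everything, mid-teens coolant is about as low as most rooms allow.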


----------



## Nico67

mouacyk said:


> That's hardcore. Why not liquid metal it also, for additional cooling and clocks?


It may help a bit, but LM is generally better for very hot, tiny die contact areas. At 21-40C it may only make a degree of difference, and it requires both surfaces to be perfectly flat.



motivman said:


> So I got waterchiller, but seems like its only good for short benchmark runs. I set it to 55F and ran Port Royal Stress test, after 20 runs, Water temperature got to 89F, lol. This is about 8C delta T from my ambient temps. I get the same performance from my Mora 360 LT with fans running at 1000 rpm. Maybe I need one of those 1/2HP waterchillers??
> 
> i got one of those chinese made 1/3hp chiller... it cant hold temps at all, haha.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Amazon.com : NA Aquarium Chiller 79Gal 1/3 HP Water Chiller for Hydroponics System Low Noise for Fish Tank Axolotl Coral Reef Tank 300L : Pet Supplies
> 
> 
> Amazon.com : NA Aquarium Chiller 79Gal 1/3 HP Water Chiller for Hydroponics System Low Noise for Fish Tank Axolotl Coral Reef Tank 300L : Pet Supplies
> 
> 
> 
> www.amazon.com


It could be a couple of things: firstly how good the heat exchanger is, and secondly the coolant capacity. The unit may work fine, but just not be able to hold so low a temp.

The 1 HP unit should easily cover anything in the future, and on a positive note it should only need to run in shorter bursts on pulldown.
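For a sanity check on nameplate capacity versus loop load (the BTU rating is from the listing above; the 500-600 W comparison load is an assumption from earlier in the thread):

```python
# Rough check of chiller capacity vs. loop heat load (nameplate ratings assumed).
BTU_PER_HR_TO_WATTS = 0.29307

def chiller_watts(btu_per_hr: float) -> float:
    """Convert a chiller's BTU/hr rating to watts of heat removal."""
    return btu_per_hr * BTU_PER_HR_TO_WATTS

# The 1 HP unit is rated 10,050 BTU/hr -> ~2945 W of heat removal,
# several times a 500-600 W GPU loop. A unit that can't hold temp is
# more likely limited by its heat exchanger or condenser airflow than
# by raw rated capacity.
print(round(chiller_watts(10_050)))
```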



yzonker said:


> That's completely incorrect regarding the chiller. As I've said before, my 1/4 hp chiller can hold mid teens at gaming loads in the 500-600w range. The one @motivman bought must either be defective or it's spec'ed higher than it actually performs.


There can be other factors, like the ambient air temp where the chiller sits, as it loses effectiveness as that temp goes up. Also how good the condenser (radiator/fan) on the chiller is.


----------



## Nico67

dr/owned said:


> If you want a pro tip: build yourself a 5 gallon bucket reservoir and frame it into some polyiso foam board. All home depot stuff. You need volume to act as a thermal battery so the chiller isn't cycling constantly.
> 
> View attachment 2590119
> 
> 
> Got a little too cold when I was pushing 1.7V:
> 
> View attachment 2590120


nice, here's my version 


















modified 10 gallon insulated drink dispenser 

I was getting some bio issues even with silver coils and Mayhems additive, and I thought they might be causing the nickel breakdown issues I was also getting. So now I have an aquarium UV light mounted in the lid; it seems to be helping.


----------



## motivman

Nico67 said:


> nice, here's my version
> 
> View attachment 2590131
> 
> View attachment 2590132
> 
> 
> modified 10 gallon insulated drink dispenser
> 
> I was getting some bio issues even with silver coils and mayhems additive, and I though they might be causing the nickel break down issues I was also getting. So now I have an aquarium UV light mounted in the lid, seems to be helping


Any tips on how you added the fittings or G1/4 ports? Would I just need two if I am making a giant 5-gallon reservoir?


----------



## dr/owned

Nico67 said:


> I was getting some bio issues even with silver coils and mayhems additive, and I though they might be causing the nickel break down issues I was also getting. So now I have an aquarium UV light mounted in the lid, seems to be helping


Open air loops can get real painful. I'm pretty sure I discovered some new lifeforms when I had 20 gallons going on. I think the key is you need a violent amount of water flow to keep the whole shebang mixed so there are no deadspots. My current loop is maybe 3 gallons and seems to be stable with Mayhem inhibitors and biocide in distilled water.


----------



## motivman

dr/owned said:


> Open air loops can get real painful. I'm pretty sure I discovered some new lifeforms when I had 20 gallons going on. I think the key is you need a violent amount of water flow to keep the whole shebang mixed so there are no deadspots. My current loop is maybe 3 gallons and seems to be stable with Mayhem inhibitors and biocide in distilled water.


I really want to try this. But I have no idea how to add inlet/outlet port to the water jug. Any guides or tips on how to get this done?


----------



## Nico67

25mm holesaw, cutting very carefully down to the inner lining (I think I cut through the outer lining and slightly into the foam, then turned it by hand to cut up to the inner lining, but it was many years ago), then open up the inner lining hole to fit an XSPC G1/4 bulkhead fitting. I then used a series of other fittings and O-rings to bring it flush with the outside edge. It works pretty well, but was originally done mainly for strength, as an earlier iteration had DDC pumps directly connected on the side of the res.

Hole placements were spaced around the container to try to spread flow and keep cool water at the bottom and warmer return water at the top.

Pretty sure these were the fittings:


































They probably all used the same size O-rings around the outer Alphacool bulkhead. The XSPC inner bulkhead makes a watertight seal against the inner lining too, with dual O-rings.
There may be better bits you can use these days; you just need to get creative.


----------



## yzonker

Nico67 said:


> 25mm holesaw very carefully cut up to to inner lining (I think I cut through the outer lining and slighty into the foam and then turned by hand to cut up to inner lining, but it was many years ago), then open up inner lining hole to fit XSPC G1/4 bulkhead fitting. I then used a series of other fittings and o rings to bring it flush with the outside edge. Works pretty well, but was mainly done orginally for strength as the earlier iteration had DCC pumps directly connected on the side of the res.
> 
> Hole placements were spaced around the container to try to spread flow and keep cool water at the bottom and warmer return water at the top.


When I built a smaller version of this for my chiller, I bought some 1/4" NPT nuts at the hardware store to secure the fittings to the reservoir rather than depending on them threading in to the plastic and/or gluing them in. 

One issue though that hasn't been brought up. If the large res is not air tight, then water will either drain to or from your machine when you remove a filler plug on the res in the machine. Direction of water flow obviously depends on the relative heights. 

This is why I don't use one anymore. You can disconnect the chiller lines first before removing a plug, but I didn't like this given I was running the machine 24/7. Also if the large res is below your machine, water can migrate to the big res anyway and cause your machine to be low on water.


----------



## Nico67

yzonker said:


> One issue though that hasn't been brought up. If the large res is not air tight, then water will either drain to or from your machine when you remove a filler plug on the res in the machine. Direction of water flow obviously depends on the relative heights.


Yeah, water management can be problematic. I use the drink dispenser nozzle to drain the res, but I also plug-stop the XSPC fittings from the inside of the tank when changing components, or to keep the pumps primed when refilling. You can also attach a hose to the return fitting to expel water directly out of the res, to flush loops with clean water in the res. I've had to learn a few tricks over the years, even using rubber gloves when adding or removing plug stops and hoses to limit water contamination.


----------



## tubs2x4

Is the 4090 overkill for 1440p, from those who have been using the card for a while?


----------



## J7SC

Another special cooling option would be a phase cooler. I used those before, when I did not have LN2 handy for sub-zero. For direct mount, my phase unit was quite capable on 4C/8T and even some 6C/12T HEDTs, as well as older OC'd GPU cores, down to -50C... while it has the cooling power to bring temps down for higher-wattage CPUs and GPUs, it does have trouble with modern chips that have aggressive boost and voltage behavior: its transient response is too slow. While there are cascading phase coolers out there which address that, they use way too much power and are just too big, bulky and loud for anything but the garage or the yard.

HOWEVER, back in the day, I rigged up a flat copper piece adapter for the phase cooler head that evenly covered the top of a 360x60 rad that in turn was part of a multi-rad system (that's important)...haven't used it for a few years but that would probably still work w/o having to disconnect the regular w-cooling loop, at least as long as you have 5+ L of cooling fluids in your loop and the system is running (you don't want to freeze any spots / liquids with that approach). In a nutshell, the major advantage is that it just utilizes your existing w-cooling loop that has more than one rad (no fittings to change), and you can turn it on/off any time as long as the computer is running.









In other news...
4090s and even 7900XTXs still seem mostly sold out in my weekly local checks, while the 4080s and 7900XTs are plentiful...


----------



## 8472

tubs2x4 said:


> 4090 overkill for 1440p for those who been using the card for a while?


I don't game at that res, but I think that heavily depends on what games you play and the refresh rate of your display. 

If you mainly play single player titles at 60hz imo it'd be a huge waste as there are plenty of cheaper cards that can do that.


----------



## lmfodor

For those of you who have a Gigabyte 4090, either the Master or the Gaming OC: have you noticed a difference in the speed of the fans? I noticed a difference of about 5%, and the other day when I was watching a der8auer review I noticed the same thing happened when he opened GPU-Z. What I did was put the fans at 100% in MSI AB, and I see that one spins at 3000 RPM and the second at 2600 RPM. I don't know if this is a problem, but I would like to know if the same thing happens to those with a Gigabyte. I was surprised to see in the der8auer video that the Master model had the same behavior.


Spoiler: Der Bauer's Fan Speed

















Spoiler: My Galaxy OC at 100% speed















Thanks!


----------



## J7SC

lmfodor said:


> For those of you who have a 4090 Galaxy, either the Master or the Gaming OC, have you noticed a difference in the speed of the fans? I noticed a difference of 5%, and the other day when I was looking at a Der Bauer review I noticed that the same thing happened when opening GPUz. What I did is put the fans at 100% on MSI AB and I see that one spins at 3000 RPM and the second at 2600 RPM. I don't know if this is a problem, but I would like to know if the same thing happens to those with Gigabyte. I was surprised to see in the Der Bauer video that it had the same behavior with the Master model.
> 
> 
> Spoiler: Der Bauer's Fan Speed
> 
> 
> 
> 
> View attachment 2590164
> 
> 
> 
> 
> 
> 
> Spoiler: My Galaxy OC at 100% speed
> 
> 
> 
> 
> View attachment 2590165
> 
> 
> 
> 
> Thanks!


I think that is normal, as some of these cards have three fans (but only two entries in the monitoring software); the outer two spin at a slightly different speed compared to the center one, which also counter-rotates for sound and vibration reasons. When my Gigabyte Gaming OC still had the air cooler, it showed a similar differential.

----

I tried out the Galax vBios for gaming for the first time (I normally use the G-G-OC vbios for gaming). Even w/o any extra sliders for voltage or PL, it consistently hovered around 3030 MHz (3025 effective). It also ran a bit hotter since I used more power...max was 533 W at 'stock' PL  and came via Cyberpunk 2077 (everything maxed at 4K). FS 2020 was at about 430 W (as opposed to 390 W for G-OC) and F1 2021 settled between the two. Ambient was 25 C (unlike outside 🥶)


----------



## kx11




----------



## ArcticZero

Has anyone tried the Alphacool 4090 FE block? Managed to snag a card recently and I don't really want to get EKWB again so this is my primary option. 











Alphacool Eisblock Aurora Acryl GPX-N RTX 4090 Founders Edition with Backplate

EDIT: Screw it, ordered the damn thing.


----------



## Shoggoth

Not yet. Mine will be delivered on the 27th, which can't come around quickly enough as I can't fit the 4090 in my case as is. I have Alphacool's block on my 6800 XT Red Devil though and it does an excellent job there. Hence my choice of Alphacool again.


----------



## GAN77

ArcticZero said:


> Has anyone tried the Alphacool 4090 FE block?


They announced something new. Alphacool Core Geforce RTX 4090 Reference Design and Founders Edition.


----------



## ArcticZero

GAN77 said:


> They announced something new. Alphacool Core Geforce RTX 4090 Reference Design and Founders Edition.


Woah so soon too. I wonder how much better it is? Now itching to maybe cancel my PPCs order 

EDIT: Saw this posted by an Alphacool rep on TPU. I like the solid brass terminal and new look so I went ahead and requested an order cancel (with apologies), and ordered a Core FE block from Aquatuning.



> The Aurora coolers for the RTX 4XXX and RX 7900 series are almost identical on the technical side, i.e. in terms of cooler base, fins and water flow. There are only minor differences on the terminal side, but they are not significant. So we have made the technical turnaround for both product lines and improved them compared to the Aurora cooler for the RTX 3XXX and RX 6XXX series. The general statements do not refer to the difference between Aurora and Core, but to the cooler generation that we have made for the RTX 3XXX and RX 6XXX series.
> So the Core cooler offers no real performance advantage over the current Aurora series for the current cards. The advantages of the Core cooler are, besides the solid brass terminal, more cosmetic in terms of appearance.
> 
> In general, however, we always improve coolers in details. We have never used one technology for an entire graphics card generation, but have always improved small details in later new coolers. For example, the next step will be that we will include the Apex thermal compound with 17W/mk instead of the Subzero thermal compound with 16W/mk. This will be a flying change. With the RTX 3XXX generation, we have adjusted the cooler bottom and the water paths several times. Just compare the Reference cooler for the 3080/3090 with the cooler for the 3090TI. You can quickly see that we have already done a lot of things differently for the 3090TI, which clearly goes in the direction that we see with the current Aurora and Core coolers for the RTX 4XXX and RX 7900 generation.
> We are generally not standing still in this area.


----------



## GAN77

ArcticZero said:


> Woah so soon too


4-5 weeks.









Alphacool Core Geforce RTX 4090 Founders Edition mit Backplate


Alphacool's Core Produktserie zeichnet sich aus durch hochwertige Qualität, herausragende Performance sowie der einheitlichen und funktionellen Designsprache. Der Kupferkühler, welcher gemeinsam mit dem Abschlussterminal aus einem Stück...




www.alphacool.com


----------



## ArcticZero

GAN77 said:


> 4-5 weeks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Alphacool Core Geforce RTX 4090 Founders Edition mit Backplate
> 
> 
> Alphacool's Core Produktserie zeichnet sich aus durch hochwertige Qualität, herausragende Performance sowie der einheitlichen und funktionellen Designsprache. Der Kupferkühler, welcher gemeinsam mit dem Abschlussterminal aus einem Stück...
> 
> 
> 
> 
> www.alphacool.com


Yep one step ahead of you and went ahead and ordered it from Aquatuning while having my cancellation processed at PPCs. Cheers for the heads up 🍻


----------



## ttnuagmada

SilenMar said:


> Unless it has physical grid. There aren't many FALD LCDs have that features. Even a FALD LCD doesn't have physical grid.


Yes, even with a physical grid. I'm telling you, you're making yourself look dumb here. Within an FALD zone itself, contrast is limited to the panel's static contrast. Claiming otherwise shows you lack a very basic understanding of how an LCD panel works. The only thing a grid can do is help prevent zone bleed.




> The contrast is still higher than the ABL OLED.
> I have a QD-OLED AW3423DW for comparison. It has less contrast every single time.


You quite literally don't know what you're looking at. ABL has literally nothing to do with contrast.


----------



## GRABibus

tubs2x4 said:


> 4090 overkill for 1440p for those who been using the card for a while?


You will be bottlenecked by the CPU (GPU usage below 90%) in most games, even with a 13900K.


----------



## Gogoo

Hey dear people,
I recently purchased a Gainward 4090 (non-GS) and noticed that it is clocking itself to 2.9-3 GHz on its own. It is stock and I haven't even installed its software yet; is this normal?








These are the values from 10-15 mins of Cyberpunk. Also, I can't seem to find the frame generation option; is this a special build that you need from a special place?


----------



## jootn2kx

Gogoo said:


> Hey dear people,
> I recently purchased 4090 Gainward (non-gs) and I noticed that it is clocking itself to 2.9-3Ghz by itself. It is stock and I havent even installed its software yet, is this normal?
> View attachment 2590237
> 
> This is the values from 10-15 mins of Cyberpunk. Also I cant seem to find the frame generation option, is this a special build that you need from a special place?


Have you enabled HAGS / Resizable BAR in the BIOS?
You need those enabled to have the frame generation option.


----------



## tubs2x4

8472 said:


> I don't game at that res, but I think that heavily depends on what games you play and the refresh rate of your display.
> 
> If you mainly play single player titles at 60hz imo it'd be a huge waste as there are plenty of cheaper cards that can do that.


Well, I have a 3070 and it's good, but maxed out at 1440p it's the bare minimum. A 4090 would probably last me a long time.


----------



## StreaMRoLLeR

long2905 said:


> just give up man if Vincent or Hardware Unboxed cant convince him, what make you think you can do the job lol


He is one of the biggest 🤡 I've ever seen here. He claimed LCD can do better blacks than OLED. What is he even trying to do? 🤷‍♂️


----------



## 8472

tubs2x4 said:


> Well I have a 3070 and it’s good but 1440 max out it’s bare minimum. 4090 probably last me long time.


The 3070 is roughly a 2080ti in terms of horsepower. I noticed a 50% gain in frames in Shadow of the Tomb Raider at 4k when upgrading from a 2080ti to a 3090. A 3090 or 3090ti would give a noticeable improvement without costing $1600+.

Also, looking at the techpowerup review at 1440p, the 4090 is hit or miss, depending on the game it sometimes blows everything else out of the water and other times it has similar performance to a 3090ti.


----------



## bhav

StreaMRoLLeR said:


> He his one of the biggest 🤡 I ever seen here. He claimed LCD can do better blacks compared to OLED. What he try to do anyway ? 🤷‍♂️


Most of these people have never even used an OLED and claim OLED is too expensive while buying a 4090 for their 1080p 60 Hz screen.

If they don't have a 4K OLED, what are they even buying a 4090 for? Even if they have a 4K LCD, LOL, upgrade that trash panel first.

For me a 4090 would be nice now for my C1, but I want to trade up to a 4K 240 Hz OLED as soon as they drop; then I will need a 5090 and better outputs than the 4090 has, so it's a full skip on this gen for me.


----------



## Gogoo

jootn2kx said:


> Have you enabled HAGS / Resizable bar in the bios?
> You need these to be enabled to have frame generation option.


I do have both enabled yes. I upgraded from 3080TI so these options remained turned on.


----------



## tubs2x4

8472 said:


> The 3070 is roughly a 2080ti in terms of horsepower. I noticed a 50% gain in frames in Shadow of the Tomb Raider at 4k when upgrading from a 2080ti to a 3090. A 3090 or 3090ti would give a noticeable improvement without costing $1600+.
> 
> Also, looking at the techpowerup review at 1440p, the 4090 is hit or miss, depending on the game it sometimes blows everything else out of the water and other times it has similar performance to a 3090ti.


Yes, I was looking at the 4080 Founders Edition; you can still backorder it from Best Buy. It'd be $600 cheaper after tax, but maybe I should move up to a 4K screen instead, and then the 12700KF won't be a bottleneck.


----------



## StreaMRoLLeR

Gogoo said:


> Hey dear people,
> I recently purchased 4090 Gainward (non-gs) and I noticed that it is clocking itself to 2.9-3Ghz by itself. It is stock and I havent even installed its software yet, is this normal?
> View attachment 2590237
> 
> This is the values from 10-15 mins of Cyberpunk. Also I cant seem to find the frame generation option, is this a special build that you need from a special place?


Seems like a great card.

1. Can you upload a screenshot of your V/F curve? What are the corresponding core frequencies at 1.05 V and 1.10 V?

2. In Port Royal, do a windowed custom run with the voltage locked at 1.1 V for 3000 MHz. What's your EFFECTIVE core frequency?

3. If it's a really good chip, it should do 3090-3120 MHz easily at 1.1 V.


----------



## StreaMRoLLeR

Just made a ticket to be forwarded to the Galax OC team, asking whether the 666 W vBIOS increases chip voltage beyond 1.100 V.

Anyone here have direct contact with them?


----------



## yzonker

Gogoo said:


> I do have both enabled yes. I upgraded from 3080TI so these options remained turned on.


CDPR hasn't released the CP update with frame gen. Everyone is still waiting for it.


----------



## alasdairvfr

yzonker said:


> CDPR hasn't released the CP update with frame gen. Everyone is still waiting for it.


What I don't understand is there are some people that have had a beta version of the game that had DLSS3 working and CDPR haven't released it to everyone. While I know studios often like to wait till the kinks are worked out, I never pegged CDPR as a company that does this. Their games are usually released in a rough state with patches smoothing things out over time.


----------



## yzonker

alasdairvfr said:


> What I don't understand is there are some people that have had a beta version of the game that had DLSS3 working and CDPR haven't released it to everyone. While I know studios often like to wait till the kinks are worked out, I never pegged CDPR as a company that does this. Their games are usually released in a rough state with patches smoothing things out over time.


My guess is that their primary focus has been the Witcher 3 update lately. Once they get that sorted a little better, we'll probably see the CP update.


----------



## jootn2kx

StreaMRoLLeR said:


> Just made an ticket to be fowarded for Galax OC Team
> 
> If 666w vBios is increasing chip voltage more then 1.100mV
> 
> Anyone here have direct contact with them ?


Not that we know of; you can try contacting them, good idea. I think it's definitely doing something with the voltage going above 1.1 V, or there are some other voltage parameters increased that we can't monitor in software.


----------



## chispy

Finally my water block arrived today. Initially I went with the Bykski water block for my MSI Gaming 4090, but it arrived from China with a cracked acrylic plate :/ . I have since sent it back and got it refunded. By luck I found the Phanteks water block in stock at the US store and snatched it up rather quickly. I really wanted the Bykski block, but I could not wait any longer, hence the Phanteks will do 👌.

Will post my findings using this Phanteks water block later.


----------



## ArcticZero

chispy said:


> Finally my water block arrived today. Initially i went with the Bykski water block for my msi gaming 4090 , but it arrived from China with a cracked acrylic plate :/ . I have since send it back and got refund it , by luck i found the Phantek water block in stock at the US store and i snatch it rather quickly , i really wanted it the Bykski block but i could not wait any longer , hence Phantek will do 👌 .
> 
> Will post my findings using this Phantek water block later.
> 
> 
> View attachment 2590272
> 
> 
> View attachment 2590273


It's so beautiful. The aesthetics make me wish they did one for FE as well.


----------



## Nizzen

NVIDIA RTX 4090 Owner's Club


----------



## J7SC

chispy said:


> Finally my water block arrived today. Initially i went with the Bykski water block for my msi gaming 4090 , but it arrived from China with a cracked acrylic plate :/ . I have since send it back and got refund it , by luck i found the Phantek water block in stock at the US store and i snatch it rather quickly , i really wanted it the Bykski block but i could not wait any longer , hence Phantek will do 👌 .
> 
> Will post my findings using this Phantek water block later.
> 
> 
> View attachment 2590272
> 
> 
> View attachment 2590273


Nice ! I used to be a loyal EKWB customer, then also added Heatkiller to my list of preferred vendors (Alphacool & such still to try). Lately though, I abandoned the old favourites and prefer Phanteks and Bykski.

Apart from using Phanteks' X570 CPU blocks (x2), I also have a Phanteks block on the 3090 Strix (now in another build after being replaced by the 4090 / Bykski combo). The Phanteks 4090 block you showed looks somewhat similar to their Glacier 3090 product, at least around the center-left area (there are also some differences). The Phanteks quality is fantastic in my experience; the only fly in the ointment was/is the vertical mount, as it keeps collecting and trapping air bubbles just to the top left of the microfins, even after multiple air bubble removals over many months. I never did mount it horizontally, so I don't know for sure if the mounting orientation was the issue. FYI, the vertically-mounted Bykski blocks don't seem to have that issue.

And yes, I love Cherenkov radiation blue...


----------



## KedarWolf

If cost is not an issue, I swear by the Optimus Water Cooling blocks.

The only issue with them is that they have a huge full-cover backplate thermal pad, and it leaks a harmless, nonconductive oil that can get on the GPU PCB, or even the motherboard on horizontal mounts, and is hard to clean off.

The second issue is they only make them for a few models of video cards, but I bought my Strix 4090 around that.

As far as performance goes, though, best in the business.


----------



## ArcticZero

Optimus was my first choice given how stunning their designs look, and how amazing the Foundation block has been for my 5950x. Unfortunately my desire for a Strix led to nowhere and I ended up with a FE instead. Wish they made blocks for these too.


----------



## KedarWolf

ArcticZero said:


> Optimus was my first choice given how stunning their designs look, and how amazing the Foundation block has been for my 5950x. Unfortunately my desire for a Strix led to nowhere and I ended up with a FE instead. Wish they made blocks for these too.


They are eventually making blocks for the FE, I believe, possibly in the first quarter of next year.

Edit: You can contact their support, and get an update when they may be available.

Second edit: The black block is stunning!!


----------



## GAN77

Watercool showed and told a little.

*Modular high-performance base plate*
_To increase the cooling performance in the area of the GPU core, we have integrated a modular high-performance base plate into the heat sink. Due to the fine fins and the resulting increase in the cooling surface, the cooling performance is significantly higher compared to a monolithic design. We chose the fin structure in such a way that the flow rate is kept high in addition to the performance increase at the GPU core. All distances to heat-relevant components were kept tight, which means that the heat of the thermal problem zones such as GDDR6X Ram and voltage converter will be dissipated ideally. The PEEK spacers made of the high-performance plastic isolate the cooler optimally from the PCB and simplify the assembly. Thermal Grizzly's legendary Kryonaut thermal compound ensures the best possible heat transfer._





----------



## dboom

EK full WB ready.








I scored 11 127 in Speed Way


AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 11




www.3dmark.com












I scored 28 711 in Port Royal


AMD Ryzen 9 5900X, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 11




www.3dmark.com


----------



## kx11

Testing out 13900kf










I scored 16 161 in Time Spy Extreme


Intel Core i9-13900KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## joseph.tanner10

Just going to post my 4090 experience, not sure if it will help anyone.

Bought a 4090 TUF non-OC from my local Microcenter on launch day and purchased an EKWB Vector 2 block for it. After installing the block I would see temps rise to 51-55C stock @ 415-435 W power draw in a Speed Way loop, water at 35C, ambient at 22C. I pulled the card out twice and checked the mount; it seemed to be making good contact each time. I used KPx thermal paste each time, with an even spread method. When I OC'd the card @ +260 core and +1100 mem, temps would immediately jump to 57C, then top out at 63C @ 530-570 W in the same Speed Way bench, water at 39C. Each time after the benchmark, temps would immediately drop to whatever the water temp was and stay there.

I've since been able to obtain a 4090 FE from a Best Buy drop. At my local Microcenter I noticed a water block from Bitspower for the 4090 FE, and figured I'd give it a shot since I'd otherwise be waiting for Alphacool to ship its new Core series GPU block for the FE. I also purchased another temp sensor from Bitspower, the digital kind, and placed it before the water enters the GPU and CPU. After mounting this block I now see temps in the same Speed Way bench at stock of 42-44C @ 415-435W. Water is 29C at the first sensor; a second sensor after the CPU and GPU reads 33C. After a dirty OC just to throw some heat at it (+240 core and +1100 mem) I saw 47-50C @ 500-575W; water was 30C and 35C respectively. Each bench was 45 min. All done in a Lian Li V3000 Plus with three 480mm rads: two Heatkiller L-series and one of EK's new 44mm-thick rads. The Heatkillers have Infinity fans maxing out at 1100 RPM, the EK rad has T30s maxing out at 1000 RPM, and pump speed maxes out at 3500 RPM at 30C water.

That was a big jump in the right direction for me. I'm not saying the first temps were EK's fault, as I'm sure I messed up the mount somehow even though I checked twice. I'm more impressed that Bitspower made a really good block; I wasn't aware it would perform so well. The biggest takeaway from all this: NO MORE COIL WHINE!!!!


----------



## dr/owned

joseph.tanner10 said:


> That was a big jump for me in the right direction. I'm not saying the first temps were EK's fault as I'm sure I messed up the mount somehow even though I checked twice. I'm more impressed that bitpower made a really good block. Was unaware it would perform so well. The biggest takeaway from all this is NO MORE COIL WHINE!!!!


With my Phanteks block sucking, I can't say for sure what the problem is. It didn't have a jetplate, which probably didn't help, but I was seeing something like a +30C delta with a 500W load. My guess is that they set the standoff height targeting the metal frame around the die, which sits at 2.6mm, while the die itself sits 3.0mm off the PCB. I think this was causing the block to deform around the die in a ^ shape. With a large die like this, they probably should target 2.9mm standoffs, which is what Alphacool did, and its performance is great.


----------



## jootn2kx

joseph.tanner10 said:


> Just going to post my 4090 experience, not sure if it will help anyone.
> 
> Bought a 4090 TUF non OC from my local Microcenter on launch day. I purchased a EKWB Vector 2 block for it. After installing the block I would see temps rise to 51-55c stock @ 415-435 watt power draw in Speedway loop. water at 35c. Ambient @ 22c. I pulled the card out twice and checked the mount. Seemed to be good contact each time. I used KPx thermal paste each time and used a even spread method. When I OC'd the card @ 260 core and 1100Mem would see temps immediately jump to 57c them top out at 63c @ 530-570watts in same speedway bench. water was 39c. Each time after the benchmark temps would immediately drop to whatever water temp was and stay there.
> 
> I've since been able to attain a 4090 FE from a best buy drop. Again at my local Microcenter I noticed a water block from Bitspower for the 4090FE. Bitspower. I figured I'd give it a shot since I'd be waiting for alphacool to ship it's new core series GPU for the FE edition. I also purchased another temp sensor by Bitpower, the digital kind. I placed that before water enters the gpu and CPU. After mounting this block I now see temps in the same Speedway bench @ stock 42-44c @ 415-435 watts. Water 29c at the first temp sensor, and the other temp sensor is after the CPU and GPU and it read 33c. After a dirty OC just to throw some heat at it ( 240 core and 1100mem) I saw temps @ 47-50c @ 500-575 watts. Water was 30c and 35c respectively. Each bench was 45min. All done in a Lian Li V3000plus with 3 480mm rads. two heatkiller L series and 1 EKWB new rad 44mm thick. Heatkillers have Infinity fans set to max out @ 1100rpm and EK rad has T-30's maxing out at 1000rpm. Pump speed maxes out at 3500rpm @ 30c water.
> 
> That was a big jump for me in the right direction. I'm not saying the first temps were EK's fault as I'm sure I messed up the mount somehow even though I checked twice. I'm more impressed that bitpower made a really good block. Was unaware it would perform so well. The biggest takeaway from all this is NO MORE COIL WHINE!!!!
> View attachment 2590323


You mean no more coil whine because the waterblock dampens the noise in some way? Very interesting, but I'm also a bit confused.


----------



## joseph.tanner10

jootn2kx said:


> You mean no more coil whine because the waterblock dampens the noise in some way? Very interesting but also bit confused


Sorry, I wrote that a bit confusingly. I now have a Founders Edition instead of the TUF, and the Founders has zero coil whine.


----------



## J7SC

Yesterday, I posted three game results (FS 2020, F1 2021, CP 2077) for the Galax vbios... Below, I expand on that with the G-G-OC results at the same mild OC settings per MSI AB, the same approximate ambient, and the same OLED monitor and game settings w/everything maxed. FYI, F1 2021 is actually a worse power hog than CP 2077...

The Galax vbios definitely has some spicy secret sauce in it...


----------



## mouacyk

GAN77 said:


> Watercool showed and told a little.
> 
> *Modular high-performance base plate*
> _To increase the cooling performance in the area of the GPU core, we have integrated a modular high-performance base plate into the heat sink. Due to the fine fins and the resulting increase in the cooling surface, the cooling performance is significantly higher compared to a monolithic design. We chose the fin structure in such a way that the flow rate is kept high in addition to the performance increase at the GPU core. All distances to heat-relevant components were kept tight, which means that the heat of the thermal problem zones such as GDDR6X Ram and voltage converter will be dissipated ideally. The PEEK spacers made of the high-performance plastic isolate the cooler optimally from the PCB and simplify the assembly. Thermal Grizzly's legendary Kryonaut thermal compound ensures the best possible heat transfer._


Isn't this bad for cleaning?


----------



## wtf_apples

Hello
Trying the galax bios now, seems to have disabled one of my display ports. Is that normal? I switched it over to a different one to get my second monitor working again.
Thanks


----------



## J7SC

wtf_apples said:


> Hello
> Trying the galax bios now, seems to have disabled one of my display ports. Is that normal? I switched it over to a different one to get my second monitor working again.
> Thanks


...more or less normal, depending on the stock i/o layout and config of your card


----------



## mirkendargen

wtf_apples said:


> Hello
> Trying the galax bios now, seems to have disabled one of my display ports. Is that normal? I switched it over to a different one to get my second monitor working again.
> Thanks


Asus card? I think they're the only ones with different display configs this time around.


----------



## wtf_apples

oh ya forgot to mention that, asus tuf


----------



## bmagnien

J7SC said:


> Yesterday, I posted three game results (FS 2020, F1 2021, CP 2077) for the Galax vbios... Below, I expand that by the G-G-OC results with the same mild oc settings per MSI AB. Same approximate ambient and same OLEDh la la monitor and game settings w/everything maxed. FYI, F1 2021 is actually a worse power hog than CP 2077...
> 
> The Galax vbios definitely has some spicy secret sauce in it...
> 
> View attachment 2590376


@J7SC What are your settings for flight sim? Getting some cpu bottleneck I believe. Trying to max out visual fidelity, but fine with using dlss and frame gen. With absolutely everything maxed at 4K I’m just on the underside of what I’d consider playable. I feel like there’s just a couple settings I need to tone down that I probably wouldn’t even notice from a visual sense. Any ideas? There’s like a billion settings…


----------



## J7SC

bmagnien said:


> @J7SC What are your settings for flight sim? Getting some cpu bottleneck I believe. Trying to max out visual fidelity, but fine with using dlss and frame gen. With absolutely everything maxed at 4K I’m just on the underside of what I’d consider playable. I feel like there’s just a couple settings I need to tone down that I probably wouldn’t even notice from a visual sense. Any ideas? There’s like a billion settings…


...prepping some screenies now for FS2020 settings; what CPU and RAM do you have ?


----------



## bmagnien

J7SC said:


> ...prepping some screenies now for FS2020 settings; what CPU and RAM do you have ?


5800x3d -30 all core, 3800 c14


----------



## mirkendargen

wtf_apples said:


> oh ya forgot to mention that, asus tuf


Normal and expected then


----------



## J7SC

bmagnien said:


> 5800x3d -30 all core, 3800 c14


FS 2020 and some system settings below. A couple of quick points: 
FS2020 ties you into their servers as much as they can, but I run an unlimited 'local' rolling cache which really helps once you have visited areas before (thus loaded them into local cache). That helps a lot re. stutter, which may actually be network latency. Also, the faster your internet connection, the better (I'm on 1Gbps up & down). Not shown below is ground and sea traffic which is near-maxed.

Re. HDR10 in FS 2020, I tend to toggle that depending on graphics settings which usually is GSync on, HDR (Win) on, 3840x2160 / 120 Hz. If I fly at night, dusk or dawn, I'll add HDR10 inside FS 2020. Per my prior posts above, I typically game on the GPU with stock voltage & PL, + 212 to +244 core, and + 1404 to +1499 on VRAM. 

Your 5800X3D should actually beat my 5950X in FS 2020, though the latest FS2020 DX12 patch has helped a lot.


----------



## bmagnien

J7SC said:


> FS 2020 and some system settings below. A couple of quick points:
> FS2020 ties you into their servers as much as they can, but I run an unlimited 'local' rolling cache which really helps once you have visited areas before (thus loaded them into local cache). That helps a lot re. stutter, which may actually be network latency. Also, the faster your internet connection, the better (I'm on 1Gbps up & down). Not shown below is ground and sea traffic which is near-maxed.
> 
> Re. HDR10 in FS 2020, I tend to toggle that depending on graphics settings which usually is GSync on, HDR (Win) on, 3840x2160 / 120 Hz. If I fly at night, dusk or dawn, I'll add HDR10 inside FS 2020. Per my prior posts above, I typically game on the GPU with stock voltage & PL, + 212 to +244 core, and + 1404 to +1499 on VRAM.
> 
> Your 5800X3D should actually beat my 5950X in FS 2020, though the latest FS2020 DX12 patch has helped a lot.
> View attachment 2590381


Awesome. I’ll try out the rolling cache. Can you give me a ballpark of how big that file gets? Also - just taking a quick glance at your settings, is there any one that is not literally at its max?


----------



## J7SC

bmagnien said:


> Awesome. I’ll try out the rolling cache. Can you give me a ballpark of how big that file gets? Also - just taking a quick glance at your settings, is there any one that is not literally at its max?


...settings are probably maxed (...4090 says hi😀) ...I used to limit the local rolling cache to 200 GB but now just run it w/o limit. I don't think it ever actually had more than 70 GB of data in it. Also: all saves are 'local', not on the Xbox server....my 'test flight' is the Discovery Flight / NY one with the Cessna (not the new one with the helicopter - I keep on crashing_ that _thing 🤪). You should notice a difference w/rolling cache after a few flights in both directions, thus pre-loading the rolling cache.


----------



## jootn2kx

Is running the card @ 76/77°C (82°C VRAM and 87°C hotspot) a bit too hot for gaming? The card is overclocked with +160 core and +1900 memory.
What do you guys think?
I set my fans a bit lower for noise and my case is closed.


----------



## wtf_apples

Tried the galax bios and it seems to work ok.. Need to get my card blocked. Spent the evening fixing my os, reloading the image and getting the drivers sorted...
Need to lower gpu memory(lol 2k) and push the cpu more.
NVIDIA GeForce RTX 4090 video card benchmark result - Intel Core i9-13900K Processor,EVGA Corp. Z690 DARK KINGPIN (3dmark.com)


----------



## KingEngineRevUp

If you guys hate math, please skip this post immediately. 

So let's try to think of a way to raise our memory temperatures for a better OC while running a water block.









For my specific block during gaming loads: ΔT = 22 K; thermal conductivity (given by EKWB) K = 3.5 W/mK. Holding pad thickness fixed, that means Q/A ∝ ΔT × K = 77. Current memory temperature is 55C.

If I want my memory at 80C, then I need to raise my delta by +25 K:

K = 77 / (22 + 25) ≈ *1.6 W/mK*

So in theory... if I use thermal pads rated around 1.6-1.8 W/mK, my memory would land at approximately 76C (1.8 W/mK) to 80C (1.6 W/mK) and I could get my memory OC back. My system also won't green-screen anymore when idling as temperatures drop to 30C and below.

But that doesn't tell me the whole story... because thermal pad properties depend on mounting pressure. Going from 3.5 to 1.6 is a factor of about 2.1.

EKWB pads

So at 9-10 psi the resistance is 0.5, which means I need a pad with a resistance of about 1.07.

The only one I found is this one: https://www.digikey.com/en/products/detail/wakefield-vette/PL-1-1-5-76X127-15/16023924

At 10 psi its resistance is 0.8, which wouldn't give me the results I'm looking for. Instead of my memory being at 80C, it would be at approximately *68C; it would add 13.2 K to my ΔT, going from 22 K to 35.2 K.*
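For anyone who wants to play with the numbers, the arithmetic above can be sketched in a few lines of Python. This assumes the same simplified model as the post (fixed pad thickness, constant heat flux, ΔT scaling with the pad's thermal resistance at a given mounting pressure); the function names are just illustrative.

```python
# Sketch of the pad-swap arithmetic above. Model: heat flux through the
# pad is constant and pad thickness is fixed, so delta-T scales inversely
# with conductivity (or linearly with thermal resistance at a given
# mounting pressure).

def required_conductivity(k_current, dt_current, dt_target):
    """Pad conductivity (W/mK) needed to stretch the temperature delta
    from dt_current to dt_target at the same heat flux."""
    flux = k_current * dt_current      # 3.5 W/mK * 22 K = 77
    return flux / dt_target            # 77 / 47 ~ 1.64 W/mK

def new_delta(dt_current, r_current, r_new):
    """New delta-T (K) when swapping to a pad with a different thermal
    resistance at the same mounting pressure."""
    return dt_current * (r_new / r_current)   # 22 * (0.8 / 0.5) = 35.2 K

baseline = 55 - 22                     # block-side temperature, ~33 C

print(required_conductivity(3.5, 22, 22 + 25))   # pad rating needed for 80C mem
print(baseline + new_delta(22, 0.5, 0.8))        # mem temp with the 0.8-resistance pad
```

This ignores heat sunk through the PCB, so real temps would land somewhat lower than predicted.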


----------



## mirkendargen

KingEngineRevUp said:


> If you guys hate math, please skip this post immediately.
> 
> So lets try and think of a way to increase our memory temperatures for better OC while having a water block.
> View attachment 2590389
> 
> 
> For my specific block during gaming loads: Delta T = 22K, Thermal Conductivity (Given by EKWB) = K = 3.5 W/mK, that means Qd/A = Delta T * K = 77. Max temperature of memory is 55C.
> 
> If I want my memory to be at 80C, then I need to raise my delta by +25K
> 
> K = 77 / (22 + 25) = *1.5 W/mK*
> 
> So in theory... If I use thermal pads that are 1.5-1.8 W/mK, my memory will be at approximately 70C (1.8 W/mK) - 80C (1.5 W/mK) and I could get my memory OC's back. My system won't green screen also when idling and temperatures drop to 30C and below.
> 
> But that doesn't tell me the whole story... Because thermal pad properties depend on mounting pressure... Going from 3.5 to 1.5 is a 2.33 factor
> 
> EKWB pads
> View attachment 2590390
> 
> So at 9-10 psi, the resistance is at 0.5, that means I need a pad with a resistance at 1.165
> 
> The only one I found is this one https://www.digikey.com/en/products/detail/wakefield-vette/PL-1-1-5-76X127-15/16023924
> 
> View attachment 2590391
> 
> 
> At 10 psi the resisitance is at 0.8, this wouldn't give me the results I'm looking for. Instead of my memory being at 80C, it would be at approximately *68C, it would add 13.2 to my Delta T going from 22C to 35.2C. *


Putting an appropriate number of layers of tape (I don't know how many layers is appropriate without testing) on the block between the pads and the block is way more practical than trying to find an appropriately bad thermal pad that actually has accurate specs.

But even doing this your memory will cool down to your coolant temp when idle, and could still crash if you immediately put it under load at some speed that's only stable at higher temps. This is really something that would only be useful for benching where you could run it at load at a lower speed to warm it up, then clock it at your desired speed for a bench run.


----------



## KingEngineRevUp

mirkendargen said:


> Putting an appropriate number of layers of tape (I don't know how many layers is appropriate without testing) on the block between the pads and the block is way more practical than trying to find an appropriately bad thermal pad that actually has accurate specs.


At least the thermal pads I'm looking at have been thoroughly tested.

Adding tape would probably work too; I'd put it on the block side and use Kapton tape.

This could also be calculated knowing the tape's thermal conductivity rating and doing the calculation in series.
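The series calculation could look something like this; the pad and Kapton numbers below are placeholders, not measured values, since the actual tape conductivity and layer thicknesses would need to be looked up or tested.

```python
# Area-specific thermal resistance of stacked layers in series:
# R_total = sum(thickness_i / conductivity_i), per unit area.

def series_resistance(layers):
    """layers: iterable of (thickness_m, conductivity_W_per_mK) tuples."""
    return sum(t / k for t, k in layers)

# Placeholder numbers: a 2.0 mm pad at 3.5 W/mK plus one 0.05 mm layer
# of Kapton at an assumed ~0.2 W/mK.
pad = (2.0e-3, 3.5)
kapton = (0.05e-3, 0.2)

r_pad_alone = series_resistance([pad])
r_with_tape = series_resistance([pad, kapton])
print(r_with_tape / r_pad_alone)   # relative increase per tape layer
```

Each added layer contributes a fixed extra resistance, so stacking tape layers lets you dial in the total in discrete steps.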


----------



## StreaMRoLLeR

KingEngineRevUp said:


> If you guys hate math, please skip this post immediately.
> 
> So lets try and think of a way to increase our memory temperatures for better OC while having a water block.
> View attachment 2590389
> 
> 
> For my specific block during gaming loads: Delta T = 22K, Thermal Conductivity (Given by EKWB) = K = 3.5 W/mK, that means Qd/A = Delta T * K = 77. Max temperature of memory is 55C.
> 
> If I want my memory to be at 80C, then I need to raise my delta by +25K
> 
> K = 77 / (22 + 25) = *1.5 W/mK*
> 
> So in theory... If I use thermal pads that are 1.5-1.8 W/mK, my memory will be at approximately 70C (1.8 W/mK) - 80C (1.5 W/mK) and I could get my memory OC's back. My system won't green screen also when idling and temperatures drop to 30C and below.
> 
> But that doesn't tell me the whole story... Because thermal pad properties depend on mounting pressure... Going from 3.5 to 1.5 is a 2.33 factor
> 
> EKWB pads
> View attachment 2590390
> 
> So at 9-10 psi, the resistance is at 0.5, that means I need a pad with a resistance at 1.165
> 
> The only one I found is this one https://www.digikey.com/en/products/detail/wakefield-vette/PL-1-1-5-76X127-15/16023924
> 
> View attachment 2590391
> 
> 
> At 10 psi the resisitance is at 0.8, this wouldn't give me the results I'm looking for. Instead of my memory being at 80C, it would be at approximately *68C, it would add 13.2 to my Delta T going from 22C to 35.2C. *


I love how this post is put together.

Going by my findings with my Suprim Liquid X, I don't need 80C to stabilize my best memory OC. In fact, I only crashed with my stable mem OC after cold-booting the PC. By the second round, once the card heats up evenly everywhere (probably around 58-62C on mem), I could get it stable again.

May I ask where the 80C comes from? Does it overclock better around 80C? (Excluding LN2 and Elmor VRAM-heater conditions.)


----------



## StreaMRoLLeR

jootn2kx said:


> Is running the card @ 76/77°C(82vram and 87 hotspot) bit too hot for running games? The card is overclocked with +160 core and +1900 memory.
> Wat do you guys think?
> I set my fans bit lower for the noise and my case is closed.


Do not stay at +1900 mem for daily 24/7 usage. According to previous pages, a Giga G OC owner got crashes after 2 weeks; settle down to +1500 for gaming. AFAIK you have a Gainward 4090, which isn't known for a superior heatsink design like the Strix or Giga G OC, so I'd say that's a bit high.


----------



## Nico67

KingEngineRevUp said:


> If you guys hate math, please skip this post immediately.
> 
> So lets try and think of a way to increase our memory temperatures for better OC while having a water block.
> View attachment 2590389
> 
> 
> For my specific block during gaming loads: Delta T = 22K, Thermal Conductivity (Given by EKWB) = K = 3.5 W/mK, that means Qd/A = Delta T * K = 77. Max temperature of memory is 55C.
> 
> If I want my memory to be at 80C, then I need to raise my delta by +25K
> 
> K = 77 / (22 + 25) = *1.5 W/mK*
> 
> So in theory... If I use thermal pads that are 1.5-1.8 W/mK, my memory will be at approximately 70C (1.8 W/mK) - 80C (1.5 W/mK) and I could get my memory OC's back. My system won't green screen also when idling and temperatures drop to 30C and below.
> 
> But that doesn't tell me the whole story... Because thermal pad properties depend on mounting pressure... Going from 3.5 to 1.5 is a 2.33 factor
> 
> EKWB pads
> View attachment 2590390
> 
> So at 9-10 psi, the resistance is at 0.5, that means I need a pad with a resistance at 1.165
> 
> The only one I found is this one https://www.digikey.com/en/products/detail/wakefield-vette/PL-1-1-5-76X127-15/16023924
> 
> View attachment 2590391
> 
> 
> At 10 psi the resisitance is at 0.8, this wouldn't give me the results I'm looking for. Instead of my memory being at 80C, it would be at approximately *68C, it would add 13.2 to my Delta T going from 22C to 35.2C. *


TBH, the only real solution is to run a GPU-only block like a Thermosphere and heatsink everything else, or just take the hit and accept that your card will last longer running cooler.


----------



## Edge0fsanity

KedarWolf said:


> If cost is not an issue, I swear by the Optimus Water Cooling blocks.
> 
> The only issue with them though is they have a huge full-cover backplate thermal pad, and it leaks non-harming nonconductive oil that can get on the GPU PCB or even the motherboard on horizontal motherboard mounts and it's hard to clean off.
> 
> The second issue it they only make them for a few models of video cards but I bought my Strix 4090 around that.
> 
> But as far as performance goes, best in the business.


Eyeglass cleaner with a microfiber cloth will get the oil off. Safe to use on any surface as well.


----------



## yzonker

mirkendargen said:


> Putting an appropriate number of layers of tape (I don't know how many layers is appropriate without testing) on the block between the pads and the block is way more practical than trying to find an appropriately bad thermal pad that actually has accurate specs.
> 
> But even doing this your memory will cool down to your coolant temp when idle, and could still crash if you immediately put it under load at some speed that's only stable at higher temps. This is really something that would only be useful for benching where you could run it at load at a lower speed to warm it up, then clock it at your desired speed for a bench run.


Yes, this is why I haven't bothered to try to re-mount my card. It will still cold soak down close to the coolant temp. Even benching this is a problem since the bench will still start near coolant temp. Even if I run something prior to starting the bench, my testing showed it was only 3 or 4 seconds before the mem cooled back off to near idle temp. It helps to run something prior (like the mem tester) but it has limited effectiveness because of this.

Although for benching with an active backplate heater like I've been using, I could probably keep the mem warmer since the block will be less effective at transferring the heat I'm adding. But I've gotten within 50-75 mhz of my max air cooled OC at around 12-15C water (can get it all back at ambient water), so I called it good enough for now. Assuming I could recover the full 75mhz, it still would be less than 100pts in PR probably.


----------



## GAN77

HOF 4090.


----------



## Mystic33

GAN77 said:


> HOF 4090.


3120 MHz on the core clock and an instant crash in Time Spy... that silicon must be a joke...


----------



## StreaMRoLLeR

Mystic33 said:


> 3120mhz on the core clock and instant crash in the time spy... must be a joke that silicon...


I wonder what voltage controller IC the HOF uses. Shouldn't the Strix and Suprim be able to use that Galax HOF tweak program if they have the same voltage IC?


----------



## 681933

Well I just got and installed my 4090 and I can confirm the 5800X3D carries it like a dream, full 99% GPU utilization all the time. There is also no coil whine on this specific Suprim X model, loving it. Can't believe I was about to throw out my Seasonic because ASUS used crappy coils.

There are also no shutdown issues with my Seasonic Prime PX-750.


----------



## KingEngineRevUp

StreaMRoLLeR said:


> I love how this post is produced
> 
> According to my finding for my Suprim Liquid X
> 
> I dont need 80c's to stable my perfect memory OC. In fact I only crashed with my stable mem oc after cold booted the pc. After the second round when my card heats up evenly everywhere, ( probably around 58-62c on mem on second round), I could get it stable again.
> 
> May I ask where the 80c comes from ? Does it overclock more around 80c ? ( excluding the LN2 and Elmor vram heater conditions )


70-80C was around where I was before, IIRC. Truthfully, I don't know the exact threshold I need to hit to get my higher memory OC back.

But more importantly, I don't want to deal with low idle temperatures causing green screens anymore. 80C while gaming would mean roughly 50C at idle.

I just hate having to apply my clocks when I game and turn them off when I'm done. Sometimes I forget, and the green screen happens when my water temperature drops back down.


----------



## Blameless

KingEngineRevUp said:


> But that doesn't tell me the whole story... Because thermal pad properties depend on mounting pressure... Going from 3.5 to 1.5 is a 2.33 factor


It also doesn't take into account the amount of heat sunk by the PCB rather than the block, which is significant, but quite uncertain without a lot of work.

It's going to come down to mostly trial and error. I'd just get some nice, cheap, conformable, 1W/mK pads and start there. If that gets close enough, you could then cool or insulate the back of the PCB in the vicinity of the memory, as needed.


----------



## yzonker

KingEngineRevUp said:


> 70-80C was around where I was before IIRC. Truth, I don't know the threshold of where I need to be there to get my higher memory OC
> 
> But more importantly, I don't want to deal with low idle temperature causing me to green screen anymore. So 80C gaming would mean 50ish C idle.
> 
> I just hate having to apply my clocks when I game and turning them off when I'm done. Sometimes I forget and the green screen happens when my water temperature goes back down.


No, at idle the mem temp will still be very close to water temp, since the chips hardly pull any power at idle. Locking the frequency keeps the card from downclocking, but that's only a 2C increase on my card, and that's with 10 W/mK thermal putty. I wouldn't expect more than a 4C increase with cheap pads.

Keep in mind that going from 3.5 W/mK pads to 10+ W/mK pads only reduces mem temps on a 30-series card by ~10C, and the 1GB mem chips there pull a lot more power than the 2GB chips on the 4090.


----------



## kx11

Am I losing performance here? I have a lot of NVMe drives hooked up, like 5 of them, plus 2 SATA SSDs.


----------



## J7SC

Mystic33 said:


> 3120mhz on the core clock and instant crash in the time spy... must be a joke that silicon...


...I was thinking the same, especially with all those extra voltage adjustment options the Galax HOF software offers...then again, that card probably really comes alive once an LN2 pot is mounted.

@bmagnien ...forgot to add that I run resizable_BAR 'forced on' for all the games - seems to help with FS 2020 as well on my setup.


----------



## yzonker

kx11 said:


> Am i losing performance here?? i have too many NVMEs hooked like 5 of them and 2 sata ssds
> 
> 
> 
> View attachment 2590434


A little bit. Probably only measurable with a benchmark. Less than 5%.


----------



## kx11

yzonker said:


> A little bit. Probably only measurable with a benchmark. Less than 5%.


----------



## J7SC

kx11 said:


> View attachment 2590435


...Not quite sure why that is. BIOS settings on PCIe gen preference? On my X570 Asus Dark Hero with 2x SN850s and multiple SATA drives, I get this with either the 3090 or the 4090:









---

Back to using the stock G-G-OC vBIOS instead of the Galax; these are my typical game settings. FS2020 max power is in the pic below; with CP 2077, max power is ~445W w/the same settings...


----------



## kx11

J7SC said:


> ...Not quite sure why that is. Bios settings on PCIe gen preference ? On my X570 Asus DarkH with 2x SN850s and multiple Sata, I get this, whether with either the 3090 or 4090:
> View attachment 2590445



It was the NVMe adapter in PCIe slot 2 (x8) pulling bandwidth away from PCIe slot 1 (x16). It's fixed now, but the NVMe is running at 50% slower speeds since it's in the 3rd PCIe slot (x4).


----------



## J7SC

kx11 said:


> It was the NVME adapter in PCI slot 2 (x8) pulling bandwidth away from PCI slot1 (x16), now it's fixed bu the NVME is runniing 50% slower speeds since it's in 3rd PCI slot (x4)


...ahh, got it.


----------



## Nolongerhuman

My PC restarts when I game, and it doesn't seem load-dependent; I've run all sorts of artificial benchmarks and it performs great. It only happens with games, and only certain games, but a lot of them too. It's weird because I've never had a single BSOD since I built it, and not a single CTD either. It's like when a game crashes, the PC restarts instead. Idk what's going on. I ran a mem test, everything seems fine, all my temps are fine, and the load doesn't seem to matter when it happens. Every component is new; I'll post my specs below.

GPU - RTX Strix OC 4090
CPU - RYZEN 9 7950x
MEM - Trident Z5 NEO AMD Expo 2x16GB
MOBO - MSI MEG ACE X670E
PSU - Thermaltake ToughPower GF3 ATX 3.0 1650w
STORAGE - Hynix Platinum 500GB (OS Drive) + 2x 2TB Storage Drives


----------



## StreaMRoLLeR

Crylune said:


> Well I just got and installed my 4090 and I can confirm the 5800X3D carries it like a dream, full 99% GPU utilization all the time. There is also no coil whine on this specific Suprim X model, loving it. Can't believe I was about to throw out my Seasonic because ASUS used crappy coils.
> 
> There are also no shutdown issues with my Seasonic Prime PX-750.


Glad to hear another Suprim X with no coil whine. Could you upload a short sample video? There are people hunting for exactly those kinds of videos, and it would help them.

What is your effective clock at a flat 1.1V / 3000 MHz in PR? (stock BIOS)


----------



## Pepillo

Another one with the Galax bios, works great on my modest Gainward Phantom:


















I scored 20 252 in Time Spy Extreme — Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11 (www.3dmark.com)


----------



## yzonker

Nolongerhuman said:


> My pc restarts when I game and it does not seem load dependent I ran all sorts of artificial benchmarks and it performs great. Only with games and nothing else and just certain games but a lot of them too. It’s weird because I’ve never had a bsod not one since I built it not a single ctd not one either. It’s like when a game crashes the pc restarts instead idk what’s going on I ran a mem test everything seems fine all my temps are fine the kid doesn’t seem to matter when it happens. Every component is new I’ll post my specs below
> 
> GPU - RTX Strix OC 4090
> CPU - RYZEN 9 7950x
> MEM - Trident Z5 NEO AMD Expo 2x16GB
> MOBO - MSI MEG ACE X670E
> PSU - Thermaltake ToughPower GF3 ATX 3.0 1650w
> STORAGE - Hynix Platinum 500GB (OS Drive) + 2x 2TB Storage Drives


Does this with everything at defaults? CPU/Mem/GPU.


----------



## lmfodor

J7SC said:


> I think that is normal as some of these cards have three fans (but only two entries in the monitoring software); the outer two spin at a slightly different speed compared to the center one which also counterrotates for sound and vibration reasons. When my Gigabyte Gaming OC still had the air-cooler, it showed a similar differential.
> 
> ----
> 
> I tried out the Galax vBios for gaming for the first time (I normally use the G-G-OC vbios for gaming). Even w/o any extra sliders for voltage or PL, it consistently hovered around 3030 MHz (3025 effective). It also ran a bit hotter since I used more power...max was 533 W at 'stock' PL  and came via Cyberpunk 2077 (everything maxed at 4K). FS 2020 was at about 430 W (as opposed to 390 W for G-OC) and F1 2021 settled between the two. Ambient was 25 C (unlike outside )
> View attachment 2590185


Hi @J7SC, I tried with an undervolt curve and lower PL and finally both fans show the same speed on GPUz and MSI AF. So it seems to be normal. 

Regarding Galaxy vBIOS, is it worth it for daily use (for gaming)? Or is it only for Benchs? I'm with stock fans, so maybe it would make sense for those moving to watercool. Thank you!


Sent from my iPhone using Tapatalk Pro


----------



## 681933

LOL, beyond fast indeed


----------



## Nolongerhuman

yzonker said:


> Does this with everything at defaults? CPU/Mem/GPU.


Yes, I tried that and the issue persists. Overclocked or stock, the same thing happens. I only use Game Boost and EXPO for the overclocking.


----------



## Nico67

Nolongerhuman said:


> Yes, I tried that and the issue persists. Overclocked or stock, the same thing happens. I only use Game Boost and EXPO for the overclocking.


Try the 7950x with fixed clk and voltage, lots of weird boost issues with AM5. Also all the temps are good?


----------



## Nolongerhuman

Nico67 said:


> Try the 7950x with fixed clk and voltage, lots of weird boost issues with AM5. Also all the temps are good?


Yeah the temps are great I will try that is there a recommended fixed clk and voltage?


----------



## Nolongerhuman

Nico67 said:


> Try the 7950x with fixed clk and voltage, lots of weird boost issues with AM5. Also all the temps are good?


Also I set the uclk=memclk but I’m assuming that’s not the same as making the clk fixed


----------



## galimberti

Question:
Are vram OC and core OC independent? Or does increasing the vram clock affect core OC stability?
Because it looks like I can get stable at +165 on core and +1500 on vram, but not both at the same time.


----------



## yzonker

galimberti said:


> Question:
> Are vram oc and core oc indepentent? Or does increasing vram clock affects core oc stability?
> Because it looks like I can get stable, +165 on core, and +1500 on vram, but not both at the same time.


Yea, there is some minor interdependence. Usually don't have to back one or the other off much though.


----------



## J7SC

lmfodor said:


> Hi @J7SC, I tried with an undervolt curve and lower PL and finally both fans show the same speed on GPUz and MSI AF. So it seems to be normal.
> 
> Regarding Galaxy vBIOS, is it worth it for daily use (for gaming)? Or is it only for Benchs? I'm with stock fans, so maybe it would make sense for those moving to watercool. Thank you!
> 
> 
> Sent from my iPhone using Tapatalk Pro


...I would say it is primarily for benching... essentially, it can give you something like ~90 to 120 extra watts at the cost of more heat, and maybe an extra clock bin when OCing at full-blast settings. It also doesn't down-clock as much. Still, I am not sure if it is worth it for gaming given the fps gain. I would definitely water-cool the card when using the Galax vbios, be it for gaming or benching, as it always seems to add more heat... Just make sure you read up on the VRAM temp issue re. water-cooling, as VRAM actually likes higher temps (within reason), unlike the core.

Does your card have a dual vbios? If so, I would just toggle between them (i.e. benching and gaming); that's what I do with my setup.



Crylune said:


> LOL, beyond fast indeed
> 
> View attachment 2590471


...busted for speeding 🥴, on Christmas Eve no less.


----------



## lmfodor

J7SC said:


> ...I would say it is primarily for benching...essentially, it can give you s.th. like ~ 90 to 120 extra watts at the cost of more heat, and may be an extra clock bin in oc'ing at full blast settings. It also doesn't down-clock as much. Still, I am not sure if it is worth it for gaming and the fps gain. I would definitely water-cool the card when using the Galax vbios, be it for gaming or benching, as it always seems to add more heat... Just make sure you read up on the VRAM temp issue re. water-cooling as VRAM actually likes higher temps (within reason), unlike the core.
> 
> Does your card of a dual vbios ? If so, I would just toggle between them (ie. benching and gaming); that's what I do with my setup.
> 
> 
> 
> ...busted for speeding , on Christmas Eve no less.


Hi @J7SC, yes, I have dual BIOS, so I'd flash it over the Silent-mode BIOS, at least for a little fun with the OC.. I'll take care of the temps. Thanks!


Sent from my iPhone using Tapatalk Pro


----------



## J7SC

lmfodor said:


> Hi @J7SC, yes, I’ve dual BIOS so I’d flash in the Silent mode BIOS, at list for a little fun with the OC.. I’ll take care of the temps. Thanks!
> 
> 
> Sent from my iPhone using Tapatalk Pro


all set then. FYI, just remember to shut down Windows while holding down the shift key before throwing that physical vbios switch...


----------



## Pepillo

Quick question, is it possible to exceed the +2000 limit of memories in the afterburner?


----------



## wtf_apples

J7SC said:


> all set than. FYI, just remember to boot down Windows while holding down the shift key before throwing that physical vbios switch...


Never heard of this method tbh. I just shut down and flick the switch. Whats the benefit?


----------



## wtf_apples

that galax is a retail card


----------



## mirkendargen

wtf_apples said:


> Never heard of this method tbh. I just shut down and flick the switch. Whats the benefit?


I think if you have fast startup enabled in Windows this forces a fresh startup rather than hibernating the kernel. I always disable that anyway.


----------



## wtf_apples

mirkendargen said:


> I think if you have fast startup enabled in Windows this forces a fresh startup rather than hibernating the kernel. I always disable that anyway.


oh ya forgot about that. I always disable fastboot in windows and bios. Ill make a note of the shift method so I remember!


----------



## J7SC

wtf_apples said:


> Never heard of this method tbh. I just shut down and flick the switch. Whats the benefit?





wtf_apples said:


> oh ya forgot about that. I always disable fastboot in windows and bios. Ill make a note of the shift method so I remember!


Whenever you load a new / different vbios, Windows will enumerate it as a separate GPU in its registry. If you have fastboot enabled, you will likely get a 'son of BSOD' initially after shutting down, switching the vbios, and restarting. After that, it will reboot OK anyway. By holding down shift when shutting down, you circumvent that (minor) issue. Also, I ate too much turkey and stuffing😋


----------



## cerealkeller

Edge0fsanity said:


> eyeglass cleaner with a mf cloth will get the oil off. Safe to use on any surface as well.


I have found the easiest way to clean up oily residue is dish soap. If you wear glasses and you've never tried dish soap, it's a game changer.


----------



## Krzych04650

J7SC said:


> Whenever you load a new / different vbios, Windows will enumerate it as a separate GPU in its registry. If you have fastboot enabled, you will likely get a 'son of BSOD' initially after booting down and switching vbios, then restart. After that, it will reboot ok anyway. When holding down shift when booting down, you circumvent that (minor) issue. Also, I ate too much turkey and stuffing😋


Hm, I didn't know that. I guess that explains why it was adding - 1, - 2, - 3... to the name of my audio device with every new BIOS flash.


----------



## lmfodor

J7SC said:


> all set than. FYI, just remember to boot down Windows while holding down the shift key before throwing that physical vbios switch...


Hm, do you mean press shift while shutting down Windows to enter safe mode? Or when you start the PC after changing the vBIOS switch, when Windows is starting? It is clear that I never did [emoji28]


Sent from my iPhone using Tapatalk Pro


----------



## alasdairvfr

Merry Christmas Overclockers!


----------



## leonman44

Merry Christmas to everyone !
Guys, the only GPU available here is the Zotac AMP Extreme AIRO, but what bugs me is the power limit capped at 495 W. Is there a way to unlock a 600 W power limit by flashing another BIOS? Should I grab this card instead of waiting for a better one?


----------



## Krzych04650

leonman44 said:


> Merry Christmas to everyone !
> Guys only Gpu available is the Zotac Extreme Airo but what it bugs me is the limited power limit to 495watts. Is there a way to unlock 600watt power limit by flashing another bios? Should i grab this instead of waiting for a better card?


You can basically flash any BIOS to any card this time around. Reference boards like this one (well, not exactly reference since it's stretched a bit, but the components are the same) generally have less coil whine, so that is good. I've only seen some complaints about fan noise on Zotac; I don't know how true that is, and whether it matters for you (waterblock or not?)


----------



## leonman44

Krzych04650 said:


> You can basically flash any BIOS to any card this time around. Reference boards like this one (well not exactly reference since it stretched a bit, but components are the same) generally have less whine so it is good. I've only seen some complaints about fan noise on Zotac, I don't know how true is that, and whether it matters for you (waterblock or not?)


Yes, I will install a waterblock; I am not going to use the stock air cooler, at least not for long. The waterblock isn't currently in stock, so I will have to preorder it.
That's basically why I want the unlocked power limit: to get as much as I can out of it with good cooling.
What is the best BIOS to flash for that purpose?


----------



## Krzych04650

leonman44 said:


> Yes i will install a waterblock i am not going to use the stock aircooler , at least not long enough until the waterblock will arrive because thats not currently in stock i will have to preorder it.
> Thats basically why i want the unlocked power limit to get as much as i can out of it with good cooling.
> What is the best bios to flash for that purpose?


Galax 666W. It is somewhere in this thread and also on techpowerup.


----------



## dboom

EK full block is covering the bios switch. I have to punch the cover in order to reach that dip switch.


----------



## yzonker

Krzych04650 said:


> Galax 666W. It is somewhere in this thread and also on techpowerup.


I finally got around to making a sig and added a link to the only bios that matters.


----------



## J7SC

lmfodor said:


> Hm, do you mean press shift while shutting down Windows to enter safe mode? Or when you start the PC after changing the vBIOS switch, when Windows is starting? It is clear that I never did [emoji28]
> 
> 
> Sent from my iPhone using Tapatalk Pro


...nope, just hold down the shift key as you click 'shut down' and the system shuts down...


----------



## borant

Pepillo said:


> Quick question, is it possible to exceed the +2000 limit of memories in the afterburner?


Asus GPU Tweak III can go higher; just note that +2000 in Afterburner is equal to +4000 in GPU Tweak.
My FE on water can go to about +4400, where it may start to show artifacts.
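For anyone juggling both tools, the scaling is easy to get wrong. Here's a tiny sketch of the conversion, assuming the 2:1 ratio described above holds across the whole range (my own helper, not from either tool):

```python
# Convert memory-offset values between MSI Afterburner and ASUS GPU Tweak III.
# Assumption (from the post above): GPU Tweak III offsets are double
# Afterburner's for the same actual memory-clock change, e.g. +2000 AB == +4000 GT.

def ab_to_gt(afterburner_offset: int) -> int:
    """Afterburner memory offset -> equivalent GPU Tweak III offset."""
    return afterburner_offset * 2

def gt_to_ab(gputweak_offset: int) -> int:
    """GPU Tweak III memory offset -> equivalent Afterburner offset."""
    return gputweak_offset // 2

print(ab_to_gt(2000))  # 4000
print(gt_to_ab(4400))  # +4400 in GPU Tweak is ~ +2200 in Afterburner
```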


----------



## Shoggoth

dboom said:


> EK full block is covering the bios switch. I have to punch the cover in order to reach that dip switch.



Ditto with the Alphacool block, at least on my Gainward Phantom GS.


----------



## Xdrqgol

Did anyone around with 4090 FE changed thermal pads/paste - was it worth it?


----------



## yzonker

Discovered one of the pads under my backplate wasn't making good contact, which was hurting the effectiveness of heating the backplate for the mem. Fixing that got my air-cooled clock of +1800-1825 back. No big gains, but nice to at least get that back fully.









I scored 29 473 in Port Royal


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com













I scored 11 357 in Speed Way


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## yzonker

Xdrqgol said:


> Did anyone around with 4090 FE changed thermal pads/paste - was it worth it?


I doubt it's worth it unless you're seeing high temps on the core or a high hotspot temp. Memory doesn't like to be cool.


----------



## Xdrqgol

yzonker said:


> I doubt its worth it unless you're seeing high temps on the core or a high hotspot temp. Memory doesn't like to be cool.


It is one of the best cards, all time, that I have ever had: 3 GHz / 1.03 V, 55 C max in a full benchmark.
I am just thinking, since as we all know they do a ****ty job at the factory, maybe I can squeeze a bit more and ensure longevity with better pads/paste.


----------



## Nico67

Nolongerhuman said:


> Yeah the temps are great I will try that is there a recommended fixed clk and voltage?


Nah, setting "CPU core ratio" to 52 should probably be enough. It's just a test to see if boosting is the issue, so don't worry about the CPU speed being low.


----------



## Nico67

dboom said:


> EK full block is covering the bios switch. I have to punch the cover in order to reach that dip switch.


Doh, didn't think about that. I suppose it's not too bad if you only need it as a last-resort recovery.


----------



## sugi0lover

Hi, Is it possible to use evc with 4090 Zotac Amp Extreme? If there is any info, can someone link me to that? Thanks in advance~


----------



## StreaMRoLLeR

sugi0lover said:


> Hi, Is it possible to use evc with 4090 Zotac Amp Extreme? If there is any info, can someone link me to that? Thanks in advance~


Hey sugi welcome 

Zotac uses an analog voltage controller IC, so AFAIK you need a digital IC for the Elmor EVC to work.


----------



## J7SC

I hope everyone is enjoying the Christmas holiday break - Season's Greetings !

Maybe I am suffering from turkey overload, but after a few days of recuperation, it is time for the 'RTX 4090 domino effect' to finally be completed, as it bumped my RTX 3090 Strix, which I used to think was fast until I saw the 4090 clock consistently over 3200 MHz...

The 2x w-cooled 2080 Ti in NVLink in the pic below will be joined by the w-cooled 3090 Strix on the X399 Creation TR motherboard (which does support 3x GPU according to the manual, in addition to offering Resizable BAR). This behemoth build, which also has a second mobo with a w-cooled CPU on the back, already weighs more than 100 pounds in its current state, so what's a few more pounds? All of that will display on a 55 inch IPS HDR panel. So let the games begin tomorrow. That, and additional work tasks that like lots of CUDA cores and VRAM...


----------



## bmagnien

yzonker said:


> Discovered one of the pads under my backplate wasn't making good contact, which was hurting the effectiveness of heating the backplate for the mem. Fixing that got my air-cooled clock of +1800-1825 back. No big gains, but nice to at least get that back fully.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 473 in Port Royal
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 357 in Speed Way
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


I hate it when poor thermal pad contact causes temps to be too cold 😆


----------



## StreaMRoLLeR

yzonker said:


> Discovered one of the pads under my backplate wasn't making good contact, which was hurting the effectiveness of heating the backplate for the mem. Fixing that got my air-cooled clock of +1800-1825 back. No big gains, but nice to at least get that back fully.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 473 in Port Royal
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 357 in Speed Way
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com


was it artifacting at 1538 mhz ? 

Port royal is strange. I get little artifacts at +1700 but perfectly stable at + 1725 ( 1528mhz )


----------



## akgis

Found a weird issue: sometimes when starting a game, everything is fine except the panning of the camera. When a lot of stuff moves on screen it's not smooth; it seems like the game is running at 144 Hz but the background isn't. Hard to explain: there is no tearing, it's just not 144 Hz smooth. If I move a character or a vehicle it's OK, but as soon as I pan the camera I notice it.

I used to solve this with a restart, but a driver restart using Ctrl+Win+Shift+B fixes it even while in-game; everything goes back to smooth.

I don't think it's G-Sync, because my 3090 didn't have this, so it's something new between the 4090 and my LG 4K 144 Hz monitor...

Just wondering if anyone else has had this issue as well.


----------



## yzonker

StreaMRoLLeR said:


> was it artifacting at 1538 mhz ?
> 
> Port royal is strange. I get little artifacts at +1700 but perfectly stable at + 1725 ( 1528mhz )


No, I had to go closer to +1825 to get artifacts.


----------



## Krzych04650

akgis said:


> Found a wierd issue, sometimes when starting a game everything is fine but the paning of the camera, I mean when alot of stuff move in the screen its not smooth, seems like the game is running at 144hz bu the background isnt hard to explain, there is no tearing is just not 144hz smooth, if I move a personage or a veicule its ok but as soon I move the camera I notice it
> 
> I used to solve this withh a restart, but a driver start using ctrl-win-shift-b(driver restart) solves it even if you are inside the game everything goes back to smooth.
> 
> I dont think its the gsync because my 3090 didnt had this or its something new between 4090 and my LG 4k144hz monitor...
> 
> Just wonder if anything had/have this issue aswell.


It can be all kinds of things. 

It does sound a bit like frame skipping from VRR not kicking in properly; a driver reset often fixes that. Whenever you get this issue, run the UFO Test: Frame Skipping Checker (testufo.com). Although if that were the case, you would notice the issue on the desktop as well right after boot, not just in games.

I'd generally start by looking at frametime graph and measuring 1% lows, I've found that quite a lot of games can have issues with unlocked FPS on 4090, especially DX11, and then frametime graph looks like this:









This can be resolved by enabling Low Latency Mode, either On or Ultra; this reduces or removes the render queue and fixes the issue entirely, like here:









It can have adverse effects as well, so it should be tested for each game separately.

Generally, you should have this kind of monitoring on a toggle key at all times. I would go as far as to say that it is impossible to use your PC properly without it; how else are you going to know what is going on?

At 144 Hz you should be using G-sync On, V-sync On and FPS lock to 140. Running without any kind of synchronization or above screen refresh rate can in itself make the game look unsmooth and erratic. 

If you do all of that and the frametimes in monitoring look good but games still don't feel smooth, then there is a deeper problem.

You can check which background apps you are running and whether they use any kind of screen capture or overlay, though it is kind of hard to run something like that unintentionally. Still, running the game as raw as possible, without anything in the background, is an important diagnostic step.

Other than that, typical steps like clean reinstalling drivers with DDU may help. Depending on how old your Windows install is you may want to consider brand new installation. 

There are probably some more things to try, I've never really made any list and ideas always come while troubleshooting, but this is some basic troubleshooting and diagnosing.
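To put rough numbers on the "watch the frametime graph and 1% lows" advice, here's a quick sketch (my own helper, not from any particular monitoring tool; tools differ slightly in how they define 1% lows) that computes average FPS and a 99th-percentile-based 1% low from a list of frametimes, plus the usual cap a few FPS under refresh for G-Sync:

```python
# Rough sketch of the metrics mentioned above: average FPS and "1% low" FPS
# from a list of frametimes in milliseconds. Definition assumed here:
# 1% low = FPS implied by the 99th-percentile (worst 1%) frametime.

def fps_metrics(frametimes_ms):
    times = sorted(frametimes_ms)
    avg_fps = 1000.0 / (sum(times) / len(times))
    p99_index = min(len(times) - 1, int(len(times) * 0.99))
    low_1pct_fps = 1000.0 / times[p99_index]
    return avg_fps, low_1pct_fps

def gsync_fps_cap(refresh_hz: int, margin: int = 4) -> int:
    # Common rule of thumb: cap a few FPS below refresh so V-sync never engages.
    return refresh_hz - margin

# 99 smooth frames at 7 ms plus one 20 ms spike: the average looks fine,
# but the 1% low exposes the stutter.
avg, low = fps_metrics([7.0] * 99 + [20.0])
print(round(avg), round(low))  # 140 50
print(gsync_fps_cap(144))      # 140, matching the "lock to 140" advice above
```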


----------



## Brads3cents

What's a bit odd is that nobody is shunting this time around.
In the old 3090 thread, shunting was a huge thing. I would say everyone is happy with the power limit and that these cards prefer more voltage instead, but then everyone is flashing the Galax BIOS, so that obviously isn't the case.

So why is nobody trying to shunt? I would have expected at least a couple of people in this thread to have done it by now.
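For context on what the mod actually does: soldering a second resistor in parallel with the current-sense shunt lowers the effective resistance, so the monitoring IC sees a smaller voltage drop and under-reports power, letting the card pull more real wattage at the same reported limit. A rough sketch of the arithmetic (illustrative resistor values, not from any specific card):

```python
# Shunt-mod arithmetic: a sense resistor r_shunt with an added r_mod in
# parallel has lower effective resistance, so the controller under-reports
# power by the factor r_eff / r_shunt.

def parallel(r1: float, r2: float) -> float:
    """Effective resistance of two resistors in parallel."""
    return (r1 * r2) / (r1 + r2)

def real_power(reported_watts: float, r_shunt: float, r_mod: float) -> float:
    """Actual draw when the card *reports* reported_watts after the mod."""
    r_eff = parallel(r_shunt, r_mod)
    return reported_watts * (r_shunt / r_eff)

# Stacking an equal-value resistor halves the effective resistance,
# so a card "limited" to 450 W can really pull about double that.
print(round(parallel(0.005, 0.005), 6))       # 0.0025
print(round(real_power(450, 0.005, 0.005)))   # 900
```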


----------



## StreaMRoLLeR

Brads3cents said:


> whats a bit odd is that nobody is shunting this time around
> in the old 3090 thread shunting was a huge thing. I would say that everyone is happy with the power limit and that these cards would prefer more voltage instead, but then everyone is flashing the galax bios so that obviously isnt the case
> 
> so why is nobody trying to shunt? I would suspect at least a couple would have done it in this thread by now


Because we aren't power limited but voltage limited. Nvidia fine-tuned the 4090 chips very well.

I recommend watching der8auer's Strix Elmor video. The card is so finely tuned that if you raise the voltage above the Nvidia limit, even by 25 mV to 1.125 V, you instantly hit the power limit. He then switched to an unlocked BIOS and the card consumed 758 W, but the overclocking gain wasn't good (he had a faulty card, though).

The thing with the Galax BIOS must be the removal of internal phantom limiters, or a raise of MSVDD and NVVDD.

Still waiting for an answer from Galax on whether the voltage is still locked to 1.100 V.


----------



## J7SC

Brads3cents said:


> whats a bit odd is that nobody is shunting this time around
> in the old 3090 thread shunting was a huge thing. I would say that everyone is happy with the power limit and that these cards would prefer more voltage instead, but then everyone is flashing the galax bios so that obviously isnt the case
> 
> so why is nobody trying to shunt? I would suspect at least a couple would have done it in this thread by now


...I am happy with 670 W+ (Galax) and 3225 MHz on the core as is, but I am contemplating either shunting the VRAM segment, or get an ElmorLabs EVC2SE...


----------



## ttnuagmada

Brads3cents said:


> whats a bit odd is that nobody is shunting this time around
> in the old 3090 thread shunting was a huge thing. I would say that everyone is happy with the power limit and that these cards would prefer more voltage instead, but then everyone is flashing the galax bios so that obviously isnt the case
> 
> so why is nobody trying to shunt? I would suspect at least a couple would have done it in this thread by now


My 3090 was power limited in all sorts of scenarios, but my 4090 doesn't seem to be power limited at all. Guess there's just not nearly the reason to do it this gen for most of us.


----------



## Krzych04650

Brads3cents said:


> whats a bit odd is that nobody is shunting this time around
> in the old 3090 thread shunting was a huge thing. I would say that everyone is happy with the power limit and that these cards would prefer more voltage instead, but then everyone is flashing the galax bios so that obviously isnt the case
> 
> so why is nobody trying to shunt? I would suspect at least a couple would have done it in this thread by now


There was one guy with shunt modded Trio X here I think, I recall something about him pushing the card to 1000W to see if 12VHPWR cable melts, and it didn't.

I don't do any extreme overclocking, so I don't know how it looks from that perspective, but for water or even chilled water the Galax 666W is entirely sufficient, with some headroom left. The 4090 is not like the 2080 Ti or 3090, where you needed to give them something like 175% power limit so they would finally stop power throttling; this card has very generous stock power limits and is extraordinarily efficient and well tuned, so meddling with it physically is kind of pointless. The only gripe are those strange phantom limiters, which the Galax 666W also resolves.


----------



## Xdrqgol

I have my 4090 FE at 1.05 V max. It doesn't go past that no matter what I do, even with the 133% power slider in MSI Afterburner. The max wattage I have seen was 490 W; I haven't seen anything above that in any benchmarks or games.

The FE has a 600 W BIOS but… it never reaches that.


----------



## Nizzen

J7SC said:


> ...I am happy with 670 W+ (Galax) and 3225 MHz on the core as is, but I am contemplating either shunting the VRAM segment, or get an ElmorLabs EVC2SE...


Do you have 4090 strix or 4090 tuf?


----------



## borant

Xdrqgol said:


> I have my 4090 FE at 1.05v max. It doesn’t go past that , no matter what I do 133% power slider in MSI Afterburner. Max W i have see was 490W . I haven’t seen anything above that in all benchmarks and/or games.
> 
> The FE has 600W bios but …it never reaches that.


did you move Core Voltage slider to 100% too?


----------



## Xdrqgol

borant said:


> did you move Core Voltage slider to 100% too?


It doesn’t work on my MSI afterburner … seems locked.

what should I do to unlock it?


----------



## yzonker

Xdrqgol said:


> It doesn’t work on my MSI afterburner … seems locked.
> 
> what should I do to unlock it?


Is it enabled in the settings and are you running 4.6.5 Beta 2?


----------



## borant

Xdrqgol said:


> It doesn’t work on my MSI afterburner … seems locked.
> 
> what should I do to unlock it?


latest version: Afterburner

settings:


----------



## Xdrqgol

Thank you! I will give it a try!


----------



## Xdrqgol

borant said:


> latest version: Afterburner
> 
> settings:
> View attachment 2590739


The download link doesn't work; tried on 2 different computers and different browsers...


----------



## Krzych04650

Xdrqgol said:


> It doesn’t work on my MSI afterburner … seems locked.
> 
> what should I do to unlock it?


There is an option to unlock voltage control in the settings.


----------



## Xdrqgol

I managed to download Beta 2 from Guru3D and unlocked the voltage; now it goes to 1.1 V. Thanks a lot.


----------



## J7SC

Nizzen said:


> Do you have 4090 strix or 4090 tuf?


...neither, > Giga-G-OC


----------



## Nizzen

J7SC said:


> ...neither, > Giga-G-OC


Strix and TUF are easy with the Elmor EVC, but Giga, I don't know.


----------



## J7SC

Nizzen said:


> Strix and tuf is easy with Elmor evc, but Giga, I don't know.


...I agree; planning to write to them with TPU part numbers for the Giga-G-OC controllers and check.


----------



## tubs2x4

Xdrqgol said:


> I have managed to download the BETA 2 from guru3d and unlocked the voltage, now it goes to 1.1v thanks a lot.


What did it max out before?


----------



## Xdrqgol

tubs2x4 said:


> What did it max out before?


on the version I was using , I was not able to unlock the voltage and move the slider to 100%.


----------



## Pk1

Hey guys! So I got lucky and picked up an MSI Suprim Liquid X today. I want to waterblock the card, but block options seem quite limited... EK, Bykski, Alphacool or Phanteks. Leaning towards Bykski or Phanteks but not thrilled with either. Anyone have these blocks, and are they any good? Should I look to trade the Suprim Liquid for an FE/TUF, or just stick with the Liquid? Thanks.


----------



## J7SC

Pk1 said:


> Hey guys! So I got lucky and picked up a MSI Suprim Liquid X today. I want to waterblock the card but block options seem quite limited...EK, Byski, Alphacool or Phanteks. Leaning towards Byski or Phanteks but not thrilled with either. Anyone have these blocks and are they any good? Should I look to trade Suprim Liquid for FE/Tuf or just stick with the Liquid? Thanks.


That's a nice card ! 
First thing I would do is check temps to gauge the performance of the stock AIO; if they're OK, I wouldn't even bother with custom water-cooling, noting that with this gen, VRAM temps should ideally stay at a minimum of ~50 C (the memory doesn't like running too cold). Before pulling the card apart, you can also try out additional / custom fans (up to 6x) in push-pull, in case there is a stock temp issue.

May be run a few GPU benchies w/HWInfo or GPU-Z open and post.


----------



## motivman

Pk1 said:


> Hey guys! So I got lucky and picked up a MSI Suprim Liquid X today. I want to waterblock the card but block options seem quite limited...EK, Byski, Alphacool or Phanteks. Leaning towards Byski or Phanteks but not thrilled with either. Anyone have these blocks and are they any good? Should I look to trade Suprim Liquid for FE/Tuf or just stick with the Liquid? Thanks.


Bykski for sure. Their block is phenomenal with my Giga 4090 Gaming OC.


----------



## GosuPl

RTX 3090 STRIX vs RTX 3090 Ti vs RTX 4090 STRIX
FP32 / RT ON / DLSS ON
1440p / 4K

Test platform :

i9-13900K 5.7 GHz P/ EOFF / 5.0 Ring EVGA Z690 CLASSIFIED 
G.Skill Trident Z5 6400 MHz CL32.38.38.32 T2 G2 + II/III 
RTX 3090 STRIX PT 480W / OC + 80 / + 1100 Fan ~ 2200 RPM 
RTX 3090 Ti FTW3 PT 480W / OC + 90 / + 800 Fan ~ 2200 RPM 
RTX 4090 STRIX PT 600W / OC + 100/ + 800 Fan ~ 2200 RPM 
be quiet! Dark Pro 12 1500 (bq! 12VHPWR for RTX 3090 Ti / RTX 4090) 
LC 360/45 + 240/45 + 1x D5 for CPU 
Win 10 Pro 
Nvidia Drivers 526.47

Pure testing with full OSD information on each GPU starts at 15:09 and lasts for 50:58.

Before and after the tests we discuss the topic in Polish. I recommend jumping to the tests right away, unless you know Polish, in which case you're invited to watch it all.


----------



## motivman

J7SC said:


> That's a nice card !
> First thing I would do is check temps re. performance of the stock AIO - if ok, I wouldn't even bother custom water-cooling, noting that with this gen, VRAM temp should be in the min ~ 50+ C range. Before pulling the card apart, you can also try out additional / custom fans (up to x6) in push-pull, in case there is a stock temp issue.
> 
> May be run a few GPU benchies w/HWInfo or GPU-Z open and post.


I beg to differ. I am running the same vram overclock as I did on the stock air cooler, now on 10 C chilled water...


----------



## J7SC

motivman said:


> I beg to differ. I am running the same vram overclock like I did on the stock air cooler on 10C chilled water...


...no begging - Christmas is over🥴

I'm just trying to help @Pk1 establish a baseline in terms of temps for his lovely new MSI Suprim Liquid X. Also, looking at HWBot Port Royal, the top 5 entries are subzero and have an average VRAM speed of 1489.8 MHz, many of them running VRAM heaters.


----------



## wtf_apples

J7SC said:


> ...no begging - Christmas is over🥴
> 
> I'm just trying to help @Pk1 establish a base line in terms of temps for his lovely new MSI Suprim LiquidX. Also, looking at HWBot Port Royal, the top 5 entries are subzero and have an average VRAM speed of 1489.8 MHz, many of then running VRAM heaters.


what exactly are they using to heat the vram? this is pretty interesting


----------



## motivman

J7SC said:


> ...no begging - Christmas is over🥴
> 
> I'm just trying to help @Pk1 establish a base line in terms of temps for his lovely new MSI Suprim LiquidX. Also, looking at HWBot Port Royal, the top 5 entries are subzero and have an average VRAM speed of 1489.8 MHz, many of then running VRAM heaters.


Hey, I am not doubting what you are saying. Just saying, I put this card on a chiller and couldn't believe I could run my memory that high with low temps. The only change I made was using the original Bykski pads, and all my cold-memory issues were gone. When I had Arctic TP-3 pads, I couldn't even bench +1500, but now with the Bykski pads and the water chiller, I am able to bench +1750 just like on the air cooler. Makes no sense.


----------



## J7SC

wtf_apples said:


> what exactly are they using to heat the vrm? this is pretty interesting


Galax HoF 4090 on LN2 w/dual 12VHPWR and a nice heater on the back for VRAM


----------



## wtf_apples

oh right on, clever idea using the sock warmers. that backplate is massive haha


----------



## Nizzen

GosuPl said:


> RTX 3090 STRIX vs RTX 3090 Ti vs RTX 4090 STRIX
> FP32 / RT ON / DLSS ON
> 1440p / 4K
> 
> Test platform :
> 
> i9-13900K 5.7 GHz P/ EOFF / 5.0 Ring EVGA Z690 CLASSIFIED
> G.Skill Trident Z5 6400 MHz CL32.38.38.32 T2 G2 + II/III
> RTX 3090 STRIX PT 480W / OC + 80 / + 1100 Fan ~ 2200 RPM
> RTX 3090 Ti FTW3 PT 480W / OC + 90 / + 800 Fan ~ 2200 RPM
> RTX 4090 STRIX PT 600W / OC + 100/ + 800 Fan ~ 2200 RPM
> be quiet! Dark Pro 12 1500 (bq! 12VHPWR for RTX 3090 Ti / RTX 4090)
> LC 360/45 + 240/45 + 1x D5 for CPU
> Win 10 Pro
> Nvidia Drivers 526.47
> 
> Pure testing with full OSD information on each GPU starts at 15:09 and lasts until 50:58.
> 
> Before and after the tests, we discuss the topic in Polish. I recommend jumping straight to the tests, unless you know Polish, in which case you're invited to watch the whole thing.


Nice job


----------



## StreaMRoLLeR

Pk1 said:


> Hey guys! So I got lucky and picked up an MSI Suprim Liquid X today. I want to waterblock the card, but block options seem quite limited...EK, Bykski, Alphacool or Phanteks. Leaning towards Bykski or Phanteks but not thrilled with either. Anyone have these blocks, and are they any good? Should I look to trade the Suprim Liquid for an FE/TUF, or just stick with the Liquid? Thanks.


No need to buy a waterblock. I own the Suprim Liquid X.

I am using Phanteks T30 fans at 1536-1620 RPM.

If you have a good batch (mem temps at 68C absolute max) you never have to worry about temps.

In Dying Light 2 at max settings, 4K, with the T30 fans at 1620 RPM, the core is 51C max and the mems 62C, pulling 408W.

If you use the default Gale P12 fans, run them at at least 1500 RPM, because 1140 RPM isn't enough.

If you undervolt + OC and stay around 330-360W, the card runs around 42C. The Cooler Master pump is better than the previous-gen Asetek pump in terms of flow.


----------



## Gorod

What are the statistics and observations so far for what counts as a poor, average, or good overclock for GPU and vmem?
Let's take the maximum clocks that pass Port Royal, for example. 3000 core and +1200 on vmem is a pretty bad sample for a 4090, right?


----------



## Nizzen

Gorod said:


> What are the statistics and observations so far to be considered poor, average and good overclocks for gpu and vmem ?
> Lets take maximum clocks that pass Port Royal for example. 3000 and +1200 on vmem is a pretty bad sample for 4090 right ?


3000 core and +1200 on VRAM is pretty much doable for everyone. Perfect for 24/7 gaming-stable.

~3050MHz is max for many people, and 3100 is good.

A "bad" bin looks like 3020MHz and a good bin looks like 3100, at ~25C ambient.

The performance gain from 3000 to 3100MHz is pretty small, LOL.


----------



## StreaMRoLLeR

Gorod said:


> What are the statistics and observations so far to be considered poor, average and good overclocks for gpu and vmem ?
> Lets take maximum clocks that pass Port Royal for example. 3000 and +1200 on vmem is a pretty bad sample for 4090 right ?


What determines a good bin is the gap between desired and effective clocks. My Suprim does 3030MHz in PR but consistently scores more points against 3075-3090MHz chips, because its effective clock is higher. (The Galax BIOS seems to remove this phantom limiter for almost all cards, though.)

I would say any chip that can do 3000MHz is nice. There are some Giga G-OC cards that can do 3120 on the stock BIOS; those are god-tier chips.

For memory, every card should do +1350 to +1400 unless you are really unfortunate and unlucky.

+1500 is a safe 24/7 gaming limit for VRAM. Outside of 3DMark you have no reason to go past +1500MHz in games (one guy reported BSODs and crashes at +1700 after two weeks of gaming, then reverted to +1500).


----------



## man from atlantis

Has anyone tried undervolting at/around 875mV? What are the highest stable clocks you could get?


----------



## Nizzen

man from atlantis said:


> Has anyone tried undervolting at/around 875mV, What's the highest stable clocks you could get?


This isn't underclock.net 😘🤠

Give us your result, and maybe more people will follow


----------



## leonman44

Finally! I just ordered the Asus TUF 4090, the last one in stock and for a better price than the Zotac Extreme. Can't wait to try it!


----------



## dk_mic

StreaMRoLLeR said:


> What determines good bin is = Desired and effective clock gap. My suprim doing 3030mhz on PR but consistently getting more points against 3075-3090mhz chips, because effective is higher. ( Galax bios seems to remove this phantom limiter for almost all cards tho )
> I would say any chip which can do 3000mhz is nice. There are some G- G OC cards which can do 3120 on stock bios those are god tier chips.
> For memory every card should do + 1350 + 1400 unless you are really unfortunate and unlucky.
> +1500 is safe 24/7 gaming limit for Ram. Outside of 3D Mark you have no reason to pass +1500mhz in games ( 1 guy reported he got bsod and crashes at +1700 for 2 week of gaming then reverted to 1500 )


You are overestimating RAM OC capabilities, +1500 is far from average.








3DMark.com search - www.3dmark.com

If you look at e.g. the Time Spy leaderboards (just the best submission per user) and filter for RAM OC (anything above 1313 / 10 500 MHz), you will see that out of 28 301 results with a RAM OC, only 6% have it higher than +1500; similar for Speed Way and Port Royal.

Here is a little table translating the bins to OC offsets:
Stock = 10 500 MHz (1313 * 8)
OC by 1000 = 11 500 MHz (1438 * 8)
OC by 1500 = 12 000 MHz (1500 * 8) 
OC by 2000 = 12 500 MHz (1562 * 8)
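The table above is plain offset arithmetic; as a quick sketch (assuming, per the table, that an Afterburner offset adds directly onto the ~10 500 MHz stock data rate, and that the per-chip clock tools report is simply data rate / 8):

```python
# Sketch of dk_mic's bin table: Afterburner memory offset -> absolute speed.
STOCK_MHZ = 10_500  # effective GDDR6X data rate at stock (~1313 MHz * 8)

def data_rate(offset_mhz: int) -> int:
    """Absolute effective data rate for a given Afterburner offset."""
    return STOCK_MHZ + offset_mhz

def chip_clock(offset_mhz: int) -> float:
    """The per-chip memory clock some tools report (data rate / 8)."""
    return data_rate(offset_mhz) / 8

print(data_rate(1500), chip_clock(1500))  # 12000 1500.0
```

So a "+1500" result on the leaderboard corresponds to the 12 000 MHz bin in the filter.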


----------



## kryptonfly

man from atlantis said:


> Has anyone tried undervolting at/around 875mV, What's the highest stable clocks you could get?


I can play The Witcher 3 at 2565MHz for 875mV set in Afterburner, but because of the temperature it falls back to 880mV minimum. Same with the stock BIOS (Giga-G-OC) and the Galax: same effective clock at this temperature, ~230W, 35°C (depending on whether the fans are on or not).

It seems the Endwalker benchmark at 4K is heavier than Time Spy, and harder on clocks too. I could pass at 3165MHz (3180MHz max).


----------



## leonman44

I have to order a waterblock as well now. I am looking at EK ones so it can be Aura Sync compatible, as 90% of my components are.

Should I pay more for the one with the active backplate, or is it useless other than aesthetics on the 4090?


----------



## yzonker

motivman said:


> Hey, I am not doubting what you are saying. Just saying, I put this card on a chiller and couldn't believe I could run my memory that high with low temps. The only change I made was using the original Bykski pads, and all my cold-memory issues were gone. When I had the Arctic TP-3 pads I couldn't even bench +1500, but now with the Bykski pads and the water chiller I am able to bench +1750 just like on the air cooler. Makes no sense.


No black screening/artifacting at idle at those temps with the max mem OC? Seems reasonable to me otherwise as I managed to get back to my air cooled mem clock as well, albeit with a backplate heater. Similar increase in mem temps as using the lower W/mK pads.

Looks like you can go lower in temp than me too. My TUF hits that cold bug point around 10-12C water and performance drops at the same clocks. Don't think that's the VRAM as increasing VRAM temps didn't change anything in regards to the cold bug.


----------



## yzonker

leonman44 said:


> I have to order a waterblock as well now , i am looking at ek ones so it can be aura sync compatible as 90% of my components are.
> 
> Should i pay more for the one with the active backplate or its useless other than aesthetics for the 4090?


No, definitely don't bother with the ABP. That would just potentially cause you to lose mem OC like we've been discussing.


----------



## Krzych04650

man from atlantis said:


> Has anyone tried undervolting at/around 875mV, What's the highest stable clocks you could get?


Not 875mV, that is too low and performance is going to end up below stock, but at 925mV reasonable clocks around 2700 seem pretty easy, and if you overclock the VRAM at the same time you will be around stock performance, sometimes a bit higher, sometimes a bit lower. The coil whine is also reduced massively.















I am not using it daily since this kind of performance drop is just too much in demanding games where you are fighting for every last frame, but for ones that run a lot faster, it makes sense to use it to reduce whine.


----------



## man from atlantis

Quake II RTX, 3840*1620,

| Core | VRAM | PWR | FPS | Normalized perf | Profile |
|------|------|-----|-----|-----------------|---------|
| 2535 | 3000 | ~348W | 104 | 102% | Curve method 2, [email protected], +1500MHz VRAM, 111% PL |
| 2745 | 3000 | ~421W | 109 | 107% | Curve method 2, [email protected], +1500MHz VRAM, 111% PL |
| ~2715 | 2625 | ~438W | 102 | 100% | Default, Palit GameRock OC |
| ~2940 | 3000 | ~487W | 112 | 110% | +200MHz core, +1500MHz VRAM, 111% PL |

I got a CTD at 2550MHz/875mV, so I dialed back to 2535MHz. I used to use Q2RTX as a stability test for my 3080; the game doesn't let me down.
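Side note on those numbers: the efficiency spread between profiles is large. Quick arithmetic on the posted (approximate) power and FPS figures:

```python
# FPS-per-watt for the Quake II RTX 3840x1620 profiles listed above.
# (power W, FPS) pairs copied from the post; power figures are "~" estimates.
profiles = {
    "curve profile 1 (2535 core)": (348, 104),
    "curve profile 2 (2745 core)": (421, 109),
    "default (Palit GameRock OC)": (438, 102),
    "+200 core, +1500 VRAM":       (487, 112),
}

baseline = 102 / 438  # default profile's FPS per watt
for name, (watts, fps) in profiles.items():
    eff = fps / watts
    print(f"{name}: {eff:.3f} FPS/W ({eff / baseline:.0%} of stock)")
```

The lowest-voltage curve profile works out to roughly 28% better FPS/W than stock, while the +200 core profile is slightly worse than stock per watt despite the higher FPS.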


----------



## Lmah2x

man from atlantis said:


> Has anyone tried undervolting at/around 875mV, What's the highest stable clocks you could get?


My card does +255 at 875mV, but performance drops below stock; off the top of my head about 5%, probably more. I think the sweet spot is 925mV up to 1V, on the 4090 at least, since you don't lose performance. If you want lower performance, it's usually better to just go with a lower-tier card.


----------



## J7SC

I am still wondering what the sometimes-referenced 'phantom limiters' are all about...as was already suggested, the Galax vbios does not seem to be acquainted with them. Below is a HWInfo capture from a bench run with the Galax, and that wasn't even at max clocks. In the Cyberpunk 2077 bench (4K, DLSS Quality), I am closing in on 90 fps, unlike with similar settings on the stock vbios.

At the same time, I remember the 76.3 billion transistors in a 4090 (compared to 28.3 billion in a 3090 and 18.6 billion in a 2080 Ti) - maybe those phantom limits are there for a reason ~🥴. I would like to use the Galax vbios for gaming as well (but limited to 1.05v and 100% PL ~ 500W), but it does behave like an XOC vbios w/o some safety limits (= phantom limiters?).


----------



## Sheyster

J7SC said:


> I would like to use the Galax vbios for gaming as well (but limited to 1.05v and 100% PL ~ 500W) but it does behave like a XOC w/o some safety limits (= phantom limiters?).


Galax HOF BIOS is 550w at 100% PL, I've been using it and been primarily playing MW2/WZ2 lately. So far the max I've seen it draw at 1.05v/stock clocks is 502w.


----------



## motivman

yzonker said:


> No, definitely don't bother with the ABP. That would just potentially cause you to lose mem OC like we've been discussing.


Yeah, an active backplate is a no-no this gen. You will lose performance with that.


----------



## Panchovix

J7SC said:


> I would like to use the Galax vbios for gaming as well (but limited to 1.05v and 100% PL ~ 500W) but it does behave like a XOC w/o some safety limits (= phantom limiters?).


I'm using it for gaming, but I undervolted, lol. The reason is that the effective clocks are almost identical when undervolting with the Galax VBIOS. I'm not sure if it's evading some safety limits, but temps and voltage seem "normal".


----------



## J7SC

Sheyster said:


> Galax HOF BIOS is 550w at 100% PL, I've been using it and been primarily playing MW2/WZ2 lately. So far the max I've seen it draw at 1.05v/stock clocks is 502w.





Panchovix said:


> I'm using it for gaming, but I undervolted lol. The reason is because the effective clocks are almost identical when undervolting with the Galax VBIOS. I'm not sure if it's evading some safety limits, but temps and voltage seem "normal"


Thanks. Yeah, I'm thinking about using the Galax for gaming, but as mentioned, limited to 1.05v (or even undervolted to 1.0v?) and PL at or below 100%. The stock vbios is OK for gaming, but the Galax is smoother even at the same clocks, IMO (limiters?). To switch between vbioses I always have to take that giant tempered glass panel off the TT C8, so I'd like to stay on the Galax, if it's safe for daily use with those more limited settings.


----------



## Krzych04650

J7SC said:


> I am still wondering what the sometimes referenced 'phantom limiters' are all about...as was already suggested, the Galax vbios does not seem to be acquainted with them. Below is a HWInfo from a bench run with the Galax, and that wasn't even at max clocks. In Cyberpunk 2077 bench (4K, DLSS Quality), I am closing in on 90 fps, unlike with similar settings on the stock vbios.
> 
> At the same time, I remember the 76.3 billion transistors in a 4090 (compared to 28.3 billion in a 3090 and 18.6 billion in a 2080 Ti) - may be those phantom limits are there for a reason ~🥴. I would like to use the Galax vbios for gaming as well (but limited to 1.05v and 100% PL ~ 500W) but it does behave like a XOC w/o some safety limits (= phantom limiters?).
> View attachment 2590897


Yea, it is really interesting; I've never seen anything like this. It is really bizarre how not only can you never actually reach the specified power limit on all the other BIOSes, but the point at which the power limit is hit also varies depending on the application/game.

The Galax BIOS is not entirely immune to it either. The only game where I can still hit ~620W power draw, even after getting a ~5% power-draw reduction from going on water, is Path of Exile with Global Illumination enabled and the right combination of resolution and framerate, and it does start reporting a power limit in Afterburner at around 620W, even if stuff like Furmark does go to 660W.

The 4090 definitely feels different from everything before it, a lot more advanced. It seems like there is a lot more going on in terms of "internal optimizations", or whatever you'd call it, and the few sliders we do have access to are not giving the same level of control as they used to.


----------



## Xdrqgol

Too bad EVGA is not around anymore; the KINGPIN would have been great!


----------



## leonman44

yzonker said:


> No, definitely don't bother with the ABP. That would just potentially cause you to lose mem OC like we've been discussing.





motivman said:


> Yeah, an active backplate is a no-no this gen. You will lose performance with that.


Thank you guys, but I won't lose any performance by installing the normal waterblock, right?


----------



## J7SC

Xdrqgol said:


> Too bad EVGA is not around anymore; the KINGPIN would have been great!


...agree, but you can always dry your tears with this...


----------



## 2080tiowner

Hi all, I've ordered a Gainward RTX 4090 Phantom. I currently have a 7900 XTX and would like to know whether the RTX 4090 is more efficient than the 7900 XTX, and whether I'll really get more FPS. Can you tell me? Thanks!


----------



## Brads3cents

J7SC said:


> ...agree, but you can always dry your tears with this...


If that card were actually available for sale, we could have a different discussion.


----------



## motivman

leonman44 said:


> Thank you guys , but i wont lose any performance by installing the normal waterblock right?


Right, just use the stock memory pads that come with the block.


----------



## J7SC

Brads3cents said:


> if that card was actually available for sale we could have a different discussion


...the Galax HoF is always hard to get, and the record breaker (LN2) is the OCL version, which is usually reserved for top OC team members (or $$$ on the secondary market).

Then it gets a bit weirder re. exact Galax HoF versions, depending on the *region* you're in (at least if they follow the pattern they used with the 3090 HoF):











Meantime, 4090 HoF OCL unboxing (remember that re. clocks, this really should be / will be on sub-zero)


----------



## StreaMRoLLeR

J7SC said:


> ...the Galax HoF is always hard to get; and the record breaker (LN2) is the OCL version which usually is just reserved for top oc team members (or $$$s in the secondary market).
> 
> Then it gets a bit weirder re. exact Galax HoF versions, depending on the *region* you're in (at least if they follow the pattern they used with the 3090 HoF):
> 
> View attachment 2590914
> 
> 
> 
> Meantime, 4090 HoF OCL unboxing (remember that re. clocks, this really should be / will be on sub-zero)


Only the OC Lab edition is binned. I really miss Kingpins, where the consumer cards in all regions were binned (though I remember some Kingpins from the EVGA forum were trash, unable to hold 2100MHz in 4K games). We've seen a regular HoF crashing with more than 1.100mV at 3120MHz, so it's nowhere near binned. AFAIK, for the OC Lab edition you need to have an HWBot score. The closest PCBs you can get to a HOF are the Suprim and Strix; the Strix even has an I2C header.


----------



## Muut

Gorod said:


> What are the statistics and observations so far to be considered poor, average and good overclocks for gpu and vmem ?
> Lets take maximum clocks that pass Port Royal for example. 3000 and +1200 on vmem is a pretty bad sample for 4090 right ?


I did 3150 (3125 effective because of the warmer weather at the moment). Mem is +1580.

I actually filmed the whole run; you can see my Afterburner settings at the end. Since the Galax BIOS helped me beat, on the first try, a previous high score that took 40+ trials on the stock BIOS, I wanted proof the run was not bugged. This was my second run, since I didn't think of filming the first time.









I scored 29 654 in Port Royal


Intel Core i9-13900KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





Link to youtube in the description


----------



## dboom

motivman said:


> Yeah, an active backplate is a no-no this gen. You will lose performance with that.


Nope, vram +1500 with it too.


----------



## Xdrqgol

Muut said:


> I did 3150 (3125 effective because of warmer weather at the moment). Mem is + 1580.
> 
> I actually filmed the whole run, you can see my afterburner settings at the end. As the Galax bios helped me beat my previous high score which took me 40+ trials on stock bios, on the 1st try, I wanted a proof the run was not bugged. This was my 2nd run, since I didn't think about filming the 1st time.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 654 in Port Royal
> 
> 
> Intel Core i9-13900KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> Link to youtube in the description


What was your score with your default bios?


----------



## Xdrqgol

J7SC said:


> ...agree, but you can always dry your tears with this...
> View attachment 2590904


True, but the KingPin was easier to get on the market and, as has been mentioned a couple of times here, they were all binned (or at least better binned than what you can find on the market).
_I had the KingPin every generation... they were simply built different._

HOF cards... uff... I tried 2, and the cheap plastic is just not something I liked. Not a huge fan unless it's the OC LAB edition, and even that one doesn't feel like a complete package for an average user (not talking about overclockers).

The KingPin has always been much better quality (not talking about the fastest).


----------



## dboom

Nico67 said:


> Doh, didn't think about that. I suppose its not to bad if you only have it as a last resort recovery


It's about the vbios DIP switch. You can switch between the OEM vbios and the Galax one, for example.


----------



## Nizzen

Xdrqgol said:


> True but - KingPin was easier to get on the market and as it has been mentioned a couple of times here - they were all binned (or at least better binned than what you can find on the market).
> _I had the KingPin every generation... they were simple built different._
> 
> HOF cards... uff... Itried 2 - and the cheap plastic it is just not something I liked. Not a huge fan unless is the OC LAB edition which even that one -doesn't feel as a complete package for an average user (not talking about overlockers).
> 
> KingPin has been always much better quality( not talking about the fastest)


You had many HOF cards?
I have the 3090 HOF, and I love it. The white PCB and voltage control are what I like best. It looks really good too.


----------



## Muut

Xdrqgol said:


> What was your score with your default bios?











I scored 29 475 in Port Royal


Intel Core i9-12900KS Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com




29475, but I'm now wondering if it's normal when I compare clocks with, for instance, this one:









I scored 28 747 in Port Royal


Intel Core i9-13900KF Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





Which is the one where I have the highest clocks / effective clocks. 

Good thing is that I still have headroom considering the clocks from the 28k run


----------



## yzonker

Xdrqgol said:


> True but - KingPin was easier to get on the market and as it has been mentioned a couple of times here - they were all binned (or at least better binned than what you can find on the market).
> _I had the KingPin every generation... they were simple built different._
> 
> HOF cards... uff... Itried 2 - and the cheap plastic it is just not something I liked. Not a huge fan unless is the OC LAB edition which even that one -doesn't feel as a complete package for an average user (not talking about overlockers).
> 
> KingPin has been always much better quality( not talking about the fastest)


KP wasn't binned. There were many that were duds just like any other card. Same was true for the 3090 Ti KP. The only thing they did was verify it could run the higher factory clocks, nothing else.


----------



## Xdrqgol

Nizzen said:


> You had many HOF cards?
> I have 3090 Hof, and I love it. White pcb and voltage control is what I like the best. Looks really good too.


I did but KP for me personally was much better!


----------



## GRABibus

J7SC said:


> I would like to use the Galax vbios for gaming as well (but limited to 1.05v and 100% PL ~ 500W) but it does behave like a XOC w/o some safety limits (= phantom limiters?).
> View attachment 2590897


I use it for gaming with the sliders set like this in MSI AB:
Voltage => 0
PL => 65%

In MW2 at 4K with +165MHz on core, this sits between 2900MHz and 3000MHz at 0.885V-1V, fully stable (still on the stock air cooler).

Max power 400W.


----------



## Xdrqgol

yzonker said:


> KP wasn't binned. There were many that were duds just like any other card. Same was true for the 3090 Ti KP. The only thing they did was verify it could run the higher factory clocks, nothing else.


The KP was built better; you should watch a bit of GM's coverage of the KingPin lab.
All the ones I had in my possession had very good clocks compared to the market. I haven't heard anyone say they had a "dud" that OCs worse than, for example, an Asus Strix... so you will have to show me. Yes, some KPs were better than other KPs, but overall all KPs were better than all these BS Asus, MSI, etc. cards.


----------



## yzonker

Xdrqgol said:


> KP was built better , you should watch a bit GM cover of King Pin lab.
> All of them I had in possession had very good clocks compared to the market. I haven't heard one saying he has a "dud" that is worse OC than for example an Asus strix ... so you will have to show me . Yes some KP were better than other KP but overall all KP were better than all these BS Asus, MSI,etc.


There were one or more in the 3090 thread. No clue who they were now. You just got lucky to not get a dud.


----------



## Xdrqgol

yzonker said:


> There were one or more in the 3090 thread. No clue who they were now. You just got lucky to not get a dud.


👀 I had over 200 pcs — I highly doubt that.


----------



## yzonker

Xdrqgol said:


> 👀 I had over 200 pcs — i highly doubt that.


What? You had 200 KP cards? How could you possibly get that many when you had to go through the queue?


----------



## Xdrqgol

yzonker said:


> What? You had 200 KP cards? How could you possibly get that many when you had to go through the queue?


well I will let you figure that out, it is pretty easy.


----------



## J7SC

IMO, it would be a close call between a late-model KingPin and Galax HoF - so I like samples of both, please !

With the 780 Ti series, the KP was still using mostly the same PCB as the Classifieds, and the Classies also used the same EVBot KP flash (pic), confirmed to me by KP's TIN...once they reached the RTX 2K series, the KingPin Edition PCBs (fortunately, to some extent) were much more free-standing, i.e. compared to the FTW-U, as the Classies had been abolished (accused of KP cannibalism) by then. Two of my 780 Ti Classies are happily marching on in a retro gamer (also below)...

As to clocks and duds, it all depends on your cooling solution - if anything, KPs would do much better when frozen and with extra voltage. Some of you may recall 'ye ol' ASIC table' in GPU-Z of old (top right) with corresponding info. FYI, while ASIC quality ratings are no longer given, using MSI AB to generate an unmolested (stock vbios, stock settings) VF curve kind of tells you the same thing if you have several cards to compare.


----------



## yzonker

Xdrqgol said:


> well I will let you figure that out, it is pretty easy.


I already did.


----------



## bmagnien

dk_mic said:


> You are overestimating RAM OC capabilities, +1500 is far from average.
> 
> 3DMark.com search - www.3dmark.com
> 
> If you look at e.g. the Time Spy leaderboards (just the best submission per user) and filter for RAM OC (anything above 1313 / 10 500 MHz), you will see that out of 28 301 results with a RAM OC, only 6% have it higher than +1500; similar for Speed Way and Port Royal.
> 
> Here is a little table to translate the bins to OC offset
> Stock = 10 500 MHz (1313 * 8)
> OC by 1000 = 11 500 MHz (1438 * 8)
> OC by 1500 = 12 000 MHz (1500 * 8)
> OC by 2000 = 12 500 MHz (1562 * 8)


Where are you downloading the raw data from the leaderboards? Assuming you had access to that in order to filter 28k+ results?


----------



## yzonker

bmagnien said:


> Where are you downloading the raw data from the leaderboards? Assuming you had access to that in order to filter 28k+ results?


I think maybe dk_mic was just using the values in the bar graph at the top and dividing by the total. I got 6% when I did that.


----------



## bmagnien

yzonker said:


> I think maybe dk_mic was just using the values in the bar graph at the top and dividing by the total. I got 6% when I did that.


Ah you’re probably right. Would be nice if they offered an export function though!


----------



## yzonker

bmagnien said:


> Ah you’re probably right. Would be nice if they offered an export function though!


I honestly don't know how meaningful that is. A lot of people don't really max out their card when they OC and run benchmarks.


----------



## wtf_apples

I'm trying to find a good review of the Bykski Granzon 4090 Series GPU Cooler Water Block for the Asus ROG Strix/TUF. Does anyone here have experience with it?


----------



## Vayne4800

Greetings all,

I recently bought a Palit GameRock 4090 OC. I read about and followed a bit of the discussion regarding the video of the PCB breakdown. I also historically tend to overclock my cards to the most stable year-round OC. Here are my questions; I would appreciate any help (note that I did my best to go through this thread, as it was very informative):

1. With consideration to the video, should I be concerned about OCing this card the way I used to OC older-generation cards on air (no mods, just basic Afterburner stuff)? I understand the gains aren't much, but like most of us I just enjoy knowing I am squeezing out every bit of performance (despite the inefficiency). I am considering maxing out the voltage and PL via Afterburner.
2. Temps on the GPU can reach 70C and hotspot 80-82C on the default fan curve, which is quiet. Coming from my Zotac 3090, these temps are great, considering the Zotac would hit 75C and 88C under load even after repasting and adding pads. Are the thresholds similar between the 3090 and 4090, or should I treat the 4090 differently, especially given the PCB analysis video?
3. The power adapter barely fits in my Corsair 750D. The way I would describe it is that the side panel doesn't bend it, but pushes ever so slightly. I hope that is fine? CPU cooler headroom is 170mm in this case. My PSU is an EVGA 1200W.
4. Using MSI Afterburner I got an OC of +75MHz on core. Anything at or near +150MHz seems unstable.
5. My memory, tested via the Heaven benchmark, starts showing corruption at +1700MHz. It was fine at +1500MHz with no performance degradation, judging by FPS.
6. ECC was disabled. Should I enable it when testing the VRAM OC and then disable it after I find my max?
7. For a few hours I used the following OC: +100% Vcore, +100MHz core, +1500MHz VRAM, 111% PL, 84C threshold (decoupled from PL).
8. All OC attempts have been with the default fan curve.
9. Currently I am using +75MHz on core and +1000MHz on VRAM, everything else at default.

Appreciate your feedback and opinions.

Thank you.
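On point 5 (finding the highest VRAM offset before corruption), the manual trial-and-error can be framed as a simple search. A minimal sketch, where `is_stable()` is a hypothetical stand-in for however you actually test a given offset (a Heaven loop, a CP2077 bench run, etc.), and the 0-2000 range and 25 MHz step are just illustrative:

```python
def find_max_stable_offset(is_stable, lo=0, hi=2000, step=25):
    """Binary-search the largest offset in [lo, hi] (multiples of `step`)
    for which is_stable(offset) is True. Assumes stability is monotonic:
    if an offset fails, every higher offset fails too."""
    lo, hi = lo // step, hi // step
    best = None
    while lo <= hi:
        mid = (lo + hi) // 2
        if is_stable(mid * step):  # run your real stability test here
            best = mid * step
            lo = mid + 1
        else:
            hi = mid - 1
    return best

# Example with a fake card that artifacts above +1510 MHz:
print(find_max_stable_offset(lambda off: off <= 1510))  # 1500
```

One caveat: stability is only roughly monotonic in practice (GDDR6X error correction can mask instability near the edge as a performance dip rather than visible corruption), so it's worth re-testing the found value for a longer run.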


----------



## 8472

Just ordered an FE from Best Buy.

Does anyone know if they charge a restocking fee for opened GPUs? I'm a Totaltech member, FWIW.


----------



## J7SC

Vayne4800 said:


> Greetings all,
> 
> I recently bought a Palit Gamerock 4090 OC. I read about and followed a bit on the discussion regarding the video of the PCB breakdown. I also tend to historically overclock to the most stable round the year OCs on my cards. Here are my questions and would appreciate helping me out (do note that I did my best to go through this thread as it was very informative):
> 
> 
> With consideration to the video, should I be concerned to OC my card similarly to how I used to do in old generation cards on air? (no mods, just basic Afterburner stuff). I understand the gains aren't much but I just overclock like most of us and enjoy knowing I am squeezing every bit of performance (despite efficiencies). I am considering maxing out the voltage and PL via Afterburner.
> Temps on GPU can reach 70C and Hotspot 80-82C on default fan curve. Which is quite and coming from my Zotac 3090, these temps are so good considering my Zotac 3090 would hit 75C and 88C under load and repasting and added pads. Are the threshold similar between 3090 and 4090 or should I concern myself differently with the 4090, especially given the PCB analysis video?
> The power adapter is barely fitting in my Corsair 750D. The way I would describe it is that the sidepanel doesn't bend it but pushes every so lightly. I hope that is fine? CPU Cooler headroom is 170mm on this case. My PSU is EVGA 1200w.
> I used MSI afterburner and got an OC of +75Mhz on core. Above +150Mhz or near it seems unstable.
> My memory and via Heaven Benchmark would start showing corruption at +1700Mhz. It was fine at +1500Mhz with no performance degradation through observing FPS.
> ECC was disabled. Should I enable it when testing for VRAM OC and then disable it after I find my max?
> For a few hours, I used the following OC: +100% Vcore, +100Mhz Core, +1500Mhz VRAM, 111% PL, 84C Threshold (decoupled from PL).
> All OC attempts have been with default fan curve.
> Currently using the following: +75Mhz on Core and +1000Mhz on VRAM. Everything else on default.
> 
> Appreciate your feedback and opinions.
> 
> Thank you.


...only a select few points re. your questions...
4090s do behave a bit differently re. heat than 3090s. An 82C hotspot is OK (depending on the ambient temp in the room), but you want to avoid the hotspot getting to 90C and above. Memory temps in the 50C+ range are fine. Your power adapter situation sounds a bit more serious re. bending; do make sure that after the squeezed mounting you mentioned, it stays fully inserted (no gap) where it fits into the card. Re. VRAM, ECC is obviously more costly re. performance, but I have run ECC vs non-ECC (posted here a few weeks back) and the actual top MHz was the same. Heaven is a good test, as are CP2077 (RTX) and Metro Exodus RTX (and others). I use MSI AB with either sliders or the curve, and I find it is all I really need for daily use. BTW, when you ran +100% on voltage, +100MHz on core, +1500MHz on VRAM and 111% PL for a few hours, any issues?

---

For everyone's amusement...
I have been pushing my old (Dec '18) work+play Threadripper w/2x Aorus XTR WF WB 2080 Tis after refurbishing bits and flashing a new bios etc. for an upcoming mod. Check out those cool PerfCap reasons. The cards can actually clock a bit higher, but then it is OCP time. I also liked it back in the day when PCIe inputs were more or less balanced and PCIe slot power actually meant something.

With an OC'ed TR at about 340W and another 175W or so on peripherals (3x D5s etc.), the 1300W Platinum PSU started to breathe heavier...2x 2080 Ti is 'still only' 37.2 billion transistors, or less than half of what our 4090s have. Maybe I should not worry about the 670W+ the Galax 4090 vbios peaks at, compared to the 750W or so for the combined GPUs below, with less performance...


----------



## Helmbo

Just bought a Gigabyte 4090 OC, replacing my watercooled 6900XT - so I have to find a good block.

Question - why is everyone against EKWB's active backplate? Something about memory?


----------



## mirkendargen

Helmbo said:


> Just bought a Gigabyte 4090 OC, replacing my watercooled 6900XT - so I have to find a good block.
> 
> Question - why is everyone against EKWB's active backplate? Something about memory?


Why bother with an active backplate when there are no components on the back?


----------



## Agent-A01

mirkendargen said:


> Why bother with an active backplate when there are no components on the back?


Yeah seems gimmicky. Maybe will be good for the 4090Ti with 48GB of VRAM.

What is the best WB for the FE? I just got one from best buy too and can't decide on one.
Typically have bought EKWB in the past.


----------



## Helmbo

mirkendargen said:


> Why bother with an active backplate when there are no components on the back?


Because it's the only block I can find for that card, besides a Bykski, which I'm not interested in. Do you know of any others? That would be a great help.

Edit - I think I literally just found it - Alphacool Eisblock Aurora Geforce RTX 4090 Aorus Master - Gaming mit Backplate


----------



## J7SC

Helmbo said:


> Because it's the only block I can find for that card, besides a Bykski, which I'm not interested in. Do you know of any others? That would be a great help.


I have the 4090 Gigabyte Gaming OC w/the Bykski block (right next to a 6900XT w/ Byksk in the same case) and love them. That said, Phanteks has a new block for the Gigabyte Gaming OC/Aorus Master > here. I like the Phanteks block on my 3090 Strix a lot and also use their CPU blocks...I had switched the 3090 Strix over from an EKWB block which had several quality issues...


----------



## Helmbo

J7SC said:


> I have the 4090 Gigabyte Gaming OC w/the Bykski block (right next to a 6900XT w/ Byksk in the same case) and love them. That said, Phanteks has a new block for the Gigabyte Gaming OC/Aorus Master > here. I like the Phanteks block on my 3090 Strix a lot and also use their CPU blocks...I had switched the 3090 Strix over from an EKWB block which had several quality issues...



Just bought the Alphacool - should be here in week 1 of 2023. But thank you regardless.


----------



## mirkendargen

Helmbo said:


> Because it's the only block I can find for that card, besides a Bykski, which I'm not interested in. Do you know of any others? That would be a great help.
> 
> Edit - I think I literally just found it - Alphacool Eisblock Aurora Geforce RTX 4090 Aorus Master - Gaming mit Backplate


Get interested in the Bykski like everyone else, me included, and problem solved lol.


----------



## J7SC

Helmbo said:


> Just bought the Alphacool  - should be here in week 1 2023. but thank you regardless.


I have never had Alphacool, but I look forward to trying them one day soon. Anyway, you should record your facial expression once you take that giant air-cooler off and see the size of the PCB... 🙃


----------



## Helmbo

J7SC said:


> I have never had Alphacool, but I look forward to trying them one day soon. Anyway, you should record your facial expression once you take that giant air-cooler off and see the size of the PCB... 🙃


When I look through the fins of the cooler, there is no board!! So I can imagine it's actually super tiny compared to my 6900XT.


----------



## StreaMRoLLeR

@J7SC What are your 1.050V - 1.100V curve values?

Assuming you have a golden bin, is it 2900+ at 1.100V on the default curve?


----------



## J7SC

StreaMRoLLeR said:


> @J7SC What are your 1.050V - 1.100V curve values?
> 
> Assuming you have a golden bin, is it 2900+ at 1.100V on the default curve?


...well, as it happens, just did a quick bench run before counting sheep


----------



## tomerturbo

J7SC said:


> ...well, as it happens, just did a quick bench run before counting sheep
> View attachment 2591012


How did you manage to do a curve overclock above 3000MHz in MSI Afterburner? When I open the curve, it is limited to 3000MHz.


----------



## WilliamLeGod

any1 here getting the 4090ti if they release?


----------



## Benni231990

WilliamLeGod said:


> any1 here getting the 4090ti if they release?


Shut up and take my Money


----------



## Muut

tomerturbo said:


> How did you manage to do a curve overclock above 3000MHz in MSI Afterburner? When I open the curve, it is limited to 3000MHz.












change both lines to 

VFCurveEditorMinFrequency = 1000
VFCurveEditorMaxFrequency = 3500

for example
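If you'd rather script that tweak than hand-edit the file, a minimal sketch is below. The path is only the default install location (an assumption - adjust it to your setup), and make sure Afterburner is closed first, since it rewrites the cfg on exit.

```python
# Sketch of scripting the VF curve editor limit tweak described above.
# CFG is the default install path (an assumption); close Afterburner first,
# because it rewrites the file when it exits.
import re

CFG = r"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.cfg"

def widen_curve_limits(text, lo=1000, hi=3500):
    """Rewrite the VF curve editor frequency bounds in the cfg text."""
    text = re.sub(r"VFCurveEditorMinFrequency\s*=\s*\d+",
                  f"VFCurveEditorMinFrequency = {lo}", text)
    text = re.sub(r"VFCurveEditorMaxFrequency\s*=\s*\d+",
                  f"VFCurveEditorMaxFrequency = {hi}", text)
    return text

# Usage (uncomment to apply):
# cfg = open(CFG, encoding="utf-8").read()
# open(CFG, "w", encoding="utf-8").write(widen_curve_limits(cfg))
```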


----------



## tomerturbo

Muut said:


> View attachment 2591015
> 
> 
> change both lines to
> 
> VFCurveEditorMinFrequency = 1000
> VFCurveEditorMaxFrequency = 3500
> 
> for example
> 
> 
> View attachment 2591016


thank you very much


----------



## StreaMRoLLeR

J7SC said:


> ...well, as it happens, just did a quick bench run before counting sheep
> View attachment 2591012


Well... that's not what I asked. Please press reset, then show the default values at 1.05 and 1.1.


----------



## Vayne4800

J7SC said:


> ...only a select few points re. your questions...
> 4090s do behave a bit differently re. heat than 3090s. 82C Hotspot is OK (depending on the ambient temp in the room), but you would want to avoid the Hotspot getting to 90C and above. Memory temps in the 50C+ range are OK. Your power adapter sounds a bit more serious re. bending. Do make sure that after the squeezed mounting you mentioned, it stayed fully inserted (no gap) where it plugs into the card. Re. VRAM, ECC is obviously more costly re. performance, but I have run ECC vs non-ECC (posted here a few weeks back) where the actual top MHz speed was the same. Heaven is a good test, as is CP 2077 (RTX) and Metro Exodus RTX (and others). I use MSI AB w/either sliders or curve and find it is all I really need for daily use. BTW, when you ran +100% on voltage, +100 MHz on core, +1500 MHz on VRAM and 111% on PL for a few hours, any issues?


In regards to cable bending, if I leave the case open, it barely makes a difference; hence it is very lightly pushing against the sidepanel at roughly the midpoint of the cable length. It is firmly in place.

As for ECC, I was thinking of testing VRAM OC stability with it enabled until performance degrades, and stopping there. After that I would turn ECC off, since there is no point keeping it on once I have found the OC.

No issues with that OC I used. I just backed out due to the fearmongering regarding the Palit PCB. Hence my original post.
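The ECC-on search described above can be sketched as a simple loop. Everything here is hypothetical: `run_bench` stands in for any scripted benchmark (e.g. a Heaven pass returning average fps); nothing talks to the driver.

```python
# Illustrative sketch of the ECC-on VRAM offset search: step the memory
# offset up until the benchmark's fps drops (ECC correcting errors costs
# bandwidth), then return the last offset that did not degrade.
# run_bench(offset) is a hypothetical callback returning average fps.
def find_mem_offset(run_bench, step=250, limit=2000, tolerance=0.01):
    prev = run_bench(0)
    best = 0
    for offset in range(step, limit + step, step):
        fps = run_bench(offset)
        if fps < prev * (1 - tolerance):  # throughput fell: errors being corrected
            break
        best, prev = offset, fps
    return best
```

With ECC masking the artifacts, the fps drop itself is the error signal, which is the whole point of testing with it enabled.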


----------



## Edge0fsanity

WilliamLeGod said:


> any1 here getting the 4090ti if they release?


Depends on how much faster it is and block compatibility. It would need to be 15-20% faster and my block would need to fit. I suspect it'll be 10% faster and no block will work due to dual 12vhpwr connectors.


----------



## kryptonfly

StreaMRoLLeR said:


> @J7SC What are your 1.050V - 1.100V curve values?
> 
> Assuming you have a golden bin, is it 2900+ at 1.100V on the default curve?


We should also consider the temp, as a higher temp will decrease the curve a little after playing a game, for example. Mine is with the stock vbios Giga-G-OC v95.02.18.80.5F and a Bykski WB at 21°C.
1.10v :









1.05v :


----------



## alasdairvfr

WilliamLeGod said:


> any1 here getting the 4090ti if they release?


For me it really depends on when in the product lifecycle of the Lovelace arch it launches. The 3090 Ti came out way too late and cost way too much; I was mildly tempted but ended up getting the 4090 only 6 months or so later. If the 3090 Ti had come out when the 3080 Ti released, it would have been an easier and more obvious upgrade (I went from 3080 -> 3080 Ti at that time).


----------



## Muut

J7SC said:


> ...well, as it happens, just did a quick bench run before counting sheep
> View attachment 2591012












So am I 😜


----------



## StreaMRoLLeR

Muut said:


> View attachment 2591028
> 
> 
> So am I 😜


What did you test at age of empires 2 ? xD


----------



## Muut

StreaMRoLLeR said:


> What did you test at age of empires 2 ? xD


Windows explorer and Discord


----------



## ArcticZero

Helmbo said:


> Just bought the Alphacool  - should be here in week 1 2023. but thank you regardless.


Also ordered an Alphacool, specifically the new Core block. Mine ships week 3 though but it should be fine. Excited to compare results with other blocks here.


----------



## motivman

ArcticZero said:


> Also ordered an Alphacool, specifically the new Core block. Mine ships week 3 though but it should be fine. Excited to compare results with other blocks here.


Y'all are missing out by skipping the Bykski block TBH. I get a 20C delta at 570W with that block. I can't imagine the Alphacool performing better, plus the Bykski looks surprisingly good in person compared to pictures.
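Delta-T per watt is a handy way to compare block results taken at different power draws. A tiny sketch (the only real number here is the 20C/570W figure above; anything else would be your own readings):

```python
# Back-of-envelope block comparison: degrees of GPU-to-coolant delta per
# watt dissipated. Lower is better; it normalizes away the power draw.
def thermal_resistance(delta_c, watts):
    return delta_c / watts  # °C per watt

bykski = thermal_resistance(20, 570)  # the 20C @ 570W figure quoted above
print(f"{bykski:.4f} C/W")
```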


----------



## leonman44

My baby just arrived !!! 
























Bigger than my keyboard, and very premium metal build quality.
So sad that I have to run a blitz kit for my loop so I can clean everything inside, then uninstall everything from my PC case, cut the bottom of it to fully open up the airflow, remove the side glass and the top metal piece, and then reverse all my radiators to exhaust. I could hardly handle the temps of the 2080 during the summer with a 10-12C liquid delta over ambient, which can reach 32-33C, so the liquid gets to a dangerous 45C!
The cooling system won't survive the 4090 as a closed, filtered case. Also, a new PSU...

I have so much work to do to make this work as well as it can.


----------



## Sheyster

WilliamLeGod said:


> any1 here getting the 4090ti if they release?


Depends on the performance uplift I suppose. And supply.. I won't pay over MSRP for it.


----------



## J7SC

tomerturbo said:


> How did you manage to do a curve overclock above 3000MHz in MSI Afterburner? When I open the curve, it is limited to 3000MHz.


...edited the config file; please see @Muut 's informative post above



StreaMRoLLeR said:


> Well... that's not what I asked. Please press reset, then show the default values at 1.05 and 1.1.
> 
> View attachment 2591018
> 
> View attachment 2591017


...maybe not what you asked, but it is what I answered 
...anyway, stock curve, then picked 3210 @ 1.1v and shifted the whole curve up to match.
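The "pick a point, shift everything to match" move can be sketched in a few lines. The stock points below are made up for illustration, not an actual 4090 curve:

```python
# Rough illustration of shifting a VF curve: anchor one voltage point at a
# target clock and move every other point up by the same offset.
stock_curve = {1.050: 2850, 1.075: 2900, 1.100: 2970}  # V -> MHz (hypothetical)

def shift_curve(curve, anchor_v, target_mhz):
    """Offset the whole curve so curve[anchor_v] lands on target_mhz."""
    offset = target_mhz - curve[anchor_v]
    return {v: mhz + offset for v, mhz in curve.items()}

shifted = shift_curve(stock_curve, 1.100, 3210)  # every point moves up equally
```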



Muut said:


> Windows explorer and Discord


...must try those with my next boost-up; I used Superposition 8K for my testing >here


----------



## Shoggoth

WilliamLeGod said:


> any1 here getting the 4090ti if they release?



Not a chance. The stock 4090 is already overkill on my 3440 screen and will last me for a good few years before I need to think about an upgrade.


----------



## GRABibus

Muut said:


> View attachment 2591015
> 
> 
> change both lines to
> 
> VFCurveEditorMinFrequency = 1000
> VFCurveEditorMaxFrequency = 3500
> 
> for example
> 
> 
> View attachment 2591016


Not bad


----------



## GRABibus

WilliamLeGod said:


> any1 here getting the 4090ti if they release?


No.
The only upgrade for me in the next months will be CPU (7000 X3D or AMD next gen).


----------



## Shoggoth

Helmbo said:


> Just bought the Alphacool  - should be here in week 1 2023. but thank you regardless.


I mounted the Alphacool block here a couple of days ago and I'm pretty happy with it. The highest I've seen is a hotspot temperature of 62C after a couple of hours of Superposition, which'll do me nicely.


----------



## yzonker

The "vs" mode in Speedway is a great way to check the legitimacy of runs on the HOF. You can easily see artifacted runs by comparing the fps trace between your machine running without artifacts and other runs in the 3DMark database. Too bad the other 3DMark benchmarks don't have this.

I just run on my normal OS with everything running, etc., so my trace is well below the ones I tested against, but you can easily see in the 2nd one where it artifacted. I won't name names, other than to say the first comparison, which is obviously legit, was @kryptonfly 's 11394 score. Not singling kryptonfly out, just grabbed that run as I was fairly confident it was legit.

My trace is always the lower orange line.
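The shape check above could be automated along these lines. Purely a hypothetical sketch: normalize both fps traces by their medians so different absolute speeds overlay, then flag frames where the shapes diverge (an artifacted run shows a bump its clean counterparts lack).

```python
# Sketch of spotting an artifacted run by comparing two fps traces of the
# same benchmark. Median-normalizing makes traces from differently-clocked
# rigs comparable; a sharp divergence marks where one run "bugged".
import statistics

def flag_divergence(trace_a, trace_b, threshold=0.10):
    """Return frame indices where the normalized traces diverge."""
    med_a = statistics.median(trace_a)
    med_b = statistics.median(trace_b)
    return [i for i, (a, b) in enumerate(zip(trace_a, trace_b))
            if abs(a / med_a - b / med_b) > threshold]
```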

Legit,









Artifacted,


----------



## Sheyster

Shoggoth said:


> Not a chance. The stock 4090 is already overkill on my 3440 screen and will last me for a good few years before I need to think about an upgrade.


Considering that a Samsung 4K240 monitor is already available now, there will be a need for the Ti.


----------



## Muut

yzonker said:


> The "vs" mode in Speedway is a great way to check the legitimacy of runs on the HOF. You can easily see artifacted runs by comparing the fps trace between your machine running without artifacts and other runs in the 3DMark database. Too bad the other 3DMark benchmarks don't have this.
> 
> I just run on my normal OS with everything running, etc... so my trace is well below the ones I tested against, but you can easily see in the 2nd one where it artifacted. I guess I won't name names other than the first comparison that is obviously legit was @kryptonfly 's 11394 score. Not singling Kryptonfly out, just grabbed that run as I was fairly confident it was legit.
> 
> My trace is always the lower orange line.
> 
> Legit,
> View attachment 2591063
> 
> 
> Artifacted,
> 
> View attachment 2591064



I had made the same comparison with Port Royal on HWBot's Discord and posted what my legit runs looked like compared to my "bugged run".









I could see each run was identical when looking at the curve's shape. The last curve had a bump at the end, which is where I artefacted. 
Not the most scientific way to compare but good enough to spot bugged runs and legit runs. 

It would be nice if they implemented such a feature everywhere but it would only work on GPU bound benchmarks where your GPU is at 99% load all the time (unlike firestrike for example)


----------



## Xdrqgol

Sheyster said:


> Considering that a Samsung 4K240 monitor is already available now, there will be a need for the Ti.


Whatever they do for the “Ti” version will not be worth it, because it will consume more watts, and how many more FPS do you think it will have compared to the 4090? Definitely not enough to be worth the money.
This product will be aimed only at enthusiasts who want to have the latest no matter what.

I have the G8 and my 4090 runs well enough in 4K for now. The only upgrade worth it is the next-generation 5000 series!


----------



## Shoggoth

Sheyster said:


> Considering that a Samsung 4K240 monitor is already available now, there will be a need for the Ti.



Sure enough, but only for those interested in the kind of games which benefit from 240Hz and I'm not among them.


----------



## dpoverlord

Odd Question coming from a 3090 user.

Running games super smooth on my EVGA 3090 FTW3 + 5930K at 4K.

A few things run lower, and I was thinking my money would be better spent getting a 4090 vs upgrading my Asus Rampage V Extreme platform.

A few people were telling me that upgrading the mobo/CPU would be a better bet. Any thoughts?


----------



## Sheyster

Xdrqgol said:


> This product will be aimed only at enthusiasts who want to have the latest no matter what.


I guess this goes without saying, but you're in the right place for that!  Plenty of folks like that here.


----------



## Sn0wHookr

Just picked up a Gigabyte 4090 Gaming OC, which is a beast of a card coming from a GTX 1080 Ti. Managed to fit it in my Fractal Meshify 2 Compact without any room to spare.
I noticed that on stock settings the boost clocks only go up to 2640MHz, where most reviews show the card boosting to between 2700-2800MHz at stock. Any idea why mine doesn't boost as high? Using a Corsair RM850x with the official Corsair 12VHPWR cable.
I've done some overclocking and settled on an 80% power limit with +315 core and +1200 memory. The core clocks now boost to between 2900-3000MHz at lower power, so I'm pretty happy with that.


----------



## SilenMar

GosuPl said:


> RTX 3090 STRIX vs RTX 3090 Ti vs RTX 4090 STRIX
> FP32 / RT ON / DLSS ON
> 1440p / 4K
> 
> Test platform :
> 
> i9-13900K 5.7 GHz P/ EOFF / 5.0 Ring EVGA Z690 CLASSIFIED
> G.Skill Trident Z5 6400 MHz CL32.38.38.32 T2 G2 + II/III
> RTX 3090 STRIX PT 480W / OC + 80 / + 1100 Fan ~ 2200 RPM
> RTX 3090 Ti FTW3 PT 480W / OC + 90 / + 800 Fan ~ 2200 RPM
> RTX 4090 STRIX PT 600W / OC + 100/ + 800 Fan ~ 2200 RPM
> be quiet! Dark Pro 12 1500 (bq! 12VHPWR for RTX 3090 Ti / RTX 4090)
> LC 360/45 + 240/45 + 1x D5 for CPU
> Win 10 Pro
> Nvidia Drivers 526.47
> 
> Pure testing with full OSD information on each GPU starts at 15:09 and lasts 50:58.
> 
> Before and after the tests, we discuss the topic in Polish. I recommend jumping straight to the tests, unless you know Polish, in which case I invite you to listen.


I wonder where your SLI comparison is now.

Back in 2017, 8700K SLI started to beat 6950X SLI due to less CPU bottleneck, regardless of PCIe lanes.


----------



## J7SC

Sheyster said:


> I guess this goes without saying, but you're in the right place for that!  Plenty of folks like that here.


...yes, that old familiar slippery slope...get a new GPU that can max the 4K120 monitor; go for a new GPU after that and then 'need' an 8K240 Hz monitor; get the new 8K240 monitor and go for a new....

I am feeling a bit better though re. waiting for certain other upgrades. My 5950X / Asus DarkH combo is still more than capable even w/the 4090, and I skipped the whole AM5 / 13900 K 'upgrade' because life isn't all about ego benching.

Still, a new year is knocking on the door and soon, my 5950X setup will be two years old, so...?


----------



## dpoverlord

J7SC said:


> ...yes, that old familiar slippery slope...get a new GPU that can max the 4K120 monitor; go for a new GPU after that and then '''need''' a 8K240 Hz monitor; get a new 8K240 monitor and go for a new....
> 
> I am feeling a bit better though re. waiting for certain other upgrades. My 5950X / Asus DarkH combo is still more than capable even w/the 4090, and I skipped the whole AM5 / 13900 K 'upgrade' because life isn't all about ego benching.
> 
> Still, a new year is knocking on the door and soon, my 5950X setup will be two years old, so...?
> View attachment 2591077


Ah, that's my slippery slope. Load up Skyrim, max out mods... GPU is not strong enough.
Next upgrade cycle, step and repeat... never play Skyrim.
Got a 3090 and am now aiming for the 4090, but I have been "HOLDING" on the CPU/chipset upgrade for my aging X99. Think the 7950X3D is the key to go with my 4090? It sounds like the Intel chipset is falling very behind.

Am I crazy for using a 5930K / X99 Rampage with the incoming 4090, or would I be better off staying with the 3090 until I upgrade my platform?


----------



## J7SC

dpoverlord said:


> Ah thats my slippery slope. Load up Skyrim, max out mods.... GPU is not strong enough.
> Next Upgrade cycle, step and repeat .... Never play Skyrim.
> Got a 3090 and now aiming for the 4090, but I have been "HOLDING" on the CPU / Chipset upgrade for my aging X99. Think the 7950X3d is the key to go with my 4090? It sounds like the Intel Chipset is falling very behind.
> 
> Am I crazy for using a 5930k / X99 Rampage with the incoming 4090 or would I be better of staying with the 3090 until I upgrade my platform?


...my 3090 Strix did just fine on the X99 / 5960X Intel HEDT, and the 4090 has no problem whatsoever with a fastish & heavily w-cooled 5950X / X570 - though I never did try the 4090 on the X99 because I don't have that X99 setup anymore (still have another X99 mobo somewhere, but I think it has PCIe slot issues). The 4090s, with their extra bandwidth, probably would miss out on an X99, even at 4K - 4090s are simply that fast.

...the AM5 socket is supposed to stick around for a few gens (unlike LGA1700), and the 7950X3D would be / will be a gaming monster with its giant cache. CES '23 starts in 8 days - by then we should know more.


----------



## Xdrqgol

J7SC said:


> ...yes, that old familiar slippery slope...get a new GPU that can max the 4K120 monitor; go for a new GPU after that and then '''need''' a 8K240 Hz monitor; get a new 8K240 monitor and go for a new....
> 
> I am feeling a bit better though re. waiting for certain other upgrades. My 5950X / Asus DarkH combo is still more than capable even w/the 4090, and I skipped the whole AM5 / 13900 K 'upgrade' because life isn't all about ego benching.
> 
> Still, a new year is knocking on the door and soon, my 5950X setup will be two years old, so...?
> View attachment 2591077


we wouldn’t be here ofc…

7950x3D is something I am waiting as well , but for the 4090 ti definitely NO.

i hope besides being fast , that we can push 7950x3D further than 6400mhz on MeM side…


----------



## Blameless

Picked up my Gigabyte Windforce today. Haven't installed it yet, but I'm pleasantly surprised by the general construction and cleanliness. It is missing the thermal pads on one of the memory power stages that others have noted, but it's close enough to the edge of the board that I can reach it with the 10 gauge dispensing needles I've got, so I'll just pump some thermal putty into it right away.

Warranty sticker on the backplate screw also came off without any damage. Not going to putty the backplate until I'm sure I'm keeping this sample (there is no thermal connection to the backplate, other than a single pad that covers half of one row of input filtering capacitors), but it should be very easy to do.

Everything else looks solid, so I doubt I'll even bother to fully dismantle it, unless there are demonstrable thermal issues.



Xdrqgol said:


> i hope besides being fast , that we can push 7950x3D further than 6400mhz on MeM side…


Wouldn't count on it, as IOD is going to be the same. It's also not likely going to really benefit from maxing out memory performance either. When I build a 7800X3D setup, it's getting the cheapest dual-rank Hynix M-die I can find to go with it.


----------



## dpoverlord

J7SC said:


> ...my 3090 Strix did just fine on the X99 / 5960 Intel HEDT, and the 4090 has no problem whatsoever with a fastish & heavily w-cooled 5950X / X570 - though I never did try the 4090 on the X99 because I don't have that X99 setup anymore (still have another X99 mobo somewhere but I think it has PCIe slot issues). The 4090s with their extra bandwidth probably would miss out on a X99, even at 4K - 4090s are simply that fast.
> 
> ...AM5 socket is supposed to stick around for a few gens (unlike LGA1700) and the 7950X3D would be / will be a gaming monster with its giant cache. CES '23 starts in 8 days - by then we should know more.


Thanks @J7SC, it sounds like I had a similar X99 build to yours. My EVGA 3090 FTW3 does pretty well at 4K gaming, 120FPS. I did a double take with the 4090 because I saw how much better it did at 4K, so I thought it would do the same for my X99 system. I never even considered that the X99 would not have the bandwidth. Intel's Raptor Lake seems EOL, so I had been waiting for Meteor Lake.

I realize now that the new AMD platform with a 7950X3D would really be the way to go. Think I should return the 4090 and just wait and save the money for that, or stay with the 4090 and save a bit more? Curious now which would give me the largest improvement for 4K gaming.


----------



## leonman44

Blameless said:


> Picked up my Gigabyte Windforce today. Haven't installed it yet, but I'm pleasantly surprised by the general construction and cleanliness. It is missing the thermal pads on one of the memory power stages that others have noted, but it's close enough to the edge of the board that I can reach it with the 10 gauge dispensing needles I've got, so I'll just pump some thermal putty into it right away.
> 
> Warranty sticker on the backplate screw also came off without any damage. Not going to putty the backplate until I'm sure I'm keeping this sample (there is no thermal connection to theb backplate, other than a single pad that covers half of one row of input filtering capacitors), but it should be very easy to do.
> 
> Everything else looks solid, so I doubt I'll even bother to fully dismantle it, unless there are demonstrable thermal issues.
> 
> 
> 
> Wouldn't count on it, as IOD is going to be the same. It's also not likely going to really benefit from maxing out memory performance either. When I build a 7800X3D setup, it's getting the cheapest dual-rank Hynix M-die I can find to go with it.


I noticed the warranty sticker on my ASUS card too, and it made me worry that I might void my warranty if I just install the waterblock. How did you manage to remove the warranty sticker without ruining it?


----------



## Blameless

leonman44 said:


> I noticed the warranty sticker on my ASUS card too, and it made me worry that I might void my warranty if I just install the waterblock. How did you manage to remove the warranty sticker without ruining it?


The one on my card wasn't one of the pre-slit ones designed to fall apart if it's tampered with and it was also placed off-center enough that I could fit a dental probe under the edge. I also removed it before applying any power to the card, so the adhesive was likely never heated.

Anyway, I'd check consumer protection laws in your area. These stickers are technically meaningless in the States...I just pull them and save them to avoid wasted time with ignorant or confrontational support personnel.


----------



## KingEngineRevUp

Xdrqgol said:


> but for the 4090 ti definitely NO.


Yeah, if history repeats itself, you have about 6 months to enjoy it. Then the 5090 will be out at a cheaper price.


----------



## Chili195

Think my FE is a fairly solid yet unspectacular performer.

I can get it to +180 (so it just briefly touches 3000 @ 1.1v) and +1100 on the memory. I'm not convinced it's stable, so I will probably have to lower that. Voltage/PL sliders at max.

Port Royal (with memory at +1200):








I scored 27 670 in Port Royal - AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11 (www.3dmark.com)





Timespy:








I scored 29 853 in Time Spy - AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11 (www.3dmark.com)





If I increase either further I get a crash, usually when loading or just after load.

The thing is I've had the same crash with same error a couple of times even running at stock so I'm not sure if there is an underlying issue. The memory seems particularly sensitive.

I did wonder if my power supply comes into play. It's an EVGA 850W G2 used since 2016, so it might possibly be struggling? I'm running it with a 5800X3D, two NVMes, 3 SSDs, 1 HDD and a D5.

Also does anyone else's FE have a weird wobbly fan? The top fan on mine (on the backside) "wobbles" a bit when running at slower speeds whereas the other one looks fine (but that one makes a grinding noise when at max).

Video (begins about 20 seconds in)

I was going to put a waterblock on it even though it looks like I'm not going to get a huge amount out of it but just wanted to make sure it was free of issues before I open it up. I'm just looking for any excuse to exchange it really.


----------



## yzonker

Chili195 said:


> Think my FE is a fairly solid yet unspectacular performer.
> 
> I can get it to +180 (so it just briefly touches 3000 @ 1.1v) and +1100 on the memory. I'm not convinced its stable so will probably have to lower that. Voltage/PL sliders at max.
> 
> Port Royal (with memory at +1200):
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 27 670 in Port Royal - AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11 (www.3dmark.com)
> 
> 
> 
> 
> 
> Timespy:
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 29 853 in Time Spy - AMD Ryzen 7 5800X3D, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11 (www.3dmark.com)
> 
> 
> 
> 
> 
> If I increase either further I get a crash, usually when loading or just after load.
> 
> The thing is I've had the same crash with same error a couple of times even running at stock so I'm not sure if there is an underlying issue. The memory seems particularly sensitive.
> 
> I did wonder if my power supply comes into play. It's an EVGA 850w G2 used since 2016 so it might be struggling possibly? I'm running it with a 5800X3D, two NVMEs, 3 SSDs, 1 HDD and D5.
> 
> Also does anyone else's FE have a weird wobbly fan? The top fan on mine (on the backside) "wobbles" a bit when running at slower speeds whereas the other one looks fine (but that one makes a grinding noise when at max).
> 
> Video (begins about 20 seconds in)
> 
> I was going to put a waterblock on it even though it looks like I'm not going to get a huge amount out of it but just wanted to make sure it was free of issues before I open it up. I'm just looking for any excuse to exchange it really.


A couple of YouTube reviewers noted the wobbly fan on their review cards. Not sure if it's an issue or not but yours definitely isn't the only one.


----------



## Sheyster

J7SC said:


> My 5950X / Asus DarkH combo is still more than capable even w/the 4090, and I skipped the whole AM5 / 13900 K 'upgrade' because life isn't all about ego benching.





dpoverlord said:


> Intels raptor lake seems EOL so had been waiting for Metor Lake


Well, if this rumor below turns out to be true, LGA 1700 may still have a little gas left in the tank!









Intel May Cancel Meteor Lake Desktop CPUs in Favor of Raptor Lake Refresh - ExtremeTech


Intel might end up releasing Meteor Lake on mobile only, opting instead for a Raptor Lake refresh for the desktop in 2023.




www.extremetech.com


----------



## pt0x-

I have my EKWB block installed on my 4090 Strix OC; must say I'm very happy with the temps.

What I did notice is that during benching I never hit 540W+, even in Furmark. I'm running the stock bios still.

Can anyone report if the newest Strix bios will do anything in this regard?

And/or, when flashing the Galax 660W bios, which port will I lose? Asking because my case only allows access to some ports. (Only need one DP and one HDMI.)

I'm using a Seasonic Prime TX-1600 with a 4x8 CableMod cable btw.


----------



## Mad Pistol

Started playing Death Stranding a few days ago. I know it doesn't have any RT or insane effects, but it looks absolutely GORGEOUS on my LG CX + 4090. 4K maxed, HDR, locked at 117 FPS. It has a very surreal look to it that makes the terrain look almost real, yet not. The 4090 makes this game absolutely sublime.


----------



## yzonker

pt0x- said:


> I have my EKWB block installed on my 4090 Strix OC, must say I'm very happy with the temps.
> 
> What I did notice is that during benching I never hit 540w+ even in furmark. Im running the stock bios still.
> 
> Can anyone report if the newest strix bios will do anything in this regard?
> 
> And/or, when flashing the galax 660w bios, what port will I loose? Asking because my case only allows access to some ports. (Only need one dp and one hdmi).
> 
> Im using a seasonic prime tx-1600 with 4x8 cablemod cable btw.


The bottom HDMI port (when mounted horizontally) will not work properly with the Galax BIOS. I haven't tested the DPs though.

At least that's what I've found with my TUF. 

Just flash it and see what you get.
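If you haven't cross-flashed before, the usual sequence with NVIDIA's nvflash utility looks roughly like this (file names are placeholders, and double-check the switches against your nvflash version's help output before relying on them):

```shell
# Back up the current vbios first so you can always return to stock
nvflash --save stock_backup.rom

# Disable the EEPROM write protection, then flash the cross-vendor rom.
# -6 overrides the PCI subsystem ID mismatch check that otherwise blocks
# flashing one board partner's bios onto a different card.
nvflash --protectoff
nvflash -6 galax_660w.rom

# Reboot, then confirm the new power limit
nvidia-smi -q -d POWER
```

If anything goes sideways, flashing `stock_backup.rom` back (with the display driver disabled) restores the card.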


----------



## yzonker

Mad Pistol said:


> Started playing Death Stranding a few days ago. I know it doesn't have any RT or insane effects, but it looks absolutely GORGEOUS on my LG CX + 4090. 4K maxed, HDR, locked at 117 FPS. It has a very surreal look to it that makes the terrain look almost real, yet not. The 4090 makes this game absolutely sublime.


Yep, I picked that up for free on Epic and Guardians of the Galaxy while it was on sale on Steam. Both are really good. GotG makes the 4090 suck down quite a bit of power too. I was seeing 500w+ at 4k max settings at 1100mv.


----------



## bmagnien

AMD 7XXX3D will probably be my next upgrade, assuming it provides a similarly large gaming bump as the 5800X3D did; bonus points if it supports proper OC and doesn't have thermal issues.

The RAM limitations are a bit concerning, especially considering DDR5 is getting faster and cheaper. Seeing 8000 MT/s, ~CL30 becoming more mainstream on the Intel side gives me pause on whether to stick with AMD this time around.

or I could just give up on RAM OC like this guy:


----------



## GRABibus

Gaming 4K @ GPU > 3100MHz possible, even at 60 degrees GPU temp 😊


----------



## yzonker

bmagnien said:


> AMD 7XXX3D will probably be my next upgrade, assuming it provides a similarly large gaming bump as the 5800X3D did; bonus points if it supports proper OC and doesn't have thermal issues.
> 
> The RAM limitations are a bit concerning, especially considering DDR5 is getting faster and cheaper. Seeing 8000 MT/s, ~CL30 becoming more mainstream on the Intel side gives me pause on whether to stick with AMD this time around.
> 
> or I could just give up on RAM OC like this guy:
> 
> View attachment 2591098


Well if the 7xxx3D is anything like the 5800x3D, that lack of DDR5 speed won't matter very much. Back when I first got the 5800x3D, I ran the SotTR benchmark (1080p low) at just XMP 3600 and also with my tuned 3800 timings. There was only about 2 fps difference. Ram performance was almost irrelevant (within reason of course).


----------



## J7SC

Sheyster said:


> Well, if this rumor below turns out to be true, LGA 1700 may still have a little gas left in the tank!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Intel May Cancel Meteor Lake Desktop CPUs in Favor of Raptor Lake Refresh - ExtremeTech
> 
> 
> Intel might end up releasing Meteor Lake on mobile only, opting instead for a Raptor Lake refresh for the desktop in 2023.
> 
> 
> 
> 
> www.extremetech.com


A nice white Z790 Apex, 13900KS and white Galax HoF would look absolutely great in my collection, but the P+E cores could be problematic for some older code I sometimes run (I also have an acquaintance who has trouble with folding on his 13900K, as a separate issue). My last two builds based on X570 16C/32T 'desktop' were the exception for me as I usually build HEDT for work+play dual use...but the HEDT genre has been sort of lost over the last few years. That said, new (Zen4 core) Threadrippers are coming, and the slide below shows a bit more about Intel reentering HEDT as well. 

Still, that rumoured 200MB cache on the 7950X3D has my attention - similar to when our dog smells a big pack of fresh bacon 😋


----------



## Agent-A01

pt0x- said:


> I have my EKWB block installed on my 4090 Strix OC; I must say I'm very happy with the temps.
> 
> What I did notice is that during benching I never hit 540W+, even in Furmark. I'm running the stock BIOS still.
> 
> Can anyone report whether the newest Strix BIOS will do anything in this regard?
> 
> And/or, when flashing the Galax 660W BIOS, which port will I lose? Asking because my case only allows access to some ports. (Only need one DP and one HDMI.)
> 
> I'm using a Seasonic Prime TX-1600 with a 4x8 CableMod cable btw.


Temps? What's your delta T? Hotspot? VRAM?


----------



## J7SC

dpoverlord said:


> thanks @J7SC it sounds like I had a similar X99 build as you. My EVGA 3090 FTW3 does pretty well at 4K gaming at 120 FPS. I did a double take with the 4090 because I saw how much better it did at 4K, so I thought it would do the same for my X99 system. Never even considered that the X99 would not have the bandwidth. Intel's Raptor Lake seems EOL, so I had been waiting for Meteor Lake.
> 
> I realize now that the new AMD platform with a 7950X3D would really be the way to go. Think I should return the 4090 then and just wait and save the money for that, or stay with the 4090 and save a bit more? Curious now which would give me the largest improvement for 4K gaming.


...it depends on the use-case of your machine, and of course budget. If it is primarily gaming, I would keep the 4090 and seriously think about an AM4 value board and a 5800X3D if you feel you have to upgrade _now_. The X99 platform will still work for your 4090, but you are leaving a chunk of performance on the table, even in regard to things like Resizable BAR.

I realize it is a bit counter-intuitive to recommend an AM4 5800X3D re. earlier comments about not investing in a platform that has no real upgrade path - but it is probably the cheapest option for now to max out your 4090 gaming experience, and for years to come. Besides, you get the PCIe 4.0 goodies while having what is still considered a top-of-the-line gaming CPU, and you save some money for a bigger system upgrade. In the meantime, you can watch the upcoming HEDT wars and figure out which team you want to back...


----------



## kryptonfly

yzonker said:


> The "vs" mode in Speedway is a great way to check the legitimacy of runs on the HOF. You can easily see artifacted runs by comparing the fps trace between your machine running without artifacts and other runs in the 3DMark database. Too bad the other 3DMark benchmarks don't have this.
> 
> I just run on my normal OS with everything running, etc... so my trace is well below the ones I tested against, but you can easily see in the 2nd one where it artifacted. I guess I won't name names other than the first comparison that is obviously legit was @kryptonfly 's 11394 score. Not singling Kryptonfly out, just grabbed that run as I was fairly confident it was legit.
> 
> My trace is always the lower orange line.
> 
> Legit,
> View attachment 2591063
> 
> 
> Artifacted,
> 
> View attachment 2591064


I can confirm all my runs scale perfectly with VRAM in SW; my max is 11397 at 1537 MHz VRAM. That 11394 one is exactly consistent at the same clocks. I start to lose perf beyond +1790. I never experienced any kind of artifact in any bench. Looking closer at the 3DMark data, it does seem there were lots of buggy runs before December; I got mine on 6 December and my BIOS was built on 20 October, so even though it's a rev 1.0 it was not among the first on sale. 


dpoverlord said:


> Ah, that's my slippery slope. Load up Skyrim, max out mods... GPU is not strong enough.
> Next upgrade cycle, step and repeat... Never play Skyrim.
> Got a 3090 and now aiming for the 4090, but I have been "HOLDING" on the CPU/chipset upgrade for my aging X99. Think the 7950X3D is the key to go with my 4090? It sounds like the Intel chipset is falling very behind.
> 
> Am I crazy for using a 5930K / X99 Rampage with the incoming 4090, or would I be better off staying with the 3090 until I upgrade my platform?


I can tell for sure I was CPU-limited with a golden 6950X at 4.6 GHz, ring at 3.8 GHz and RAM at 3400C13, at 4K in the Endwalker bench with my previous 3090 (so imagine with a 4090!), especially in the second scene in the town and the last one. I gained around +40 FPS in that scene, and in badly single-threaded games (DX11 mainly), with the Z690. Just to compare: 6950X + 3090 at 1080p = 13900K + 4090 at 4K, because I score around 32K on both systems. You will see the difference in RTX games. After that it's up to you, AMD or Intel; wait for AMD's announcements.


----------



## Helmbo

Shoggoth said:


> I mounted the Alphacool block here a couple of days ago and I'm pretty happy with it. The highest I've seen is a hotspot temperature of 62C after a couple of hours of Superposition, which'll do me nicely.


I'm looking forward to its arrival so much. Installed the card yesterday, and I'm impressed and sad at the same time... I think my 11700K OC'd to 5.2 GHz on all cores and my 4266 32 GB RAM are holding it back; I'm playing at 1440p 165 Hz  - now I'm in this mindset: should I just upgrade to a 13900K + 8000 MHz RAM, or wait for the 7xxxX3D... hmmm


----------



## J7SC

Helmbo said:


> I'm looking forward to its arrival so much. Installed the card yesterday, and I'm impressed and sad at the same time... I think my 11700K OC'd to 5.2 GHz on all cores and my 4266 32 GB RAM are holding it back; I'm playing at 1440p 165 Hz  - now I'm in this mindset: should I just upgrade to a 13900K + 8000 MHz RAM, or wait for the 7xxxX3D... hmmm


...there goes another one down the slippery slope


----------



## Krzych04650

J7SC said:


> I am feeling a bit better though re. waiting for certain other upgrades. My 5950X / Asus DarkH combo is still more than capable even w/the 4090, and I skipped the whole AM5 / 13900 K 'upgrade' because life isn't all about ego benching.


I'd agree that just buying stuff every time something new comes out is often not justified, especially with the relatively small generational leaps that CPUs typically have, but that idea is very misplaced in this particular situation. The 5950X is not a very good gaming CPU; it will drop into the 60s and 70s in heavily CPU-bound games, not to mention recent RT titles where you are going to be in the 40s and even Frame Gen won't help you. A tuned 13900K can be considered a 120 FPS CPU in like 99% of cases; the 5950X is nowhere near that. 



dpoverlord said:


> Am I crazy for using a 5930k / X99 Rampage with the incoming 4090


You are. I moved from a 6900K to a 13900K and the average gain is 2.2x, with some cases going up to 3x. And this is not a difference between 200 and 400 FPS; it is a difference between struggling in the low 60s with drops to the 50s and 40s in heavily CPU-bound games versus running almost everything at 120 FPS+. Not to even mention things like Witcher 3 next-gen and other recent RT titles; you will be in the low 30s there. 

A tuned X99 is around the level of Ryzen 3000, maybe a little better and more consistent, but this isn't enough for a stable 60 FPS anymore, let alone 120 FPS. I liked my HEDT+SLI too and have a lot of sentiment for it, but it's high time to move on. I don't even know the words to describe how bad an idea it is to put a 4090 on that platform.



Helmbo said:


> I'm looking forward to its arrival so much. Installed the card yesterday, and I'm impressed and sad at the same time... I think my 11700K OC'd to 5.2 GHz on all cores and my 4266 32 GB RAM are holding it back; I'm playing at 1440p 165 Hz  - now I'm in this mindset: should I just upgrade to a 13900K + 8000 MHz RAM, or wait for the 7xxxX3D... hmmm


It is definitely holding it back, but you don't have to necessarily go all in with 13900K and especially not with 8000 DDR5. If you have a good 4000+ 32 GB DDR4 kit already then you can just buy a budget Z690 DDR4 board with 13700K and if you tune it well it will be very close to maxed out 13900K for very little money. RPL does very well with DDR4 and strictly for gaming there is no need to pay for high end boards and DDR5 kits.

You can also resell your current stuff and get some money back, it should be enough to at least cover the cost of new mobo, so you would really be paying for CPU only.


----------



## Helmbo

Krzych04650 said:


> It is definitely holding it back, but you don't have to necessarily go all in with 13900K and especially not with 8000 DDR5. If you have a good 4000+ 32 GB DDR4 kit already then you can just buy a budget Z690 DDR4 board with 13700K and if you tune it well it will be very close to maxed out 13900K for very little money.



All I'm interested in is keeping the 0.1% and 1% lows as high as possible, and the 11700K, even OC'd, seems to get these stupid stutters that you can feel. I don't care if the card/CPU can keep 165 FPS if I get hit with 40-80 FPS stutters from time to time, because you can feel that. And it's super annoying, to me at least. I just want the damn stutters to be a thing of the past already.
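For what it's worth, the 1%/0.1% lows the overlays report are just the average of the slowest slice of frametimes, so you can compute them yourself from a CapFrameX/PresentMon frametime export. A minimal sketch (the function name is my own):

```python
def percentile_low_fps(frametimes_ms, pct=1.0):
    """Average FPS over the slowest pct% of frames (the '1% low')."""
    worst_first = sorted(frametimes_ms, reverse=True)  # longest frametimes first
    n = max(1, round(len(worst_first) * pct / 100))    # size of the slow slice
    avg_ms = sum(worst_first[:n]) / n
    return 1000.0 / avg_ms

# 99 smooth 10 ms frames plus one 50 ms hitch:
print(percentile_low_fps([10.0] * 99 + [50.0]))  # 20.0
```

One 50 ms hitch in an otherwise flawless run barely moves the average, but it drags the 1% low down to 20 FPS - which is exactly why stutter feels so much worse than the FPS counter suggests.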


----------



## Krzych04650

Helmbo said:


> All I'm interested in is keeping the 0.1% and 1% lows as high as possible, and the 11700K, even OC'd, seems to get these stupid stutters that you can feel. I don't care if the card/CPU can keep 165 FPS if I get hit with 40-80 FPS stutters from time to time, because you can feel that. And it's super annoying, to me at least. I just want the damn stutters to be a thing of the past already.


Well, that is not going to happen, unfortunately. Lots if not most games have inherent stuttering that you cannot brute-force with hardware, and it is only getting worse over time, as many modern games suffer from shader compilation stutter and inherent stuttering problems in general. DX12 made things a lot worse as well by putting all the responsibility onto developers, who are often either not gamers themselves or under tremendous time pressure and unrealistic deadlines, so even if they could and wanted to do something well, they are not allowed to. There are some exceptions, but very few and decreasing.


----------



## Helmbo

Krzych04650 said:


> Well, that is not going to happen, unfortunately. Lots if not most games have inherent stuttering that you cannot brute-force with hardware, and it is only getting worse over time, as many modern games suffer from shader compilation stutter and inherent stuttering problems in general.


I hate that you are right. But even though it's true, a better CPU might increase the lows. Maybe the AMD 7xxxX3D can do some magic there. Else I might pull the trigger on 13th-gen Intel within the next week, now that I have a buyer for my 11th-gen system.


----------



## Krzych04650

Helmbo said:


> I hate that you are right. But even though it's true, a better CPU might increase the lows. Maybe the AMD 7xxxX3D can do some magic there. Else I might pull the trigger on 13th-gen Intel within the next week, now that I have a buyer for my 11th-gen system.


Yeah, some of those lows do scale with CPU, even if the proportions vs. average stay the same. For example, my Witcher 2 test location has an auto-save stutter in it, and the 0.1% low improved from like 15 FPS to 25 by going from the 6900K to the 13900K, but it will take like 20 years until that number is above 100.


----------



## J7SC

Krzych04650 said:


> I'd agree that just buying stuff every time something new comes out is often not justified, especially with relatively small generational leaps that CPUs typically have, but this idea is a very misplaced in this particular situation. 5950X is not a very good gaming CPU, it will drop into 60s and 70s in heavily CPU bound games, not to mention recent RT titles where you are going to be in the 40s and even Frame Gen won't help you. Tuned 13900K can be considered 120 FPS CPU in like 99% of cases, 5950X is nowhere near that.
> (...)


...whoa! Over the last 12 hours in this very thread, I posted the update re. the AMD 7xxxX3D and added the metaphor of our 'dog looking at bacon', added the table about the upcoming Intel HEDT Sapphire Rapids, recommended the 5800X3D (not the 5950X) as an interim 'best gaming' option, and suggested the 13900KS as something I'm interested in, subject to P + E core issues in some apps. With all that, it's not like I'm married to the 5950X...

That said, I don't agree with some of the numbers you posted re. the 5950X in gaming. _Once you set them up right, especially re. IF and RAM_, they really work quite well with the 4090 w/ DLSS 3, Frame Generation, NV Reflex etc. While I only play a few games such as FS 2020 and CP 2077 (for the latter, see the built-in bench result below), the 4090 and the aforementioned technologies bought me enough time to wait for the CES '23 announcements about upcoming hardware - I don't have to unload a 7950X or 13900K after only a few months of use... that was my main point.


----------



## Jack Chim

Faltzer said:


> I would like to share some research that I have done on the gigabyte 4090 waterforce.
> 
> I have tried all the vbioses, the best one was from gigabyte itself, but the gigabyte OC version of the other card they sell.
> 
> It yielded the best results, about 550wat usage at max (tried unigine heaven and the 3dmark tests).
> 
> It also gave the best overclockability, with the other vbioses I got artifacts and crashes at certain overclocks, with the gigabyte OC I managed to stabely use 190 core 1390 mem.
> 
> I could go higher, but then it glitches (artifacts @Heaven benchmark).
> 
> 
> View attachment 2587793


Happy to see this.

I'm using the Gaming OC, just like you, and I've also tried a lot of BIOSes. The best one is Gigabyte's own, but in my opinion it's the Master's BIOS, not the Gaming OC's. The Master's BIOS gives me better performance than the Gaming OC's at the same frequency: a higher score, albeit only a little difference, and the Master's timings are tighter.


----------



## J7SC

...a little Superposition action before bedtime...ambient at 25 C, VRAM warm-up run...not sure what they mean by 'yes' to 'GPU Performance Limiters'


----------



## Krzych04650

J7SC said:


> ...whoa! Over the last 12 hours in this very thread, I posted the update re. the AMD 7xxxX3D and added the metaphor of our 'dog looking at bacon', added the table about the upcoming Intel HEDT Sapphire Rapids, recommended the 5800X3D (not the 5950X) as an interim 'best gaming' option, and suggested the 13900KS as something I'm interested in, subject to P + E core issues in some apps. With all that, it's not like I'm married to the 5950X...
> 
> That said, I don't agree with some of the numbers you posted re. the 5950X in gaming. _Once you set them up right, especially re. IF and RAM_, they really work quite well with the 4090 w/ DLSS 3, Frame Generation, NV Reflex etc. While I only play a few games such as FS 2020 and CP 2077 (for the latter, see the built-in bench result below), the 4090 and the aforementioned technologies bought me enough time to wait for the CES '23 announcements about upcoming hardware - I don't have to unload a 7950X or 13900K after only a few months of use... that was my main point.
> View attachment 2591132


Heh, alright, I was only pointing some important things out; it wasn't meant to be personal. I am waiting for CES too before buying a new monitor, and my current one is certainly not adequate for the 4090 either, so it is normal to have something weaker temporarily; it's a bit of a transition period for everyone right now. The problem starts when someone tries to convince himself or others that this inadequate stuff, whatever it may be, is adequate. Maybe you didn't try to do that intentionally, but your conversation with the other guy sounded a bit like not only is the 5950X enough for the 4090, but maybe X99 wouldn't be that bad either, which is just not the case, so I pointed that out.

This topic is always very controversial because most people are convinced that everyone is GPU-bound anyway, so the CPU doesn't matter, but this typically comes from a lack of experience with a wide variety of games and from not using hardware monitoring in games.

In the end you can show me many examples where you are not CPU bound and I can show you many examples where you are, it is up to everyone to decide individually what they want to do, but hearing both sides of the story before deciding is important.


----------



## jootn2kx

In my personal experience with the 4090, I feel even my 5800X3D comes up short in a lot of recent games, even with frame generation on.
In Witcher 3 + RT with frame gen on, I see the GPU usage still dipping to around 60%-70%, which means those parts are heavily CPU-limited.

Callisto Protocol is even worse. I mean, we're going to need serious CPU horsepower to drive current/near-future games.
Yeah, PC games are horribly optimized, but I feel this isn't going to change anytime soon.
We need a CPU with 60-70% more horsepower to drive this card; I don't think even the 7800X3D will be enough.


----------



## Helmbo

jootn2kx said:


> In my personal experience with the 4090, I feel even my 5800X3D comes up short in a lot of recent games, even with frame generation on.
> In Witcher 3 + RT with frame gen on, I see the GPU usage still dipping to around 60%-70%, which means those parts are heavily CPU-limited.
> 
> Callisto Protocol is even worse. I mean, we're going to need serious CPU horsepower to drive current/near-future games.
> Yeah, PC games are horribly optimized, but I feel this isn't going to change anytime soon.
> We need a CPU with 60-70% more horsepower to drive this card; I don't think even the 7800X3D will be enough.



I hope the 7800X3D is going to be best in class for our 4090s. Else it's going to be the 13900K/KS.


----------



## yzonker

This is an interesting max RAM OC test for anyone debating DDR4 vs DDR5. Of course YMMV on whether you can get to those speeds (and timings) by just buying single samples. 









DDR4 4400 C16 1T Gear1 vs DDR5 8000 C36 performance with...


Hey everyone. I compared the performance of manually OCed and tuned DDR4 vs DDR5 to see if the cost difference of $1180 CAD and reduction in memory capacity by 32GB was justified. I got kind of lucky with my CPU and could OC my Crucial 2x32GB 3200 c16 Micron Rev.B kit to 4400 c16 1T gear1...




www.overclock.net


----------



## alasdairvfr

Helmbo said:


> I hate that you are right. But even though its true, a better cpu might increase the lows. Maybe the Amd 7xxxX3D can do some magick there. Else i might pull the trigger on 13th gen intel within next week. Now that i have a buyer for my 11th gen system.


Windows Defender Exploit Protection Control Flow Guard(CFG) is often the cause of inexplicable stutter in games that have a high average framerate and run really well otherwise. Look it up if you are not familiar. I would never recommend disabling it at the system level but it can be disabled at the application level (i.e. choose which game to disable it for).

Borderlands 3 and Horizon Zero Dawn went to practically zero stutter for me doing this. I didn't test on every stuttery game either.

There is a degree of risk in disabling this security feature as it prevents Defender from blocking exploits that might be present in the application running... again, I suggest you read up on it and make an informed decision.
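For reference, the per-application override lives in the ProcessMitigations PowerShell module (Windows 10 1709+). Something like the below, run from an elevated PowerShell prompt - the executable name here is just an example:

```shell
# Disable Control Flow Guard for a single game's executable only
Set-ProcessMitigation -Name "Borderlands3.exe" -Disable CFG

# Check what is currently set for that executable
Get-ProcessMitigation -Name "Borderlands3.exe"

# Remove the per-app override later to fall back to system defaults
Set-ProcessMitigation -Name "Borderlands3.exe" -Remove
```

The same settings are reachable via Windows Security > App & browser control > Exploit protection if you prefer the GUI.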


----------



## Mad1137

Hey guys. I use some benchmarks for testing. If I OC my card and test with Metro Exodus, everything is perfect, but when I check with Port Royal I get a crash... What can I do? Only PR is unstable; the other benches are perfect. Should I use lower clocks, or...?


----------



## yzonker

Mad1137 said:


> Hey guys. I use some benchmarks for testing. If I OC my card and test with Metro Exodus, everything is perfect, but when I check with Port Royal I get a crash... What can I do? Only PR is unstable; the other benches are perfect. Should I use lower clocks, or...?


That's just normal, I think. I've seen the same thing with my 4090 and have seen that comment from other people. My max core OC in PR is lower than in pretty much any other benchmark.


----------



## Sheyster

Helmbo said:


> I'm looking forward to its arrival so much. Installed the card yesterday, and I'm impressed and sad at the same time... I think my 11700K OC'd to 5.2 GHz on all cores and my 4266 32 GB RAM are holding it back; I'm playing at 1440p 165 Hz  - now I'm in this mindset: should I just upgrade to a 13900K + 8000 MHz RAM, or wait for the 7xxxX3D... hmmm


8000+ MHz memory isn't a given, unless you have an ASUS Apex Z790 (hard to get), good RAM and a good CPU (IMC). Don't count on getting to 8000 with a Maximus Extreme or Hero. Many have tried and failed, a few have gotten lucky.


----------



## Sheyster

ckjim said:


> The Master's BIOS can give me better performance than the Gaming OC's at the same frequency: a higher score, albeit only a little difference, and the Master's timings will be tighter


I'm using the Galax HOF BIOS as my daily gaming BIOS, at default settings.

Do you happen to have the latest GB Master BIOS? I believe it's the "F2" version. If so can you please post it? You might have to rename the file extension to .TXT first.


----------



## KingEngineRevUp

Helmbo said:


> I'm looking forward to its arrival so much. Installed the card yesterday, and I'm impressed and sad at the same time... I think my 11700K OC'd to 5.2 GHz on all cores and my 4266 32 GB RAM are holding it back; I'm playing at 1440p 165 Hz  - now I'm in this mindset: should I just upgrade to a 13900K + 8000 MHz RAM, or wait for the 7xxxX3D... hmmm


Well, at least you're better off than one user I was talking to who was using the GPU with a 4790K.


----------



## Sheyster

yzonker said:


> This is an interesting max RAM OC test for anyone debating DDR4 vs DDR5. Of course YMMV on whether you can get to those speeds (and timings) by just buying single samples.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> DDR4 4400 C16 1T Gear1 vs DDR5 8000 C36 performance with...
> 
> 
> Hey everyone. I compared the performance of manually OCed and tuned DDR4 vs DDR5 to see if the cost difference of $1180 CAD and reduction in memory capacity by 32GB was justified. I got kind of lucky with my CPU and could OC my Crucial 2x32GB 3200 c16 Micron Rev.B kit to 4400 c16 1T gear1...
> 
> 
> 
> 
> www.overclock.net


Unfortunately he used 64GB in the DDR4 system (as opposed to 32 in the DDR5 system), so I don't think the minimum and hence the average FPS numbers are a valid comparison.

This said, DDR4 4200+ is no slouch. If you have a kit that can achieve that speed, save some $$$ and get a DDR4 Z690/790 mobo.
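The trade-off is easy to sanity-check with back-of-the-envelope math: first-word latency is CAS cycles divided by the memory clock (half the transfer rate, since DDR transfers twice per clock), while peak bandwidth scales directly with transfer rate. A quick sketch:

```python
def true_latency_ns(transfer_mt_s, cas_cycles):
    """First-word latency in ns: CL cycles / memory clock (MT/s divided by 2)."""
    return cas_cycles / (transfer_mt_s / 2) * 1000

def peak_bandwidth_gb_s(transfer_mt_s, channels=2, bus_bytes=8):
    """Theoretical peak for 64-bit (8-byte) channels, in GB/s."""
    return transfer_mt_s * bus_bytes * channels / 1000

# DDR4-4400 C16 vs DDR5-8000 C36 (the kits from the linked thread):
print(true_latency_ns(4400, 16))    # ~7.3 ns (DDR4 actually wins on latency)
print(true_latency_ns(8000, 36))    # 9.0 ns
print(peak_bandwidth_gb_s(4400))    # 70.4 GB/s
print(peak_bandwidth_gb_s(8000))    # 128.0 GB/s (DDR5 wins big on bandwidth)
```

Which side matters more depends on the game, which is part of why the DDR4 vs DDR5 debate never settles cleanly.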


----------



## dboom

Be advised:

It was fine for 2 weeks. Now it leaks a lot.
EK full WB.
On the 3090 EK WB, most of the bolts had worked loose into the case after 2 years; on the 4090 block I used Loctite on some of them. Seals are intact.


----------



## yzonker

Sheyster said:


> Unfortunately he used 64GB in the DDR4 system (as opposed to 32 in the DDR5 system), so I don't think the minimum and hence the average FPS numbers are a valid comparison.
> 
> This said, DDR4 4200+ is no slouch. If you have a kit that can achieve that speed, save some $$$ and get a DDR4 Z690/790 mobo.


That argument was being made in the thread (32 vs 64 GB), but I guess I'm not convinced that it's a performance advantage. Is there any other testing that shows 64 GB > 32 GB for games?


----------



## Brads3cents

Xdrqgol said:


> It will not be worth whatever they do for the “Ti” version, because it will consume more W, and how many FPS do you think it will have compared to the 4090..? Definitely not enough to be worth the money.
> This product will be aimed only at enthusiasts who want to have the latest no matter what.
> 
> I have the G8 and my 4090 runs well enough in 4K for now. The only upgrade worth it is the next-generation 5000 series!


I don't see a Ti version regardless. Since AMD's top-of-the-line GPU can only match the 4080, there is no urgency whatsoever on NVIDIA's end. The 4090 is already in outer space and outpaces the old 3090 by 80%, so it's not like they need slightly more power to convince 3xxx-series owners to upgrade either. The 4090 is basically two generations' worth of uplift versus what is typically expected gen over gen.

And then there is the other reason (and a fairly huge one):
right now a 4090 Ti is completely pointless, as Lovelace scales with memory, and simply throwing more CUDA cores into a GPU (among other things) won't amount to much.

Until faster memory is available, there is no point to a 4090 Ti.

NVIDIA is much more likely to introduce a 4080 Ti, but that would mean a price drop on the 4080, then a price drop on the rest of the lineup, just to slot the 80 Ti into the family. For NVIDIA to even consider introducing a 4080 Ti into the market, they first must finish off 3xxx-series inventory.

If/when we see a 4090 Ti, we are looking at a Q1 2024 launch, and then you would have a situation similar to last gen's 3090 Ti: you get a couple of months out of it before a superior 5090 is released on a smaller node that destroys it in performance and efficiency.
NVIDIA is better off not introducing it at all, as skipping it will exaggerate the gen-over-gen increase in performance of the 5000 series.

When the 4090 was released, all the reviewers compared it to the 3090 Ti instead of the 3090, which makes the 4090 look less impressive than it is.

----------



## alasdairvfr

yzonker said:


> That argument was being made in the thread (32 vs 64 Gb), but I guess I'm not convinced that is a performance advantage. Is there any other testing done that shows 64Gb > 32 Gb for games?


I suppose you could track/monitor swap usage; if the smaller-memory system is swapping more under normal conditions, there would be a higher chance of a performance impact. Not everything swapped gets called back, but back in the pre-SSD days you could really feel it like a ton of bricks when you ran out.
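On Linux you can watch exactly that via /proc/meminfo (on Windows, Task Manager's committed/paged figures serve the same purpose). A small stdlib-only sketch; the field names are the kernel's own:

```python
def swap_used_kib(meminfo_text):
    """Used swap in KiB, parsed from /proc/meminfo-style text."""
    fields = {}
    for line in meminfo_text.splitlines():
        key, _, rest = line.partition(":")
        if key in ("SwapTotal", "SwapFree"):
            fields[key] = int(rest.split()[0])  # values are reported in kB
    return fields["SwapTotal"] - fields["SwapFree"]

# Sample once before and once after a gaming session and compare:
# with open("/proc/meminfo") as f:
#     print(swap_used_kib(f.read()))
```

If that number climbs noticeably during a session on the 32 GB box but not the 64 GB one, the extra capacity is doing real work.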


----------



## J7SC

Krzych04650 said:


> Heh alright, I am only pointing some important things out, it wasn't meant to be personal. I am waiting for CES too before buying a new monitor, and my current one is certainly not adequate for 4090 either, so it is normal to have something weaker temporarily, it a bit of a transition period for everyone right now. The problem starts when someone tries to convince himself or others that this inadequate stuff, whatever it may be, is adequate. Maybe you didn't try to do that intentionally but your conversation with the other guy sounded a bit like not only 5950X is enough for 4090 but maybe X99 wouldn't be that bad either, which is just not the case, so I just pointed that out.
> (...)


...didn't take it personally, but a lot of comparative reviews I've seen leave everything at stock... IMO, the 5950X has so many different OCing and tweaking options - more than any other CPU I have - that its true potential for gaming and benching took a while to unlock. That, and the fact that they really respond very well to fast and tight RAM, i.e. 3800 or higher/tighter.



Sheyster said:


> 8000+ MHz memory isn't a given, unless you have an ASUS Apex Z790 (hard to get), good RAM and a good CPU (IMC). Don't count on getting to 8000 with a Maximus Extreme or Hero. Many have tried and failed, a few have gotten lucky.


...not that even 7600 is slow, but 8000 CL24 at 1.4v max is what I'm waiting for  ...I doubt the 13900KS will get there, speaking of which...











Sheyster said:


> I'm using the Galax HOF BIOS as my daily gaming BIOS, at default settings.
> 
> Do you happen to have the latest GB Master BIOS? I believe it's the "F2" version. If so can you please post it? You might have to rename the file extension to .TXT first.


...yeah, I'd like to try that Master vbios on my Giga G-OC too, though the Galax will be hard to beat.

The one test of the Master I've seen (der8auer) would suggest a slightly higher base clock on the Master, but I rarely run things at their base clock. Water blocks are identical for both, as the PCB is mostly identical (apart from an extra connector for the LCD screen).


----------



## Brads3cents

yzonker said:


> Well if the 7xxx3D is anything like the 5800x3D, that lack of DDR5 speed won't matter very much. Back when I first got the 5800x3D, I ran the SotTR benchmark (1080p low) at just XMP 3600 and also with my tuned 3800 timings. There was only about 2 fps difference. Ram performance was almost irrelevant (within reason of course).


That's not the point.
The point is that the 13900K is significantly faster than advertised, because when you tune RAM correctly the gap between the 7950X and the 13900K is very wide.
A potential X3D model may not be able to close that gap.


----------



## zzztopzzz

After numerous trips to my local MicroCenter, I was finally able to score an MSI Suprim Liquid X this morning. They had 7 on the shelf and I got the last one. Yesterday they had 10 of various makes; I arrived at the store about 10 minutes after the doors opened, no one in the store, and they were sold out. Anyway, expensive yes, but at least at MSRP, and the water cooling is a big plus for me. Shouldn't be much of a problem shoving it all into my Lian case. The card only takes up 2 slots and is about the same physical size as my 3080ti. I'm excited and can hardly wait to get this thing installed. I'd like to hear from those presently running this card.


----------



## OC2000

leonman44 said:


> I noticed the warranty sticker on my ASU’s card too and made me worry that I might lose my warranty if I just install the waterblock. How did you manage to remove the warranty sticker without ruining it ?


It was easier than I thought: a hair dryer and a hobby scalpel. Carefully lift it off bit by bit in a circular motion, with more hair dryer in between.


----------



## pt0x-

Agent-A01 said:


> Temps? What's your delta t? Hotspot vram?


My Results

EKWB block with passive backplate on Strix 4090 OC
Mellow OC:

3060MHz @ 1.1v (Curve)
+1125 on VRAM
+100 (max) on core voltage
120% PL

PR stress test result (99.5%)
Vcore max temp: 50.3C
Vcore HS max temp: 60.8C
Vram max temp: 46C
Vrm max temp: 41C
PL max: 103.6%
TBP Max: 516W
Water temp after run maxed out at 28C (sensor on water block out port)









I scored 1 in Port Royal Stress Test
Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11
www.3dmark.com





TIM I used is TG KE.
On the block side I used only Arctic TP3 1mm
On the back side I used EK supplied pads.

Quick Pic:


----------



## leonman44

OC2000 said:


> It was easier than I thought. Hair Dryer and with a hobby scalpel and carefully lift it off in a circular motion bit by bit with more hair dryer in between


Thank you, I will definitely try this! Otherwise, if it goes wrong, I saw some for sale on Ali.


----------



## Snoopy69

I'm looking for the 1000W BIOS


----------



## Nizzen

Snoopy69 said:


> I'm looking for the 1000W BIOS


Do you have Asus strix or Tuf with Elmor Evc tool? If not, you don't need it


----------



## yzonker

Brads3cents said:


> thats not the point
> the point is that the 13900k is significantly faster than advertised because when you tune ram correctly the gap between 7950x and 13900k is very wide
> a potential x3d model may not be able to close that gap


Well that was my point. The 7950x3D will probably get its speed from the cache, not from faster DDR5. And where are the tests that show this huge gap between the 7950x and 13900k at max mem OC? And don't quote Frame Chasers; he doesn't even show the timings used.


----------



## Dragonsyph

Ok, so is it worth it to watercool a 4090 TUF OC? Money doesn't matter, I just wanna know if you gain anything. I'm already water cooling, so it would be easy to do.


----------



## J7SC

Nizzen said:


> Do you have Asus strix or Tuf with Elmor Evc tool? If not, you don't need it


Speaking of Elmor (the fellow sitting down)...


Spoiler


----------



## yzonker

J7SC said:


> Speaking of Elmor (the fellow sitting down)...
> 
> 
> Spoiler


Ordered yours yet? I'm toying with the idea. Do you know, is "Loop 2" in the software the VRAM voltage? Kinda looks like it in derBauer's video.


----------



## Brads3cents

yzonker said:


> Well that was my point. The 7950x3D will get it speed from the cache probably, not from faster DDR5. And where are the tests that show this huge gap between the 7950x and 13900k at max mem OC? And don't quote Frame Chasers. He doesn't even show the timings used.


Many people know this by default, by comparing the low memory settings used in 13900k reviews to their own tweaked memory. There is a huge performance difference; you don't need a reviewer to show you this. Since the 13900k is already faster with bunk memory, you will only see the difference widen with something like 8200 cl34, which is my personal setting.

The majority of reviewers shamefully reviewed the 7950x by comparing it to a 13900k with 6000 M/T RAM 🤣


----------



## J7SC

yzonker said:


> Ordered yours yet? I'm toying with the idea. Do you know, is "Loop 2" in the software the VRAM voltage? Kinda looks like it in derBauer's video.


No, not yet - doing year-end stuff for business now but will investigate more re. IC in early January....happy w/ the 4090's core performance, really just need the VRAM juiced up just a bit more though it already hits +1540 w/ECC on at 48 C, also wondering about shunts mod for VRAM and how the Galax vs G-OC vbios would react.


----------



## yzonker

Brads3cents said:


> many people know this by default by comparing low memory settings used with 13900k reviewed to their own tweaked memory and comparing the two. there is a huge performance difference
> you dont need a reviewer to show you this
> since the 13900k is already faster with bunk memory you will only see the difference widen with something like 8200 cl34 which is my personal settings


Is SotTR a special case?

7950x








Benchmark Competition: Shadow of the Tomb Raider
www.overclock.net





13900k (DDR5 8533)








Benchmark Competition: Shadow of the Tomb Raider
www.overclock.net





Still waiting for your evidence.


----------



## yzonker

J7SC said:


> No, not yet - doing year-end stuff for business now but will investigate more re. IC in early January....happy w/ the 4090's core performance, really just need the VRAM juiced up just a bit more though it already hits +1540 w/ECC on at 48 C, also wondering about shunts mod for VRAM and how the Galax vs G-OC vbios would react.


I wonder if Luumi boosted the VRAM voltage for this run. It's a whopping +2300!!!!!!!!!!









I scored 11 615 in Speed Way
Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10
www.3dmark.com


----------



## Brads3cents

Evidence of what?

Are you arguing that the 7950x is faster?
Or that tweaked memory doesn't improve performance on the 13900k?

If so, you are completely alone on this forum; for either claim I'm not sure you will find anyone here to agree with you.

This isn't reddit.

You also produced the worst counter-evidence possible:
there are people showing 405fps with a 6900xt, and you're showing sugi with an overclocked 4090 producing 409fps.
If you want to prove your bias here, you're doing a good job. Everyone is using different versions of Windows, different settings; some people are turning off Explorer and running taso. For all you know this is an engine limitation.
That example proves nothing.


----------



## yzonker

Brads3cents said:


> evidence of what?
> 
> are you arguing that 7950x is faster?
> or that tweaked memory doesn't improve performance on the 13900k?
> 
> if by that, you are completely alone on this forum as for either situation im not sure you will find anyone here to agree with you
> 
> this isnt reddit


I'm waiting for you to substantiate your claim that the 13900k is so much faster than the 7950x that the 7950x3D may not even be able to catch up. I provided some evidence that they (13900k vs 7950x) are close to being equal in SotTR bench when both platforms are maxed out.


----------



## Nizzen

yzonker said:


> Is SotTR a special case?
> 
> 7950x
> 
> 
> 
> 
> 
> 
> 
> 
> Benchmark Competition: Shadow of the Tomb Raider
> 
> 
> happy to pass the much desired 400 :) Please run at lower render res to eliminate GPU bottlenecking as that has been proven to skew the results. Very impressive stuff, I think you would be the fastest on this thread even with "proper" settings!
> 
> 
> 
> 
> www.overclock.net
> 
> 
> 
> 
> 
> 13900k (DDR5 8533)
> 
> 
> 
> 
> 
> 
> 
> 
> Benchmark Competition: Shadow of the Tomb Raider
> 
> 
> happy to pass the much desired 400 :) Please run at lower render res to eliminate GPU bottlenecking as that has been proven to skew the results. Very impressive stuff, I think you would be the fastest on this thread even with "proper" settings!
> 
> 
> 
> 
> www.overclock.net
> 
> 
> 
> 
> 
> Still waiting for your evidence.


Different GPUs.


----------



## KingEngineRevUp

dboom said:


> Be advised:
> 
> It was fine for 2 weeks. It leaks a lot.
> EK full WB.
> On 3090 EK WB most of the bolts were in the case after 2 years, on 4090 block i used loctite on some of them. Seals are intact.


WOW, that sucks! I do an air leak test on my block before I install it into my system. Air is more likely to escape through the tiniest gaps, where water wouldn't right away.


----------



## kryptonfly

GRABibus said:


> Gaming 4K @ GPU > 3100MHz possible, even at 60 degrees GPU temp 😊


You'd better monitor the effective clock too, with HWiNFO64 through Afterburner. Effective is the key 🙃


J7SC said:


> ...a little Superposition action before bedtime...ambient at 25 C, VRAM warm-up run...not sure what they mean by 'yes' to 'GPU Performance Limiters'
> View attachment 2591134


Just expand the performance limiters tab and you will see. It's usually because of low utilization at idle, for example. Do you lock the GPU with the L key in the Afterburner curve?


----------



## dboom

KingEngineRevUp said:


> WOW that sucks! I do an air leak test on my block before I install it into my system. Air is more likely to escape in the tiniest gaps that water wouldn't right away.


It was fine until today when i found "water" on my desk.
Loctite was applied on some screws because of this:








3090 full EK after 2 years; those missing screws were in my case. I noticed this when I swapped cases.


----------



## yzonker

Nizzen said:


> different gpu's


Here's Dom on a 3090 (vs a 4090 for the 13900k). Still only 3% slower than a maxed-out 13900k, and that's ignoring the 408 fps CPU average shown. (I'm assuming you're referring to the 6900xt's better CPU utilization, which is a valid point I was forgetting.)









Benchmark Competition: Shadow of the Tomb Raider
www.overclock.net


----------



## GRABibus

kryptonfly said:


> You'd better to monitor effective clock too, with hwinfo64 through Afterburner. Effective is the key 🙃


I know, but given my bench scores, I am not too worried about it.


----------



## Brads3cents

yzonker said:


> I'm waiting for you to substantiate your claim that the 13900k is so much faster than the 7950x


My claim is that the 13900k is faster than the 7950x by default, per any review, and that unlike AMD, Intel can continue scaling performance with memory M/T. This is very well known and understood. The gap widens from faster to much faster.

If memory didn't matter, everyone would use the cheapest, junkiest memory possible and not bother to even enable XMP.

What is the Optimal RAM Speed for DDR5? - YouTube


Even in 4k, with GPU-limited scenarios, there is a lot of performance gain at 8000 vs 6000:
DDR5 RAM (6000 MHz) vs DDR5 RAM (8000 MHz) || PC GAMES TEST || - YouTube

Keep in mind that's at 4k; if you want your mind blown, see this test at 1080p.

Regardless, the reviews are mostly comparing the 7950x to a 13900k running at a puny 6000 M/T. As you can clearly see, the performance continues to scale, and the reviews grossly underestimate the actual performance potential of the 13900k, so much so that the X3D CPUs will need to make up a large amount of ground just to match the 13900k with good memory.


----------



## yzonker

Brads3cents said:


> my claim is that the 13900k is faster than the 7950x by default via any review
> and that unlike with AMD, the intel can continue scaling performance with memory M/T
> this is very well known and understood
> The performance widens from faster to much faster
> 
> if memory didnt matter everyone would use the cheapest junkiest memory possible and not bother to even enable xmp
> 
> What is the Optimal RAM Speed for DDR5? - YouTube
> 
> 
> even in 4k with gpu limited scenarios there is a lot of performance gain at 8000 vs 6000
> DDR5 RAM (6000 MHz) vs DDR5 RAM (8000 MHz) || PC GAMES TEST || - YouTube
> 
> keep in mind thats in 4k if you want your mind blown see this test in 1080p
> 
> regardless the reviews are mostly comparing the 7950x to a 13900k running at a puny 6000M/T
> as you can clearly see the performance continues to scale and the reviews grossly underestimate the actual performance potential of the 13900k
> 
> so much so that the x3d cpus will need a to make up a large amount of ground just to match the 13900k with good memory


Still no evidence. You make it feel like reddit.


----------



## yzonker

Brads3cents said:


> evidence of what?
> 
> are you arguing that 7950x is faster?
> or that tweaked memory doesn't improve performance on the 13900k?
> 
> if by that, you are completely alone on this forum as for either situation im not sure you will find anyone here to agree with you
> 
> this isnt reddit
> 
> you also produced the worst evidence counter claim possible
> there are people showing 405fps with a 6900xt and your showing sugi with an overclocked 4090 producing 409fps
> if you want to prove your bias here your doing a good job. everyone is using different version of windows, different settings, some people are turning off explorer and running taso. for all you know this is the engine limitation
> that example proves nothings


How is that a bad example? I stacked the deck against the 7950x by INCLUDING sugi's run. And yet the 7950x is still close.


----------



## Roacoe717

I had low expectations for the MSI Gaming X Trio because of the 450w power limit and no vapor chamber. I flashed a 600w bios and opened a window to let cold air in. Got great results, even better than my crappy Gigabyte Gaming OC. Might be worthwhile to water cool this card.

Edit: I didn't have ReBAR enabled; I tried with Nvidia Profile Inspector but it never worked.


----------



## Brads3cents

I just listed multiple videos showing how performance scales with memory.
If you can't deduce things in your head then sorry, sucks to be you.

You're the accuser; why don't YOU provide the evidence that performance _doesn't_ increase with memory?
I don't need to argue common knowledge.
A 13900k at 8000 M/T is faster than a 13900k at 6000 M/T with the same timings.


----------



## yzonker

Brads3cents said:


> I just listed multiple videos how performance scales with memory
> if you can't deduce things in your head then sorry sucks to be you
> 
> you're the accuser why dont YOU provide the evidence that performance _doesn't_ increase with memory
> i dont need to argue common knowledge
> a 13900k at 8000 M/T is faster than a 13900k at 6000 M/T with the same timings


Oh, it absolutely does increase. But I'm saying that with both platforms maxed out they are still close in performance, which is exactly what those SotTR benchmarks indicate. Admittedly only one game.

You can make personal attacks if you like but I'm just making a logical argument with at least one benchmark example.


----------



## Brads3cents

yzonker said:


> I wonder if Luumi boosted the VRAM voltage for this run. It's a whopping +2300!!!!!!!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 615 in Speed Way
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10}
> 
> 
> 
> 
> www.3dmark.com


I'm not trying to make any personal attacks, but the proof is in the pudding.

Look at Luumi: he's running 8000 M/T. He wouldn't be if it was pointless.

There is performance to be had.
The 13900k scales with memory, but there are some games that only scale with latency and not M/T.
Although, if you could keep tight timings and keep increasing M/T without changing timings, then you would improve latency with the higher M/T rate, but it's easier said than done.

There are games that will show nothing, nada,
and games that show excellent gains.
It will be game by game.
I still believe a maxed-out 13900k puts some good distance between itself and the rest, but I have no clue how good the X3D will ultimately be.
It may surprise us and blow our socks off. If it's that good then Intel has a lot of work to do, but let's just wait for CES.

I originally wanted the X3D but got tired of waiting to build a new PC.
They might limit it to the 7800x3D though.

If there really is a 7950 version, that could surely impress.


----------



## yzonker

Brads3cents said:


> im not trying to make any personal attacks
> but the evidence is in the pudding
> 
> look at Luumi hes running 8000 M/T. He wouldnt be if it was pointless
> 
> there is performance to be had
> 13900k scales with memory but there are some games that only scale with latency and not M/T
> although, if you could keep tight timings and keep increasing M/T without changing timings then you will improve latency with the higher M/T rate but its easier said than done
> 
> there are games that will show nothing nada
> and games that show excellent gains
> it will be game by game
> i still believe a maxed out 13900k puts some good distance between itself but i have no clue how well the x3d ultimately will be
> it may Suprise and blow our socks off. If its that good then intel has a lot of work to do but lets just wait for ces
> 
> I originally wanted the x3d but got tried of waiting to build new pc
> They might limit it to 7800x3d though
> 
> if there really is a 7950 version than that could surely impress


Luumi runs an Intel platform for 2 reasons. 

1) It kicks butt in some 3DMark benches like the TS CPU test. 
2) He got that pre-production Z790 KP from EVGA, possibly for free.


----------



## KingEngineRevUp

dboom said:


> It was fine until today when i found "water" on my desk.
> Loctite was applied on some screws because of this:
> 
> 3090 full ek after 2 years, those missing screws were in my case. Seen this when i swaped the cases.


Your fans and pump just produce the perfect vibration mode to excite the screws out. That's a rare occurrence!


----------



## Krzych04650

yzonker said:


> Is SotTR a special case?
> 
> 7950x
> 
> 
> 
> 
> 
> 
> 
> 
> Benchmark Competition: Shadow of the Tomb Raider
> 
> 
> happy to pass the much desired 400 :) Please run at lower render res to eliminate GPU bottlenecking as that has been proven to skew the results. Very impressive stuff, I think you would be the fastest on this thread even with "proper" settings!
> 
> 
> 
> 
> www.overclock.net
> 
> 
> 
> 
> 
> 13900k (DDR5 8533)
> 
> 
> 
> 
> 
> 
> 
> 
> Benchmark Competition: Shadow of the Tomb Raider
> 
> 
> happy to pass the much desired 400 :) Please run at lower render res to eliminate GPU bottlenecking as that has been proven to skew the results. Very impressive stuff, I think you would be the fastest on this thread even with "proper" settings!
> 
> 
> 
> 
> www.overclock.net
> 
> 
> 
> 
> 
> Still waiting for your evidence.


Maybe not exactly special case but very favorable to 7950X since this is one of a few games where 7950X can match 13900K both stock vs stock and tuned vs tuned, best case scenario basically.

The advantage of the 13900K is not necessarily that it is so massively faster on average, because it is about 10% faster, but the consistency. There aren't many, or really any, games where RPL underperforms and gets beaten by big margins, even in mainstream reviews with gimped RAM, while Zen4 is all over the place, anywhere from matching the 13900K to being barely any faster than the 5950X. AMD even showed such examples themselves in their presentation, so this is officially acknowledged and expected behavior, and since X3D boosts some games but not others, it will only get even more inconsistent. Some games will surely fly away and averages will equalize or even tip in the 7000X3D's favor by a single-digit percentage, but many others will stay where they are now without any improvement, and this will be vs a stock 13900K with some random XMP.

The 7000X3D won't bring any meaningful gains vs a tuned 13900K, and nobody should be hoping or expecting that; to resolve current CPU bottlenecks we need big margins, not single digits. Hopefully both manufacturers can keep the pace and we will get something nice by the end of 2024; that is a realistic expectation for when some significant gains may come. Anything that happens now is just splitting hairs.


----------



## KingEngineRevUp

Roacoe717 said:


> I had low expectations for The MSI Gaming X Trio because of the 450w power limit and no vapor chamber, I flashed a 600w bios and opened a window to let cold air in. Got great results, even better then my crappy Gigabyte Gaming OC. Might be worthwhile to water cool this card.
> 
> Edit: I didn't have rebar enabled, I tried with Nvidia profile inspector but never worked.


Yeah, well... if you water cool, say goodbye to your +1500 MHz on your memory when your system is idling. At least for me. If the memory gets chilled to 30C or lower, it loses OC headroom. It can't handle OCs that well; it seems to like to be a little on the warmer side.


----------



## Xdrqgol

Brads3cents said:


> im not trying to make any personal attacks
> but the evidence is in the pudding
> 
> look at Luumi hes running 8000 M/T. He wouldnt be if it was pointless
> 
> there is performance to be had
> 13900k scales with memory but there are some games that only scale with latency and not M/T
> although, if you could keep tight timings and keep increasing M/T without changing timings then you will improve latency with the higher M/T rate but its easier said than done
> 
> there are games that will show nothing nada
> and games that show excellent gains
> it will be game by game
> i still believe a maxed out 13900k puts some good distance between itself but i have no clue how well the x3d ultimately will be
> it may Suprise and blow our socks off. If its that good then intel has a lot of work to do but lets just wait for ces
> 
> I originally wanted the x3d but got tried of waiting to build new pc
> They might limit it to 7800x3d though
> 
> if there really is a 7950 version than that could surely impress





yzonker said:


> Luumi runs an Intel platform for 2 reasons.
> 
> 1) It kicks butt in some 3DMark benches like the TS CPU test.
> 2) He got that pre-production Z790 KP from EVGA, possibly for free.



How is the 5800X3D maxed out compared to 13900K maxed out?


----------



## Sheyster

J7SC said:


> ...yeah, I like to try that Master vbios also on my Giga-G-OC, though the Galax will be hard to beat.


Found it on the TPU unverified list:









Gigabyte RTX 4090 VBIOS
24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory
www.techpowerup.com


----------



## KingEngineRevUp

So some of you might remember my issue. When my memory temperatures drop below 30C, I will get a green screen if I have a memory OC applied. If temperatures are in their 40s or higher, I have no problem at all. But after I finish a game, I might forget to disable my memory OC... and my card cools down and will green screen eventually...

I found this thread:

https://www.reddit.com/r/NiceHash/comments/lm14m2

Someone wrote a script to apply their OC profile when they mine and to shut it off when they're not mining. Now my question is: how can I write a script like this, but using VRAM temperatures in the IF/ELSE statements?

What's the easiest and best way we can do this?


----------



## J7SC

yzonker said:


> I wonder if Luumi boosted the VRAM voltage for this run. It's a whopping +2300!!!!!!!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 615 in Speed Way
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 10}
> 
> 
> 
> 
> www.3dmark.com


Luumi is one of the top guys and I subscribe to his YT channel. To have VRAM at 1600 (using the 3DM reading) at -35 C, I bet that card is hard-modded and has a heater...interestingly, it says 'NVidia' as the bios vendor identifier; might even have 'custom' or at least highly-binned Gbps VRAM chips. Luumi has not put anything up on his channel yet re. 4090 LN2...

He's still short of my best Speedway run though unsubbed, no artifacts  really  I mean it


----------



## yzonker

KingEngineRevUp said:


> Yeah, well... If you water cool, say goodbye to your +1500 Mhz on your memory when your system is idling. At least for me. If the memory gets chilled to 30C and lower, it will lose OC headroom. It can't handle OCs that well. It seems to like to be a little on the warmer side.


Which pads did you use?


----------



## KingEngineRevUp

yzonker said:


> Which pads did you use?


I used the supplied EKWB ones. I did some calculations and found a potential candidate to replace the pads with to get higher deltas here.









[Official] NVIDIA RTX 4090 Owner's Club
If you guys hate math, please skip this post immediately. So lets try and think of a way to increase our memory temperatures for better OC while having a water block. For my specific block during gaming loads: Delta T = 22K, Thermal Conductivity (Given by EKWB) = K = 3.5 W/mK, that means...
www.overclock.net





But an easier solution would be to write an MSI Afterburner script... that's just outside my expertise. Trying to see if it can be done.
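For anyone who wants to sanity-check the pad math from the linked post, steady-state conduction through a pad is just q = k·A·ΔT/t. Here's a quick sketch: the k = 3.5 W/mK and ΔT = 22 K figures are the ones quoted above, while the 14 mm x 12 mm module footprint and 1 mm pad thickness are my assumptions for illustration.

```python
# Steady-state heat conduction through a thermal pad: q = k * A * dT / t
#   k  - pad thermal conductivity [W/mK]
#   A  - contact area [m^2]
#   dT - temperature drop across the pad [K]
#   t  - pad thickness [m]
def pad_heat_flow_w(k_w_mk: float, area_m2: float, delta_t_k: float, thickness_m: float) -> float:
    return k_w_mk * area_m2 * delta_t_k / thickness_m

# Figures from the quoted post (k, dT) plus an assumed 14 mm x 12 mm
# GDDR6X package footprint under a 1 mm pad:
q = pad_heat_flow_w(3.5, 0.014 * 0.012, 22.0, 0.001)
print(f"{q:.1f} W per module")
```

Since q scales linearly with k and inversely with thickness, swapping in a lower-conductivity or thicker pad raises the delta across it, which is exactly why the linked post is hunting for a different pad to keep the VRAM warmer under water.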


----------



## yzonker

J7SC said:


> Luumi is one of the top guys and I subscribe to his YT channel. To have VRAM at 1600 (using the 3DM reading) at -35 C, I bet that card is hard-modded and has a heater...interestingly, it says 'NVidia' as the bios vendor identifier; might even have 'custom' or at least highly-binned Gbps VRAM chips. Luumi has not put anything up on his channel yet re. 4090 LN2...
> 
> He's still short of my best Speedway run though unsubbed, no artifacts  really  I mean it


I know, I'm subbed to his channel too. Been waiting for some 4090 content. And I agree, he's obviously got a modded card of some sort to hit that number, based on what we've seen with stock cards. I assumed he was using an FE, but I suppose it could be some super special modded card with a different bios.

I've been hoping for the LN2 guys to post some vids that might give me a clue to fixing the cold bug on my card. I even added more pads over the VRM under the backplate to see if that would help, but nope...


----------



## dboom

KingEngineRevUp said:


> Your fans and pump just produce the perfect vibration mode to excite the screws out. That's a rare occurence!


The funny part is that all of them are EK


----------



## J7SC

yzonker said:


> I know, I'm sub'ed too to his channel. Been waiting for some 4090 content. And I agree, he's obviously got a modded card of some sort to hit that # based on what we've seen with stock cards. I assumed he was using an FE, but it could be some super special modded card though I suppose with a different bios.
> 
> I've been hoping for the LN2 guys to post some vids that might give me a clue as to fixing the cold bug on my card. I even added more pads over the VRM under the backplate to see if that would help, but nope...


...Luumi seems very close to KingPin re. working together...I really should see if I can find the GPUZ screen with the vbios ID for that Speedway run...


----------



## GRABibus

Brads3cents said:


> my claim is that the 13900k is faster than the 7950x by default via any review
> and that unlike with AMD, the intel can continue scaling performance with memory M/T
> this is very well known and understood
> The performance widens from faster to much faster
> 
> if memory didnt matter everyone would use the cheapest junkiest memory possible and not bother to even enable xmp
> 
> What is the Optimal RAM Speed for DDR5? - YouTube
> 
> 
> even in 4k with gpu limited scenarios there is a lot of performance gain at 8000 vs 6000
> DDR5 RAM (6000 MHz) vs DDR5 RAM (8000 MHz) || PC GAMES TEST || - YouTube
> 
> keep in mind thats in 4k if you want your mind blown see this test in 1080p
> 
> regardless the reviews are mostly comparing the 7950x to a 13900k running at a puny 6000M/T
> as you can clearly see the performance continues to scale and the reviews grossly underestimate the actual performance potential of the 13900k
> 
> so much so that the x3d cpus will need a to make up a large amount of ground just to match the 13900k with good memory


Of course 13900K is faster….
But, as it can run much higher DDR5 speeds than the 7950X, it has an additional advantage in gaming.

If you check this video, you will see that the 13900K and 7950X are very close as long as they both run the same RAM settings:


----------



## Nizzen

GRABibus said:


> Of course 13900K is faster….
> But, as he can run much higher DDR5 speeds than 7950X, he has again an advantage in gaming.
> 
> If you check this video, you will see that 13900K and 7950X are very closed as long as they run both same RAM settings :


Youtube?


----------



## yzonker

J7SC said:


> ...Luumi seems very close to KingPin re. working together...I really should see if I can find the GPUZ screen with the vbios ID for that Speedway run...


Doesn't appear to be on HWBot.


----------



## GRABibus

Nizzen said:


> Youtube?


Russian ?
Propaganda…😂


----------



## dboom

GRABibus said:


> Of course 13900K is faster….
> But, as he can run much higher DDR5 speeds than 7950X, he has again an advantage in gaming.
> 
> If you check this video, you will see that 13900K and 7950X are very closed as long as they run both same RAM settings :


MEGA L.O.L.
Run AMD for 2years at MAX and run Intel for 2years at MAX. For me AMD is garbage, my biggest mistake since TB1700++ @3200+. Not to mention uber expensive RAM for it. But WAIT, this is 4090RTX topic, so ..AMD/Intel fanboys, GET LOST!!!
I scored 11 536 in Speed Way with AMD CPU, L O L. get lost again. Never ever buy an AMD CPU! It is GREAT while running it with defaults.


----------



## GRABibus

dboom said:


> MEGA L.O.L.
> Run AMD for 2years at MAX and run Intel for 2years at MAX. For me AMD is garbage, my biggest mistake since TB1700++ @3200+. Not to mention uber expensive RAM for it. But WAIT, this is 4090RTX topic, so ..AMD/Intel fanboys, GET LOST!!!
> I scored 11 536 in Speed Way with AMD CPU, L O L. get lost again. Never ever buy an AMD CPU! It is GREAT while running it with defaults.


No fan boys here.
Just sharing.

Mistakes help to learn, so I am happy for you.


----------



## yzonker

KingEngineRevUp said:


> I used the supplied EKWB ones. I did some calculations and found a potential candidate to replace the pads with to get higher deltas here.
> 
> ("If you guys hate math, please skip this post immediately. So lets try and think of a way to increase our memory temperatures for better OC while having a water block. For my specific block during gaming loads: Delta T = 22K, Thermal Conductivity (Given by EKWB) = K = 3.5 W/mK, that means..." -- link preview, www.overclock.net)
> 
> But a easier solution would be to write a MSI script... That's just out of my expertise. Trying to see if it can be done.






The issue is that the script/app has to be running as admin to change GPU settings, so you'll get a UAC prompt every time unless you disable UAC. If you can live with the UAC prompts, you could just run it in a batch file with your game:

<Afterburner path>MSIAfterburner.exe -profile<#>

That's about the best I can come up with right now. The nvidia-smi commands require admin too, so no help there.


----------



## Blameless

Well, my Windforce works, and I was able to inject putty (with a 12 gauge luer lock dispensing needle) where Gigabyte omitted the last pads for the memory VRM, without having to dismantle the card.

Unfortunately, I need to dig out my EEPROM programmer, because the power went out while I was erasing and reflashing the firmware on my test bench's motherboard...


----------



## yzonker

GRABibus said:


> Of course 13900K is faster….
> But, as he can run much higher DDR5 speeds than 7950X, he has again an advantage in gaming.
> 
> If you check this video, you will see that 13900K and 7950X are very closed as long as they run both same RAM settings :


Somebody came up with some benchmark #'s. lol

Anyway, the result he has is confusing though. He's showing 409 fps average at 1080p, which seems a bit high relative to what we've seen in the SotTR bench thread.


----------



## yzonker

Blameless said:


> Well, my Windforce works, and I was able to inject putty (with a 12 gauge luer lock dispensing needle) where Gigabyte omitted the last pads for the memory VRM without having to dismantle the card.
> 
> Unfortunately, I need to dig out my EEPROM programmer as my power went out as I was erasing and reflashing the firmware on my test bench's motherboard...


Oh that sucks. My greatest fear. I do have a UPS on mine, but that's not a 100% guarantee...


----------



## ttnuagmada

So am I reading you guys right? My ****ty VRAM OC might be because I used good thermal pads and am watercooling?


----------



## yzonker

ttnuagmada said:


> So am I reading you guys right? My ****ty VRAM OC might be because I used good thermal pads and am watercooling?


Yup, if you used high W/mK pads in particular, you're likely losing some. On air I could do +1800 24/7, but on water and good pads (thermal putty in my case) I can only do 1600-1700 depending on water temp. When the room is very cool (18-20C) and the machine has been off it will sometimes black screen on the desktop at +1600 even.


----------



## Xdrqgol

dboom said:


> MEGA L.O.L.
> Run AMD for 2years at MAX and run Intel for 2years at MAX. For me AMD is garbage, my biggest mistake since TB1700++ @3200+. Not to mention uber expensive RAM for it. But WAIT, this is 4090RTX topic, so ..AMD/Intel fanboys, GET LOST!!!
> I scored 11 536 in Speed Way with AMD CPU, L O L. get lost again. Never ever buy an AMD CPU! It is GREAT while running it with defaults.


How old are you ? 12 ? 
To have such a disturbing post - are you ok? 👀


----------



## Xdrqgol

yzonker said:


> Yup, if you used high W/mK pads in particular, you're likely losing some. On air I could do +1800 24/7, but on water and good pads (thermal putty in my case) I can only do 1600-1700 depending on water temp. When the room is very cool (18-20C) and the machine has been off it will sometimes black screen on the desktop at +1600 even.


It happened to me too: at around 25C I'd get a black screen trying to run +1700, and the next time, when the memory was warmer (40C), I went all the way up to +1900 without issues.
Never expected it to work like that …


----------



## ttnuagmada

yzonker said:


> Yup, if you used high W/mK pads in particular, you're likely losing some. On air I could do +1800 24/7, but on water and good pads (thermal putty in my case) I can only do 1600-1700 depending on water temp. When the room is very cool (18-20C) and the machine has been off it will sometimes black screen on the desktop at +1600 even.


I didn't do much testing beyond making sure the card ran before putting the block on it and throwing it in my loop, so I don't know what it was doing on air. I artifact/crash at anything above +1200 with it water cooled, though. I used the Thermaltake pads. The highest memory temp I've seen is 50C, and that's with the fans down low. With the fans cranked while I was benchmarking, I don't think VRAM got much over 40.


----------



## yzonker

ttnuagmada said:


> I didnt do much testing beyond making sure the card ran before putting the block on it and throwing it in my loop, so I don't know what it was doing on air. I artifact/crash at anything above +1200 though with it water cooled. I used the thermaltake pads. Highest memory temps I've seen is 50C, though this is with fans down low. With fans cranked while I was benchmarking I dont think VRAM got much over 40.


You could remount with cheap pads, but the cold-soak problem will still be there. That's where the fan stop on the air coolers really pays off. 

And you probably wouldn't be more than +1300-1400 at best with the air cooler anyway.


----------



## KingEngineRevUp

ttnuagmada said:


> So am I reading you guys right? My ****ty VRAM OC might be because I used good thermal pads and am watercooling?


Welcome to the club 

Did you test when you were on air or did you go straight to water cooling?

Just saw that you didn't. Well there's really no way to know then.


----------



## ttnuagmada

KingEngineRevUp said:


> Welcome to the club
> 
> Did you test when you were on air or did you go straight to water cooling?


I didn't do any serious testing before water cooling it. Mostly just made sure it worked.


----------



## tubs2x4

KingEngineRevUp said:


> Your fans and pump just produce the perfect vibration mode to excite the screws out. That's a rare occurence!


That’s why they make Loctite


----------



## Nizzen

GRABibus said:


> Russian ?
> Propaganda…😂


One of the few tubers that actually tries to use tuned memory 
I've talked with them a few times, and they are great people.


----------



## bmagnien

Xdrqgol said:


> How is the 5800X3D maxed out compared to 13900K maxed out?


The 5800X3D ‘maxed out’ is technically an undervolt lol (-30 all-core voltage offset) paired with 3800C14 or better B-die at 1:1 IF. Unless you have a mobo that supports BCLK overclocking, in which case you can trade overall system stability for a couple percent of clock. I don’t think it beats a truly maxed 13900K paired with top-binned Hynix in anything, but in some cherry-picked games (MSFS?) it can come close.

Re: a previous post about CPU bottlenecks in recent games even with a 5800X3D (and I imagine even a fully maxed 13900KF) - I strongly agree. In Witcher 3, Callisto Protocol, and Spider-Man, ray tracing is extremely CPU heavy. Thankfully DLSS 3 frame gen bypasses this for the most part, but with Callisto being an AMD-sponsored game, it isn’t an option there.

In other games like MSFS, it’s not ray tracing but other CPU-intensive workloads. I can run everything maxed except for ‘Terrain Detail’ - anything over 200 there causes a massive CPU bottleneck (GPU usage dips to 40%), and not even frame gen can overcome this in my experience on a 5800X3D.

In summary - with the release of the 4090 and anything beyond, the notion of the CPU being unimportant at high-res gaming has definitively been buried. There isn’t a CPU on the market that doesn’t become a bottleneck under the right conditions with this card, even at 4K. A blessing and a curse 😆


----------



## alasdairvfr

KingEngineRevUp said:


> So some of you might remember my issue. When my memory temperatures drop below 30C, I will get a green screen if I have a memory OC applied. If temperatures are in their 40s or higher, I have no problem at all. But after I finish a game, I might forget to disable my memory OC... and my card cools down and will green screen eventually...
> 
> I found this thread:
> 
> __
> https://www.reddit.com/r/NiceHash/comments/lm14m2
> 
> Someone wrote a script to apply their OC profile when they mine and to shut off their profile when they're not mining. Now my question is... how can I write a script like this but using VRAM temperatures in the IF ELSE statements?
> 
> What's the easiest and best way we can do this?





yzonker said:


> The issue is that the script/app has to be running as admin to change GPU settings, so you'll get a UAC prompt every time unless you disable UAC. If you want to live with the UAC prompts, then you could just run in a batch file with your game,
> 
> <Afterburner path>MSIAfterburner.exe -profile<#>
> 
> About the best I can come up with right now. nvidia-smi command require admin also, so no help there.


A PowerShell script tied to Task Scheduler could do the trick. You might have to play around with it, but there are ways to elevate its privileges to bypass UAC; that opens a little hole in security, but it's not a big surface area.

You could try polling nvidia-smi, or integrate with HWiNFO64 somehow. That script has it sleeping for 30000s, which means it will almost certainly miss a drop in memory temps if you stop running the GPU sooner than that. Alternatively, lower tech but probably the easiest way would be to keep a list of games you play: if one of those is seen running, sleep 60 (let the GPU warm up a bit), then apply the appropriate profile for that game. Poll every so often (once every few minutes), and when the game is no longer running, set profile 1 or whatever the safe memory clock at idle is.
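One way to sketch the Task Scheduler elevation trick (the task name, install path and profile number below are placeholders, and Afterburner's documented `-Profile<n>` switch is assumed): register the profile switch once as a highest-privilege scheduled task from an admin prompt, and afterwards any non-elevated script can trigger it without a UAC prompt:

```shell
:: One-time setup, from an elevated prompt: register the elevated task.
:: "/RL HIGHEST" makes the task run with highest privileges; "/SC ONCE"
:: with a dummy start time means it only runs when triggered manually.
schtasks /Create /TN "AB_Profile2" /RL HIGHEST /SC ONCE /ST 00:00 /F ^
    /TR "\"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe\" -Profile2"

:: Later, from any script or shortcut, no UAC prompt:
schtasks /Run /TN "AB_Profile2"
```

One registered task per Afterburner profile you want to switch to; the security hole is limited to those exact commands.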


----------



## J7SC

yzonker said:


> You could remount with cheap pads but the cold soak problem will still be there. That's where the fan stop on the air coolers really pays off.
> 
> And you probably wouldn't be more than 1300-1400 at best with the air cooler anyway.


On my 4090 (Giga-G-OC + Bykski), I did my by now usual combo of Gelid GC Extreme for the core and TG-10 thermal putty for the VRAM >> that has proven to work great on prior-gen cards, but w/ 4090 VRAM, well, you know...🥴. That said, when VRAM is cold (high 20s C) I can still add + 1418 on the slider, but VRAM has to be around 48 C to reach +1563 and higher on the slider. That's not the end of the world though given the solid core performance.

---
On the earlier CPU 'discussions', I have high hopes re. 7950X3D (if it indeed comes to market) and I plan to pair it with 2x 32GB 6400 DDR5 - that should work great w/ 200 MB of cache. The problem (for desktop users) with AMD lately has been their success, especially in the lucrative workstation and enterprise server markets. When I got my TR 2950X (still running strong after _four years of max oc w/_340W, btw) they were reasonably priced for their performance, as were the corresponding motherboards. But then success came in spades for AMD, so they never released chips like the TR 5990X (see Skatterbencher's test below). That's basically 4x 5950X, but with quad channel RAM as the cherry on top (see my lowly 2950X's quad channel Aida below), so with Zen3 instead of Zen2 cores / IMC, the 5990X would have been killer.

But since Intel had nothing in the HEDT market since the 10980XE, the 5990X never came to market, either. Instead, AMD released the hyper-expensive Threadripper Pro 5995WX w/ eight-channel RAM (which apparently is overclockable on some boards ) for workstations...I wonder what they will do with the upcoming Zen4 core based Threadripper CPUs, since their AMD Epyc counterparts have 12 channel DDR5 RAM with Aida reads as high as 600 GB/s. I like Intel just as much as AMD (and still have more Intel than AMD CPUs in use), but the Intel P+E core combo is a bit tricky for some apps (beyond modern gaming and benching). So with Intel, I probably have to wait for the Sapphire Rapids-64 L / W-2400 desktops.


----------



## yzonker

J7SC said:


> On my 4090 (Giga-G-OC + Bykski), I did my by now usual combo of Gelid GC Extreme for the core and TG-10 thermal putty for the VRAM >> that has proven to work great on prior-gen cards, but w/ 4090 VRAM, well, you know...🥴. That said, when VRAM is cold (high 20s C) I can still add + 1418 on the slider, but VRAM has to be around 48 C to reach +1563 and higher on the slider. That's not the end of the world though given the solid core performance.
> 
> ---
> On the earlier CPU 'discussions', I have high hopes re. 7950X3D (if it indeed comes to market) and I plan to pair it with 2x 32GB 6400 DDR5 - that should work great w/ 200 MB of cache. The problem (for desktop users) with AMD lately has been their success, especially in the lucrative workstation and enterprise server markets. When I got my TR 2950X (still running strong after _four years of max oc w/_340W, btw) they were reasonably priced for their performance, as were the corresponding motherboards. But then success came in spades for AMD, so they never released chips like the TR 5990X (see Skatterbencher's test below). That's basically 4x 5950X, but with quad channel RAM as the cherry on top (see my lowly 2950X's quad channel Aida below), so with Zen3 instead of Zen2 cores / IMC, the 5990X would have been killer.
> 
> But since Intel had nothing in the HEDT market since the 10980XE, the 5990X never came to market, either. Instead, AMD released the hyper-expensive Threadripper Pro 5995WX w/ eight-channel RAM (which apparently is overclockable on some boards ) for workstations...I wonder what they will do with the upcoming Zen4 core based Threadripper CPUs, since their AMD Epyc counterparts have 12 channel DDR5 RAM with Aida reads as high as 600 GB/s. I like Intel just as much as AMD (and still have more Intel than AMD CPUs in use), but the Intel P+E core combo is a bit tricky for some apps (beyond modern gaming and benching). So with Intel, I probably have to wait for the Sapphire Rapids-64 L / W-2400 desktops.


No, a loss of 100-200 on the VRAM is next to nothing for gaming. I honestly don't care about that much at all. And I'm able to bench at +1800 again with the heater which works fine and I don't have idle black screen problems either when I'm using it.

Yup, the hybrid architecture kinda sucks. I've definitely had problems with it, mostly in benchmarks. You probably saw the games we have been playing with Firestrike by turning off somewhat random numbers of e-cores. I've even had problems with the TS CPU bench running inconsistently. Ironically Win10 is very consistent but always a little slower (like 300-500 pts).

I'm not sure if that's really an issue with Intel's architecture or Microsoft just hasn't ever gotten Win11 working correctly. 22H2 is totally borked for running TS CPU last time I tried it. Been staying on 21H2 for that reason.


----------



## Gking62

So, using nvflash for the first time (yes, I said that): is this the proper syntax for the Galax vbios? I backed out before flashing to check here first. Thanks to all.

C:\nvflash_5.792>nvflash64 --protectoff
NVIDIA Firmware Update Utility (Version 5.792.0)
Copyright (C) 1993-2022, NVIDIA Corporation. All rights reserved.

Setting EEPROM protection complete.

C:\nvflash_5.792>nvflash64 -6 252087.rom
NVIDIA Firmware Update Utility (Version 5.792.0)
Copyright (C) 1993-2022, NVIDIA Corporation. All rights reserved.

Checking for matches between display adapter(s) and image(s)...

Reading EEPROM (this operation may take up to 30 seconds)

WARNING: Firmware image PCI Subsystem ID (1B4C.A06C)
does not match adapter PCI Subsystem ID (1043.889C).

You are intending to override PCI Subsystem ID.
Are you sure you want to continue?
Press 'y' to confirm (any other key to abort): tNothing changed!



ERROR: User aborted!

C:\nvflash_5.792>


----------



## Gking62

yzonker said:


> Yup, if you used high W/mK pads in particular, you're likely losing some. On air I could do +1800 24/7, but on water and good pads (thermal putty in my case) I can only do 1600-1700 depending on water temp. When the room is very cool (18-20C) and the machine has been off it will sometimes black screen on the desktop at +1600 even.


Regrettably, I'm in the same boat: I made sure all was in working order and then installed my block. On top of that, I never gave any thought to using the stock EK pads on the mem chips. I may redo the pads at some point, but here is a pic of the best I can do without artifacting during the PR stress test...


----------



## Helmbo

Sheyster said:


> 8000+ MHz memory isn't a given, unless you have an ASUS Apex Z790 (hard to get), good RAM and a good CPU (IMC). Don't count on getting to 8000 with a Maximus Extreme or Hero. Many have tried and failed, a few have gotten lucky.


Well... as fast RAM as possible then  . But I'm waiting for CES to see what AMD has to show with their upcoming X3D lineup. If it flops... well, 13900KS I suppose.


----------



## elbramso

dboom said:


> MEGA L.O.L.
> Run AMD for 2years at MAX and run Intel for 2years at MAX. For me AMD is garbage, my biggest mistake since TB1700++ @3200+. Not to mention uber expensive RAM for it. But WAIT, this is 4090RTX topic, so ..AMD/Intel fanboys, GET LOST!!!
> I scored 11 536 in Speed Way with AMD CPU, L O L. get lost again. Never ever buy an AMD CPU! It is GREAT while running it with defaults.


This Speedway score is fake - just saying


----------



## leonman44

yzonker said:


> Yup, if you used high W/mK pads in particular, you're likely losing some. On air I could do +1800 24/7, but on water and good pads (thermal putty in my case) I can only do 1600-1700 depending on water temp. When the room is very cool (18-20C) and the machine has been off it will sometimes black screen on the desktop at +1600 even.


So did your overall gaming performance or benchmark scores actually decrease vs air cooling?


----------



## Nizzen

yzonker said:


> No, a loss of 100-200 on the VRAM is next to nothing for gaming. I honestly don't care about that much at all. And I'm able to bench at +1800 again with the heater which works fine and I don't have idle black screen problems either when I'm using it.
> 
> Yup, the hybrid architecture kinda sucks. I've definitely had problems with it, mostly in benchmarks. You probably saw the games we have been playing with Firestrike by turning off somewhat random numbers of e-cores. I've even had problems with the TS CPU bench running inconsistently. Ironically Win10 is very consistent but always a little slower (like 300-500 pts).
> 
> I'm not sure if that's really an issue with Intel's architecture or Microsoft just hasn't ever gotten Win11 working correctly. 22H2 is totally borked for running TS CPU last time I tried it. Been staying on 21H2 for that reason.


If you had problems with P-cores + E-cores, you most likely had problems with 8+ "normal" cores too 
Many games/benchmarks don't like more than 16 threads, and perform worse with that many CPU threads. 
I know from using a 7980XE 18-core for 6 years, and Threadripper 😅


----------



## elbramso

leonman44 said:


> So did your overall gaming performance actually decrease or benchmark scores vs air cooling ?


I can't answer for him, but my score got slightly better on water. Mem OC went down, but core OC went up. Although I have to say that Speed Way really favors mem over core. My best Speed Way run was with a water temp of 23C.


----------



## J7SC

Nizzen said:


> If you had problems with p-cores + e-cores, you had most likely had problems with 8 + "normal" cores too
> Many games/benchmarks don't like more than 16 threads, and is performing worse with that many cpu threads.
> I know because of using 7980xe 18 core for 6 years and threadripper 😅


...knock yourself > out  ...then get up with Process Lasso


----------



## Nizzen

J7SC said:


> ...knock yourself > out  ...then get up with Process Lasso


Yes, that's why this program was made! Great piece of software  Fixed Threadripper and gaming 😁


----------



## J7SC

Nizzen said:


> Yes, that's why this program was made! Great peace of software  Fixed Threadripper and gaming 😁


...best to just 'lasso' those e-cores and lock them away if you want to go fast with p-cores😁


----------



## leonman44

elbramso said:


> can't answer for him but my score got slightly better on water. Mem oc went down but core oc went up. Although I have to say that speedway really favors mem over core. My best speedway run was with a water temp of 23c.


What I am trying to figure out is whether the waterblock is worth it. I mostly game, and if I am not going to gain anything, or will even lose performance, then it's best to just keep the air cooler on and save the almost 400 euros that the water cooling components cost.


----------



## elbramso

leonman44 said:


> What I am trying to figure out if the waterblock is worth it , in general I am gaming but if I am not going to gain anything or even lose then it’s best to just keep the air cooler on and keep that almost 400 euros that the water cooling components costs.


Which card do you have? I'm using the Alphacool block for 159€ and it does a good job. After all, the card is more silent and draws less power when connected to my loop. Further, the undervolt setup I use for games like Cyberpunk is way more consistent. Although the fans on the stock cooler do a good job and are quite silent below 70% load, my loop is big enough to run even quieter.


----------



## leonman44

elbramso said:


> which card do u have? I'm using the Alphacool Block for 159€ and it does a good job. After all the card is more silent and does take less power when connected to my loop. Further my undervolt setup that I use tof games like cyberpunk is way more consistent. Although the fans from the stock cooler do a good job and are quite silent when below 70% load, my loop is big enough to operate more silent


I have the Asus TUF OC model. I saw that waterblock too, but my system is mostly Aura Sync, and EK supports that. My goal is to get as much as I can from my card, basically overclock and power limit unlock; I don't care about noise or efficiency. I have a dual-rad custom loop, but I am blasting the fans to keep the water temp within 10C of my ambient temp. 

I always liked waterblocks because they reduce the temps so much more, and I was able to keep a stable clock speed without throttling. 

But most people seem to say that memory OC gives more performance than core this gen, so in the end, do we get a performance decrease by using a waterblock?


----------



## J7SC

leonman44 said:


> I have the Asus Tuf Oc model , I saw that waterblock too but my system is mostly aura sync and ek supports that , my goal is to get as much as I can from my card basically overclock and power limit unlock I don’t care about noise or efficiency. I have a dual rad custom loop but I am blasting the fans to keep the water temp deference below 10c from my ambient temp.
> 
> I always liked the waterblocks because they reduce so much more the temps and I was able to keep a stable clock speed without throttling.
> 
> But most people seems to say that memory oc gives more performance than core in this gen , so in the end do we have a performance decrease by using a waterblock?


While VRAM performance plays an elevated role with the 4090s compared to earlier gens, core speed certainly still matters a lot. FYI, I slapped a $122 Bykski block on my card (RGB works fine w/Asus Aura btw) because I am simply uncomfortable with a 600W (or 670W+ w/certain vbios) air-cooled card in a case, given its heat output and also fan noise. However, I already had a 1320x64 triple-core rad loop set up for a water-cooled GPU anyway. Games such as Cyberpunk 2077 w/ray tracing on will really heat things up, even at 540W or so.

All that said, the particular GDDR6X VRAM chips used by the 4090s seems to perform best around 48 C and up (which is easily reached on mine even w/watercooling). What can happen though is that when you really maxed your VRAM speed for gaming, it can freeze your monitor if your GPU goes into idle for a longer period and the VRAM cools down too much (we're just talking about max oc here). In my case, VRAM performance actually improved with water-cooling due to a hotspot issue traced back to a bad mount. Now, even with 'cold' VRAM (30 C or below), I can still add +1418 on the MSI AB slider for VRAM, but once past 48 C, it is more like +1563, and higher still with 60 C plus. But by the time VRAM reaches that temp, your core will take speed bin hits due to temps anyway. In short, nothing wrong and a lot of things right with water-cooling your 4090 - just don't go out of your way with high W/mK custom thermal pads for the VRAM.


----------



## leonman44

J7SC said:


> While VRAM performance plays an elevated role with the 4090s compared to earlier gens, core speed certainly still matters a lot. FYI, I slapped a $122 Bykski block on my card (RGB works fine w/Asus Aura btw) because I am simply uncomfortable with a 600W (or 670W+ w/certain vbios) air-cooled card in a case, given its heat output and also fan noise. However, I already had a 1320x64 triple-core rad loop set up for a water-cooled GPU anyway. Games such as Cyberpunk 2077 w/ray tracing on will really heat things up, even at 540W or so.
> 
> All that said, the particular GDDR6X VRAM chips used by the 4090s seems to perform best around 48 C and up (which is easily reached on mine even w/watercooling). What can happen though is that when you really maxed your VRAM speed for gaming, it can freeze your monitor if your GPU goes into idle for a longer period and the VRAM cools down too much (we're just talking about max oc here). In my case, VRAM performance actually improved with water-cooling due to a hotspot issue traced back to a bad mount. Now, even with 'cold' VRAM (30 C or below), I can still add +1418 on the MSI AB slider for VRAM, but once past 48 C, it is more like +1563, and higher still with 60 C plus. But by the time VRAM reaches that temp, your core will take speed bin hits due to temps anyway. In short, nothing wrong and a lot of things right with water-cooling your 4090 - just don't go out of your way with high W/mK custom thermal pads for the VRAM.


I see, but won't the same happen on air as well? Eventually the VRAM should cool down on air too, just more slowly than on a watercooled card. I thought about using even lower W/mK thermal pads, but the same thing would happen again; it's only a matter of time.
Maybe only poor thermal pads, plus forcing maximum performance from Nvidia's control panel, will keep the temps up all the time.


----------



## KingEngineRevUp

leonman44 said:


> but won’t the same happen on air as well ?


Depends. Water wicks heat away fast, which is why the memory runs low. On air, with the fan curve turned down and the memory not sitting directly under the fins, temperatures will be higher even when idling.

I'm not on air anymore, but this is someone else's idle GPU-Z sensor readout. Memory is at 48C in this instance.


----------



## leonman44

KingEngineRevUp said:


> Depends. Water wicks away heat fast so that's why the memory is low. On air, with a fan curve tuning down and the memory not being directly under fins, temperatures will be higher even when idling.
> 
> I'm not on air anymore, but this is someone else idling GPU-Z sensor. Memory is at 48C here in this isntance.


OK, now I'm torn even further, as there's clearly no clear answer whether the core gain/stability under water is more beneficial for gaming than air cooling's higher VRAM speeds. To watercool or not to watercool? That's the big question.


----------



## Helmbo

leonman44 said:


> Ok now i am torn apart even further as theres clearly not an clear answer if the core gain/stability under water is more beneficial than aircooling higher vram speeds in gaming , to watercool or not to watercool ? Thats the big question



Honest opinion, and I'm a big watercooling idiot  - I see no reason to watercool this card, except for the sake of the hobby. I bought a block even though this card doesn't need one, but I had already invested a lot into my loop, so a 170$ Alphacool waterblock was just for the sake of tinkering with my hardware.


----------



## leonman44

Helmbo said:


> Honest opinion. And im a big watercool idiot  - i see no reason to watercool this card, unless just for the sake of the hobby. I bought a block, even though not needed for this card, but i already investet alot into my loop, so a 170$ Alphacool waterblock was just for the sake of me tinkering with my hardware.


True answer. I have invested way too much too, and god, I hate hard-tube watercooling. It's very hard to change components anymore, or even just repaste my CPU, and the distro plate in the O11 XL is a nightmare; I can't drain my loop properly, it's almost impossible. I paid a lot for the parts so I keep using them, but they are not practical. On air I would have had the card installed in a few minutes, or with soft tubing and a regular reservoir within a day. Now I need a whole week to rework the whole system.

But without watercooling, would we be real overclockers? I guess not; there's no way to blast a lot of voltage through the current components without a good custom loop.


----------



## Helmbo

leonman44 said:


> True answer , i have invested way too much too and god i hate the hard tubing watercooling , its very hard to change components anymore or even just paste my cpu and the distro plate on the od11 xl is a nightmare i cant drain my loop properly its almost impossible. I paid a lot for them so i continue to use them but they are not practical if i was aircooled i would have installed the card already in a few minutes or with soft tubing and a regular reservoir just within a day. Now i need a whole week to rework the whole system.
> 
> But without watercooling would we be real overclockers? i guess no , theres no way to blast a lot of voltage on the current components without a good custom loop.


I quit hard tubing and RGB a long time ago... I went with functionality over looks. So soft tubing, and normal distilled water here.


----------



## yzonker

J7SC said:


> ...knock yourself > out  ...then get up with Process Lasso





Nizzen said:


> Yes, that's why this program was made! Great peace of software  Fixed Threadripper and gaming 😁


Except I've tried several combinations before, and having everything enabled is always the fastest. And it had worked up until yesterday, so something else changed. 

I've tried:

Process Lasso restrict to 8P+8E (HT on): slower
Disable 8 e-cores: slower
Disable 8 e-cores and HT off: slower


----------



## yzonker

elbramso said:


> can't answer for him but my score got slightly better on water. Mem oc went down but core oc went up. Although I have to say that speedway really favors mem over core. My best speedway run was with a water temp of 23c.


Yes, the increase in core clock outweighs the loss in VRAM clock in most games/benches. It's just that the gains are smaller than in past generations.


----------



## leonman44

Are these pads bad enough to save my RAM OC? Or do I need to look for even worse ones?










EKWB pads are rated at 3.5 W/mK vs. these at 1.5 W/mK, and they'll hopefully perform even worse than that in practice. Is this a good idea?
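For a rough sense of why a lower-conductivity pad keeps the memory warmer, here's a back-of-the-envelope Fourier's-law sketch in Python. The per-chip power, chip footprint, and pad thickness are assumed illustration values, not measurements:

```python
# Rough Fourier's-law estimate of the temperature drop across a thermal pad.
# Per-chip power, footprint, and pad thickness below are assumptions.
def pad_delta_t(power_w, area_m2, thickness_m, k_w_per_mk):
    """Temperature difference (K) across a pad conducting `power_w` watts."""
    return power_w * thickness_m / (k_w_per_mk * area_m2)

# Assume ~3 W per GDDR6X chip, a 14 mm x 12 mm footprint, and a 1 mm pad.
power, area, thickness = 3.0, 0.014 * 0.012, 1e-3
for k in (3.5, 1.5):
    print(f"{k} W/mK -> dT ~ {pad_delta_t(power, area, thickness, k):.1f} K")
```

Under these assumptions the 1.5 W/mK pad more than doubles the temperature drop across the pad (~11.9 K vs. ~5.1 K), i.e. the memory sits noticeably warmer for the same heat load.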


----------



## GRABibus

Nizzen said:


> If you had problems with p-cores + e-cores, you had most likely had problems with 8 + "normal" cores too
> Many games/benchmarks don't like more than 16 threads, and is performing worse with that many cpu threads.
> I know because of using 7980xe 18 core for 6 years and threadripper 😅


This is why 7800X3D will be a much better choice than 7950X3D for gaming.


----------



## StreaMRoLLeR

> core speed certainly still matters a lot.


@J7SC No it's not, unless you have a golden bin that can do 3100 MHz+ on the Galax BIOS. There is only a 1 fps difference in Dying Light 2 between 2820 MHz effective and 3030 MHz requested. Also:


yzonker said:


> Except I've tried several combinations before, but having everything enabled is always the fastest. And it has worked up until yesterday, so something else changed.
> 
> I've tried:
> 
> Process Lasso restrict to 8P+8E (HT on): slower
> Disable 8 e-cores: slower
> Disable 8 e-cores and HT off: slower


Do you get a reduction in latency (ns, in AIDA/Intel MLC) from running the 13900K as 8P+8E?


----------



## alasdairvfr

I whipped up a little PowerShell script for people to try out. It will change your Afterburner profile depending on which apps/games you are running, and put it back to your "idle" profile when the games aren't running. This should help folks struggling with cold-memory issues who don't want to forget and come back to a crashed system. You can integrate it with Task Scheduler, or just run the script in PowerShell and minimize it.

I looked into polling the memory temperature directly instead of checking for running apps, but there is no easy way: nvidia-smi doesn't expose VRAM temp, and HWiNFO64 would work but requires the Pro version for CLI integration (I couldn't get the shared-memory function to work easily either, and it's limited to 12h in the free version). This method should work and allows you to have multiple lists of apps, with the appropriate profile set automatically if an app from a list is running.

I'm not sure if there is much/any impact on performance from having the profile set itself repeatedly; this was just a fun thing I tried based on a post yesterday from @KingEngineRevUp 

The PowerShell script must be run as administrator, or if using Task Scheduler make sure to run it with highest privileges, otherwise you will get an Afterburner UAC popup every few minutes.



Code:


# Usage: Replace the values of applicationList1 and applicationList2 with the games or compute applications that you want to overclock
# The process name is usually the application name without .exe, but check Task Manager's Details tab for the actual process name
# There are 2 app lists in case you have different OC profiles
# You can add more than 2 per list if need be; these are just placeholders using common apps for testing
# Profile1 should be your safe idle profile, otherwise change '-profile1' to the profile of your choice in the last section

# Set the path to the MSI Afterburner executable
$msiAfterburnerPath = "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe"

# Set the names of the applications that you want to monitor
$applicationList1 = @("GPU-z", "cmd")
$applicationList2 = @("cpuz", "chime")

while ($true) {
    # Queries running processes against list1 and sets OC profile2 after a 2-minute wait for the memory to warm up
    if (Get-Process -Name $applicationList1 -ErrorAction SilentlyContinue) {
        Start-Sleep -Seconds 120
        Start-Process $msiAfterburnerPath -ArgumentList '-profile2'
    }
    # Queries running processes against list2 and sets OC profile3 after a 2-minute wait for the memory to warm up
    elseif (Get-Process -Name $applicationList2 -ErrorAction SilentlyContinue) {
        Start-Sleep -Seconds 120
        Start-Process $msiAfterburnerPath -ArgumentList '-profile3'
    }
    # If none of the specified GPU-intensive programs are running, set OC profile1
    else {
        Start-Process $msiAfterburnerPath -ArgumentList '-profile1'
    }
    Start-Sleep -Seconds 60
}
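The script's decision logic can also be factored into a pure function, sketched here in Python (the function name is ours and the lists mirror the placeholders above), which makes it easy to test the profile selection separately before wiring it up to anything that launches Afterburner:

```python
# Hypothetical pure version of the script's decision logic: given the set of
# currently running process names, pick which Afterburner profile to apply.
# List contents and profile numbers mirror the PowerShell placeholders.
APP_LIST_1 = {"GPU-z", "cmd"}   # -> profile2
APP_LIST_2 = {"cpuz", "chime"}  # -> profile3

def pick_profile(running: set) -> str:
    """Return the Afterburner profile argument for the current process set."""
    if running & APP_LIST_1:
        return "-profile2"
    if running & APP_LIST_2:
        return "-profile3"
    return "-profile1"  # safe idle profile

print(pick_profile({"cmd", "explorer"}))  # -profile2
print(pick_profile({"chime"}))            # -profile3
print(pick_profile({"explorer"}))         # -profile1
```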


----------



## yzonker

StreaMRoLLeR said:


> @J7SC No its not unless you have golden bin which can do 3100mhz plus on Galax Bios. There is only 1 fps difference in DyingLight 2 agaisnt 2820 effective vs 3030 desired. Also
> 
> Do you get reduction in NS ( aida,intel mlc) for using 8P-8E 13900K ?


Don't know, I haven't tested that.


----------



## Gking62

Flashed Galax vbios on my Strix 4090 OC this am, all 5x5. Thanks to @yzonker for posting the link (in sig)...


----------



## lucasj1974

bmagnien said:


> *The 5800x3d ‘maxed out’ is technically an undervolt lol (-30 all core voltage offset) paired with 3800c14 or better bdie at 1:1 IF. Unless you have a mobo that supports bclk overclocking in which case you can gain a couple % clock increase in return for overall system stability. I don’t think it beats a truly maxed 13900k paired with top binned Hynix, in anything, but in some cherry picked games (MSFS?) it can come close.*
> 
> Re: a previous post about CPU bottlenecks in recent games even with a 5800x3D ( and i imagine even a fully maxed 13900kf) - I strongly agree. In Witcher 3, Callisto Protocol, and SpiderMan ray-tracing is extremely cpu heavy - thankfully dlss3 frame gen bypasses this for the most part, but with callisto being an AMD sponsored game it isn’t an option there.
> 
> In other games like MSFS, it’s not raytracing but other cpu intensive workloads. I can run everything maxed except for ‘Terrain Detail’ - anything over 200 there causes massive cpu bottleneck (gpu usage dips to 40%), and not even frame gen can overcome this in my experience on a 5800x3d.
> 
> in summary - with the release of the 4090 and anything beyond, the notion of the cpu being unimportant at high res gaming has definitively been buried. There isn’t a cpu on the market that doesn’t become a bottleneck under the right conditions with this card, even at 4K. A blessing and a curse 😆


To me, this data points to the 7000X3D chips being easily capable of beating Intel's current best offerings (even with tuned memory), simply because last gen's X3D chip is so competitive in gaming... we will soon find out.

I rolled the dice on AMD....


----------



## coelacanth

J7SC said:


> ...best to just 'lasso' those e-cores and lock them away if you want to go fast with p-cores😁
> View attachment 2591263


-250 C, is that liquid helium rather than LN2?


----------



## pt0x-

First real try to get some points with my strix. Galax bios is doing its job!

#71 speed way









I scored 11 156 in Speed Way


Intel Core i9-12900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com


----------



## Gking62

So I've tried several settings; the best I can do thus far is below. Aside from regrettably having pads that are too good on my memory chips (14.8 W/mK), is it my system RAM (Kingston 5600 64GB) that's holding me back, or a bad bin? Not sure what else I can do here.









I scored 10 843 in Speed Way


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 11




www.3dmark.com


----------



## yzonker

Gking62 said:


> So I've tried several settings, best I can do thus far (below) aside from regrettably having too good of padding on my mem chips (14.8 W/mK), is it my system ram (Kingston 5600 64GB) that's holding me back, bad bin? not sure what else I can do here.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 10 843 in Speed Way
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com


It's mostly the low VRAM OC that's holding you back, probably. Run it at 1440p or lower desktop resolution with only one monitor connected.


----------



## Sheyster

J7SC said:


> I have high hopes re. 7950X3D (if it indeed comes to market) and I plan to pair it with 2x 32GB 6400 DDR5 - that should work great w/ 200 MB of cache.


That would be a killer combo for sure. I will be watching as well. If this CPU actually lives up to expectations that may be the way to go for me as well.


----------



## yzonker

I ran the GotG benchmark to test core and VRAM OC, although the gains are somewhat less than they should be due to the game capping the last scene at my monitor's max refresh (120 Hz). Mem OC wins again, but by a small margin. Wish I could unlock the framerate. It shows a total of 5 fps gain, but I see about 8 fps during gameplay. Still, it at least shows how important memory OC is relative to core.

Defaults, 2815 core










Core OC only, 3045 mhz core










Mem OC only, +1700










Both core and mem, (3045 core, +1700 mem)


----------



## Sheyster

Gking62 said:


> So I've tried several settings, best I can do thus far (below) aside from regrettably having too good of padding on my mem chips (14.8 W/mK), is it my system ram (Kingston 5600 64GB) that's holding me back, bad bin? not sure what else I can do here.


Is +775 the most you can do on memory? That's really low, especially for a Strix.  Did you try it before installing the block, with the stock cooler?


----------



## Gking62

yzonker said:


> Mostly the low VRAM OC that is holding you back probably. Run it in 1440p or less desktop res. with only one monitor connected.











I scored 10 861 in Speed Way


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 11




www.3dmark.com








Sheyster said:


> Is +775 the most you can do on memory? That's really low, especially for a Strix.  Did you try it before installing the block, with the stock cooler?


Seems so, disappointingly. And sadly no; I wish I'd done more due diligence. Anything +800 or more and I artifact in the Port Royal stress test, I just don't get it. I do routinely run memtest_vulkan to verify settings; interestingly enough, I can pass at +1000, but again I artifact under PR stress.

In any event, I'm happy for now with these settings (attached)...


----------



## Xdrqgol

bmagnien said:


> The 5800x3d ‘maxed out’ is technically an undervolt lol (-30 all core voltage offset) paired with 3800c14 or better bdie at 1:1 IF. Unless you have a mobo that supports bclk overclocking in which case you can gain a couple % clock increase in return for overall system stability. I don’t think it beats a truly maxed 13900k paired with top binned Hynix, in anything, but in some cherry picked games (MSFS?) it can come close.
> 
> Re: a previous post about CPU bottlenecks in recent games even with a 5800x3D ( and i imagine even a fully maxed 13900kf) - I strongly agree. In Witcher 3, Callisto Protocol, and SpiderMan ray-tracing is extremely cpu heavy - thankfully dlss3 frame gen bypasses this for the most part, but with callisto being an AMD sponsored game it isn’t an option there.
> 
> In other games like MSFS, it’s not raytracing but other cpu intensive workloads. I can run everything maxed except for ‘Terrain Detail’ - anything over 200 there causes massive cpu bottleneck (gpu usage dips to 40%), and not even frame gen can overcome this in my experience on a 5800x3d.
> 
> in summary - with the release of the 4090 and anything beyond, the notion of the cpu being unimportant at high res gaming has definitively been buried. There isn’t a cpu on the market that doesn’t become a bottleneck under the right conditions with this card, even at 4K. A blessing and a curse 😆


Thanks for the sum-up!

Well, looking at this info - but also in general across all reviews - 5800X3D vs 13900K:

The 5800X3D was released during the early stages of X3D development; I think everyone knows that, and if you don't - yes, it was not a finished product.

If a last-generation AM4 part (not fully developed) can catch and at times beat a fully maxed-out 13900K, I think the following scenario becomes pretty obvious:

The 7000X3D, with the generational IPC improvement plus the X3D process being more mature, might leave Intel a bit behind in gaming, at least in the scenarios where the 5800X3D already catches or beats Intel. Also, the 5800X3D has shown a better memory controller compared to the rest of the 5000 series, so the hope here is that the 7000X3D will show an improvement too. 

*The 13900K/S will continue to be the one to go for in benchmarks, so nothing to see here.


----------



## Nico67

Gking62 said:


> I scored 10 861 in Speed Way
> 
> 
> Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 11}
> 
> 
> 
> 
> www.3dmark.com
> 
> 
> 
> 
> 
> 
> seems so, disappointingly is, sadly no, wished I'd have done more DD so anything +800 or more I artifact on PR stress test, I just don't get it. I do routinely run memtest_vulcan to verify settings, interestingly enough I can pass with +1000 but again, I artifact under PR stress.
> 
> In any event, I'm happy for now with these settings (attached)...


My Strix was not much better; I can reliably get only +1000 in games. Memtest_Vulkan passes at +1150 and gets errors at +1200, but I see artifacts at anything over +1000. On air it was much the same, so I don't think cold memory made much difference at offsets this low.
It does seem like either some cards got average memory chips (maybe one that can't do more than +1000), or some cards got remarked 24 Gbps chips due to supply issues?


----------



## GAN77

Nico67 said:


> or some cards got remarked 24gbs chips, due to supply issues?











[Official] NVIDIA RTX 4090 Owner's Club


it's the 5800X3D limiting your frames in that game. AMD CPUs don't do raytracing as well as intel does - if you had a 13700k+ you would see gains. If you check your GPU usage it likely dropped below 99% during your bench. My card hits a hardcap of around 80 fps in the middle of the city with...




www.overclock.net


----------



## Nico67

Xdrqgol said:


> Thanks for sum-up!
> 
> well looking at this info - but also in general on all reviews- 5800X3D vs 13900K
> 
> 5800X3D - was released during early stages of development, i think everyone knows that and if you don’t know - yes - it was not a finished product.
> 
> If last generation AM4 ( not fully developed) can catch and at times win over a fully 13900k maxed out - I think it becomes pretty obvious the following scenario:
> 
> 7000X3D with the % generational IPC improvement + the X3D processed being more mature might leave intel a bit behind in gaming and in the scenario where 5800X3D catches or wins over intel! Also, 5800x3D has shown better memory controller compared to 5000 series, so the hope here is that the 7000X3D will show an improvement also.
> 
> *13900K/S will continue to be the one to go for benchmarks so nothing to see here.


Same reasons I am waiting for 7800x3d  It should easily win in some games and be very competitive in others, but will lose in some as well, not as much as the non x3d.
Whether the 7950x3d would be worth it for gaming is still questionable, although it should have 200meg cache and a few 100 more mhz clock, games still have issues with that many cores and inter ccd latency. Shutting down one ccd will lose 100meg cache, but still have the higher boost. Process Lasso it to 3-4 cores/threads per ccd again would have issues with the second ccd running slower.
I think AMD should release a 7700x3d and a gamer 7800x3d that has binned ccds like the good ones on 7950... Pretty sure most people would opt to pay a bit more for a great clocking 8 core 3d chip, when they don't need the extra cores.


----------



## Nico67

GAN77 said:


> [Official] NVIDIA RTX 4090 Owner's Club
> 
> 
> it's the 5800X3D limiting your frames in that game. AMD CPUs don't do raytracing as well as intel does - if you had a 13700k+ you would see gains. If you check your GPU usage it likely dropped below 99% during your bench. My card hits a hardcap of around 80 fps in the middle of the city with...
> 
> 
> 
> 
> www.overclock.net


Yep, your post was what I based that on 

What interests me is how that would happen, as I am pretty sure Nvidia sells the AIBs the GPU and memory in packages, so would it just be random which packages got the "better" chips? You would think that if it were a known thing, they would use those packages for higher-spec cards, so all Strix cards would be good, but that doesn't seem to be the case.


----------



## J7SC

Some posts here suggesting that 4090 clocks don't matter, only VRAM 🤣? I don't think so...

What is certainly true is that the N4-node, 76.3-billion-transistor 4090 has a massively hungry core that needs to be fed via the big onboard cache and plenty of capable VRAM, meaning +1500 (24 Gbps). That said, core clocks most certainly do matter as well.

Below are some runs for Cyberpunk 2077, starting with base settings (stock clocks, PL, voltage) and then working my way up. The NV Control Panel was set to 'optimal power' rather than 'prefer max performance' and textures to quality to keep things comparable between runs (Banzai runs at over 90 fps average, but with max power). Max wattage was in the 530-590 W range, and the Cyberpunk 2077 settings were the same (4K, RTX Ultra, DLSS on Quality). Ambient temp was ~25 C.










...the last two w/ VRAM 1500 with increasing core speed (last one with core at 3150):









I don't know where the max core speed is yet for Cyberpunk 2077 (i.e. with the V/F curve not used above, I have shown _effective_ core clocks at well over 32xx).

As mentioned before, I like to compete mostly against my own previous best results. What's more, the 3DM HoF (and other benchmark tables) is, sadly, pretty polluted by now. FYI, I mark my results in screenshots re. artifacts, so while some of those below are not submitted yet for other reasons, I can assure you that the Speed Way score of 11931 had some 'pretty interesting visuals', disco lights and all 












coelacanth said:


> -250 C, is that liquid helium rather then LN2?


...yes, liquid Helium 🥶


----------



## yzonker

J7SC said:


> Some posts here suggesting that 4090 clocks don't matter, only VRAM 🤣? I don't think so...
> 
> What is certainly true is that the N4 node 76.3 billion transistor 4090s have a massively hungry core that needs to be fed via big onboard cache and plenty of capable VRAM, meaning +1500 (24 Gbps). That said, core clocks most certainly do matter, also.
> 
> Below are some runs for Cyberpunk 2077, starting out with base settings (stock clocks, PL, voltage) and then working my way up. NV Panel settings were on 'optimal power' rather than 'prefer max performance' and on textures set to quality to keep things comparable between runs (Banzai runs at over 90 fps average, but with max power). Max Watt were in the 530W - 590 W range and CyberPunk 2077 settings were the same (4K RTX Ultra, DLSS on Quality). Ambient temp was ~ 25 C.
> 
> View attachment 2591440
> 
> 
> ...the last two w/ VRAM 1500 with increasing core speed (last one with core at 3150):
> View attachment 2591441
> 
> 
> I don't know where max core speed is yet for Cyberpunk 2077 (ie. with VFcurve not used above, I have shown _effective_ core clocks at well over 32xx).
> 
> As mentioned before, I like to just compete mostly against my own previous best results. What is more 3DM HoF (and other benchmark tables) are, sadly, pretty polluted by now. FYI, I mark my results in screenshots re. artifacts, so while some of those below are not subbed yet for other reasons, I can assure you that the Speedway score of 11931 had some 'pretty interesting visuals', disco lights and all
> View attachment 2591442
> 
> 
> 
> 
> 
> ...yes, liquid Helium 🥶


I almost pointed this out in my previous post, that the gains will be a little different for each card depending on the relative strengths of the core/mem. If I had a bit more core clock like you do, they would end up being very close to equal I would say, at least in CP. 

I ran my same test on CP2077 to see how it compared and got a similar result to my GotG test. 

Although I don't entirely understand the results, as you are showing a much larger increase in fps from just increasing the core. I only saw an increase from 85 fps to 87 fps by going from default (~2800) to 3090 on the core with +1700 mem.

Oh and I also forgot to mention before that everything was run on the Galax bios.

Default,










Core: 3090, mem default










Core: default, mem +1700









Core: 3090, mem: +1700
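To put those figures in perspective, the core bump works out to roughly a 2.4% gain — a trivial calculation, shown here with the 85 and 87 fps numbers from this post (the helper name is ours):

```python
# Percentage fps gain of an OC result over a baseline, using the
# 85 -> 87 fps figures quoted in the post above.
def pct_gain(baseline_fps: float, oc_fps: float) -> float:
    return 100.0 * (oc_fps - baseline_fps) / baseline_fps

print(f"core OC on top of +1700 mem: {pct_gain(85, 87):.1f}% faster")  # ~2.4%
```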


----------



## Gking62

J7SC said:


> Some posts here suggesting that 4090 clocks don't matter, only VRAM 🤣? I don't think so...
> 
> What is certainly true is that the N4 node 76.3 billion transistor 4090s have a massively hungry core that needs to be fed via big onboard cache and plenty of capable VRAM, meaning +1500 (24 Gbps). That said, core clocks most certainly do matter, also.
> 
> Below are some runs for Cyberpunk 2077, starting out with base settings (stock clocks, PL, voltage) and then working my way up. NV Panel settings were on 'optimal power' rather than 'prefer max performance' and on textures set to quality to keep things comparable between runs (Banzai runs at over 90 fps average, but with max power). Max Watt were in the 530W - 590 W range and CyberPunk 2077 settings were the same (4K RTX Ultra, DLSS on Quality). Ambient temp was ~ 25 C.
> 
> View attachment 2591440
> 
> 
> ...the last two w/ VRAM 1500 with increasing core speed (last one with core at 3150):
> View attachment 2591441
> 
> 
> I don't know where max core speed is yet for Cyberpunk 2077 (ie. with VFcurve not used above, I have shown _effective_ core clocks at well over 32xx).
> 
> As mentioned before, I like to just compete mostly against my own previous best results. What is more 3DM HoF (and other benchmark tables) are, sadly, pretty polluted by now. FYI, I mark my results in screenshots re. artifacts, so while some of those below are not subbed yet for other reasons, I can assure you that the Speedway score of 11931 had some 'pretty interesting visuals', disco lights and all
> View attachment 2591442
> 
> 
> 
> 
> 
> ...yes, liquid Helium 🥶


Here's mine; max frames vary per run, as high as 176, all settings maxed (Psycho etc.). BTW, this is with HT off; on doesn't appear to affect it either way. Also, are you all running with +100 core voltage? Can we edit to allow for higher?...


----------



## J7SC

yzonker said:


> I almost pointed this out in my previous post, that the gains will be a little different for each card depending on the relative strengths of the core/mem. If I had a bit more core clock like you do, they would end up being very close to equal I would say, at least in CP.
> 
> I ran my same test on CP2077 to see how it compared and got a similar result to my GotG test.
> 
> Although I don't understand the results entirely as you are showing a much larger increase in fps from just increasing the core. I only saw an increase from 85fps to 87 fps by going from default (~2800) to 3090 on the core with +1700 mem.
> 
> Oh and I also forgot to mention before that everything was run on the Galax bios.
> 
> Default,
> 
> View attachment 2591456
> 
> 
> Core: 3090, mem default
> 
> View attachment 2591457
> 
> 
> Core: default, mem +1700
> View attachment 2591458
> 
> 
> Core: 3090, mem: +1700
> 
> View attachment 2591459


Nice! With the stock G-G-OC vBIOS OC'ed, tops was in the 85-86 fps range, compared to the Galax delivering 89+ fps. At bone stock (no OC for either, sliders at +/-0, shown above), the Galax vBIOS was about 10 fps higher than the stock G-G-OC, by far the biggest gain -- most likely also because its 'stock PL' is much higher, as is its 'base clock', and the VRAM runs 2 C warmer with the Galax. I'll do more runs soon with higher VRAM OC than I already used. Bottom line though: just loading the Galax for gaming is probably enough to go from Jekyll to Hyde mode 


----------



## yzonker

J7SC said:


> Nice ! With the stock G-G-OC vbios oc'ed, tops was in the 85 - 86 fps range compared to the Galax delivering 89+ fps. At bone stock (no oc for either, with sliders at +-0, shown above), the Galax vbios was about 10 fps higher than the stock G-G-OC, by far the biggest gain -- most likely also as its 'stock PL' is much higher with the Galax, as is its 'base clock', and VRAM runs 2 C warmer with the Galax. I'll do more runs soon to add higher VRAM oc than I already used. Bottom line though is: Just loading the Galax for gaming is probably enough to go from Jekyll to Hide mode


Well, I thought that's what you were showing in the first screenshot (Giga vs Galax). Now that I think about it, I'm pretty sure I was in the 70s with the stock BIOS back when I first got the card. I would have assumed I had run different settings, but maybe not. Lol. 

Galax = the awesome


----------



## Alemancio

*It looks like I had a dud. *

The Gigabyte Gaming OC would barely do 3015 MHz @ 1.1 V, and the RAM would crash even at +600 MHz. Sold it and got an Asus TUF, and it's doing 3075 MHz @ 1.1 V and +1200 MHz. I was hoping to keep the Gigabyte, but it's worse in every aspect.


----------



## yzonker

Alemancio said:


> *It looks like I had a dud. *
> 
> Gigabyte Gaming OC would barely do 3015mhz @ 1.1V and Ram would crash even at +600 Mhz. Sold it and got an Asus TUF and its doing 3075MHz @ 1.1V and +1,200MHz. I was hoping to keep the gigabyte but its worse in every aspect.


Just shows it's all about the lottery. Doesn't matter which card you buy, just a matter of luck...


----------



## Alemancio

yzonker said:


> Just shows it's all about the lottery. Doesn't matter which card you buy, just a matter of luck...


I think this is really true for this gen. There's no point in going for premium models like the Strix or Suprim unless you do competitive OC and need the 1000+ amps for the core.


----------



## Miao

leonman44 said:


> True answer , i have invested way too much too and god i hate the hard tubing watercooling , its very hard to change components anymore or even just paste my cpu and the distro plate on the od11 xl is a nightmare i cant drain my loop properly its almost impossible. I paid a lot for them so i continue to use them but they are not practical if i was aircooled i would have installed the card already in a few minutes or with soft tubing and a regular reservoir just within a day. Now i need a whole week to rework the whole system.
> 
> But without watercooling would we be real overclockers? i guess no , theres no way to blast a lot of voltage on the current components without a good custom loop.


or you can make an ugly mix like mine...
















Some fittings are temporary, but not the quick-release ones... they may be ugly, but they are a whole other life. In fact, I like them so much that I will probably end up eliminating even those few pieces of acrylic tube left, in favor of more quick releases.

PS: I'm just starting to play with my FE; I'll probably flash the Galax BIOS soon, but meanwhile, does anyone know what a decent GPU clock is on the stock BIOS?
ATM I'm stuck at 3060 MHz @ 1.095 V (50 mV less just to avoid the voltage limit...) and that doesn't seem like a great result to me.


----------



## Sheyster

yzonker said:


> Just shows it's all about the lottery. Doesn't matter which card you buy, just a matter of luck...


Indeed, I was thinking my GB-G-OC would be temporary until I could find a Strix at MSRP. I've pretty much just decided to keep it.


----------



## Sheyster

Miao said:


> ps: i'm just starting to play with my FE, probably i'll flash the galax bios soon, but meanwhile someone knows what is a decent gpu clock with the stock bios?


Unfortunately you can't flash the FE cards with third party vBIOS.

I would say a card that can do 3100+ is a decent card.


----------



## J7SC

Sheyster said:


> Indeed, I was thinking my GB-G-OC would be temporary until I could find a Strix at MSRP. I've pretty much just decided to keep it.


...same here - just got the cheapest RTX 4090 I could lay my hands on locally (US$ 1,619) and the cheapest water-block / backplate (US$122) as I wanted to improve upon my FS2020 w/DLSS3/FI/NV_Reflex. The fact that it turned out to be a great sample is icing on the cake. 

Good thing that I finished the final touches on that dual setup...









...because I just had Amazon deliver some fittings an hour ago so that (as mentioned earlier) I can graft the 3090 Strix into the old / Dec '18 Threadripper build to make it a testbed for some 8K and 16K rendering and effects a friend of mine wants to test out for his business.

The 3x NV GPUs (2x 2080 Ti, 1x 3090) will have a combined 19,200 CUDAs, 872 TMUs, 200 ROPs, 872 tensor cores, 150 ray tracing units and 46 GB VRAM. I still think that the single 4090 is going to beat that with its 16,384 CUDAs, 512 TMUs, 176 ROPs, 512 Tensor cores, 128 ray tracing units and 24 GB VRAM (never mind its additional hard-wired codecs).
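The combined CUDA count there is easy to sanity-check; a quick sketch using the standard spec-sheet core counts for those three GPUs (the dictionary itself is ours):

```python
# Cross-checking the combined shader count quoted above: two 2080 Tis plus
# one 3090 versus a single 4090. Values are the public spec-sheet CUDA counts.
CUDA = {"RTX 2080 Ti": 4352, "RTX 3090": 10496, "RTX 4090": 16384}

triple = 2 * CUDA["RTX 2080 Ti"] + CUDA["RTX 3090"]
print(triple)            # 19200 combined CUDA cores in the triple-GPU testbed
print(CUDA["RTX 4090"])  # 16384 in the single 4090
```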


----------



## Snoopy69

Nizzen said:


> Do you have Asus strix or Tuf with Elmor Evc tool? If not, you don't need it


Yes, I have.


----------



## Brads3cents

Miao said:


> or you can make an ugly mix like mine...
> View attachment 2591468
> 
> View attachment 2591469
> 
> some fittings are temporary, but the quick release ones instead... they will be ugly, but they are a whole other life, in fact I think so much that I will end up eliminating even those few pieces of acrylic tubes left in favor of other quick releases.
> 
> ps: i'm just starting to play with my FE, probably i'll flash the galax bios soon, but meanwhile someone knows what is a decent gpu clock with the stock bios?
> atm i'm stuck @3060MHz 1,095v (50mv less just to avoid the vlimit...) and not seem a great result to me.


You can flash the galax bios to the FE?
News to me…


----------



## Blameless

Since I need to wait for a 1.8 V converter for my flash programmer before my intended system can be made operational again, I've been experimenting with a decade-old system, mostly using FurMark and memtest_vulkan, plus a few games at absurdly high settings via DSR to bypass CPU/platform limitations.

One thing I have noticed is that I am not seeing any hint of the special properties oft reported for the GALAX 666w BIOS, other than power limit, vs the stock (F2) firmware of my Gigabyte Windforce. Memory OCing potential and performance are identical and even the effective clocks are the same with the same V/F curve dialed in, power limits permitting.

Anyway, I'm rather impressed with this Windforce sample so far. Memory seems stable with a +1620MHz offset (I like using multiples of the 27MHz internal reference clock, which translates into 108MHz for the I/O clock that MSI AB reports) all the way to 90C or so. I can also get the memory VRM to consume 125w in Memtest_Vulkan, so I'm glad I puttied that single bare VRM phase that Gigabyte neglected. Core is a bit harder to gauge on this ancient system, but I expect to be able to hold 2800MHz effective clock, or a bit higher, at 1000mV, without hitting power limits I feel comfortable with. I'm sure I could do higher in most things (a few quick tests at ~3GHz were promising), but I'm wary of using more than 550-600w on this VRM, especially when I have the card powered by a ten year old Seasonic X-750 (which is still a solid unit, but probably not something I should be running too much FurMark with a maxed out power limit on).
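The multiple-of-27 bookkeeping described above can be sketched as a tiny helper (the function name and nearest-multiple rounding choice are ours; the 27 MHz reference clock is from this post):

```python
# Snap a memory-offset slider value to the nearest multiple of the 27 MHz
# internal reference clock, per the offset convention described above.
REF_MHZ = 27

def snap_offset(offset_mhz: float, ref: int = REF_MHZ) -> int:
    """Round an offset to the nearest multiple of the reference clock."""
    return ref * round(offset_mhz / ref)

print(snap_offset(1620))  # 1620 (already 60 x 27)
print(snap_offset(1650))  # 1647 (61 x 27)
```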

One issue I am having with the Galax BIOS is the default fan curve...it allows for ~300 rpm more at the top end vs the stock firmware, but it ramps up too slowly and allows VRAM heavy tests to run at 30% fan speed, which leads to overly hot memory. It also lacks the zero rpm feature, which, ironically enough, makes the memory a bit too cold if the system has been idle for a while. This is easy enough to fix in software, but I'll probably try firmware from the Gaming OC to see if it has a more suitable fan curve (I am entirely content with 'only' 600w power limit).



Gking62 said:


> so using nvflash for the first time, yes I said that, is this the proper syntax (Galax vbios)? I bugged out before flashing to check here first, thanks to all.
> 
> C:\nvflash_5.792>nvflash64 --protectoff


I never needed the --protectoff switch, just the override subsystem one.


----------



## Shoggoth

J7SC said:


> Below are some runs for Cyberpunk 2077, starting out with base settings (stock clocks, PL, voltage) and then working my way up. NV Panel settings were on 'optimal power' rather than 'prefer max performance' and on textures set to quality to keep things comparable between runs (Banzai runs at over 90 fps average, but with max power). Max Watt were in the 530W - 590 W range and CyberPunk 2077 settings were the same (4K RTX Ultra, DLSS on Quality). Ambient temp was ~ 25 C.





yzonker said:


> I almost pointed this out in my previous post, that the gains will be a little different for each card depending on the relative strengths of the core/mem. If I had a bit more core clock like you do, they would end up being very close to equal I would say, at least in CP.


I hope I don't come across as attempting to gloat, I'm just genuinely puzzled here....

I ran the CP2077 bench yesterday or the day before and got 102 avg, 62 min, and 126.5 max with everything RT maxed out and DLSS Quality. That's with a Gainward Phantom GS on water, absolutely stock, only with the power limit maxed out to its paltry +11%. That doesn't seem right compared to your tweaked results....


----------



## J7SC

Shoggoth said:


> I hope I don't come across as attempting to gloat, I'm just genuinely puzzled here....
> 
> I ran the CP2077 bench yesterday or the day before and got 102 avg, 62 min. and 126.5 max with everything RT maxed out and DLSS Quality. That's with a Gainward Phantom GS on water, absolutely stock, only with power limit maxed out to it's paltry +11% limit. That doesn't seem right compared to your tweaked results....


Hey, all the power to you, just not sure if it's apples to apples, or apples to mandarins... same settings (i.e. 4K at actual size, full refresh) as we showed in our bench screenies? Got bench result screenies of your runs?


----------



## leonman44

Guys, what water temp sensors do you have in your loops? I have 2 EK sensors, and from a cold boot they both report 3-4C more than the ambient temp; that can't be right.


----------



## Streetsport

I had some issues with my Strix OC that I feel I should share here. The card would only hit 15,000 in Port Royal. After changing the BIOS on the card and motherboard, reinstalling Windows and drivers, and trying every Windows setting I could think of, I noticed in GPU-Z that my PCIe speed was x1 Gen 3. Went into the BIOS and it was greyed out and couldn't be adjusted. It ended up being that the card wasn't making full contact with all pins in the PCIe slot, probably due to the weight of the card and the sloppiness of the PCIe slot and latch. I ended up using a riser cable to test the card and hit around 28,000 in Port Royal. Went back to GPU-Z and saw that the speed was only x8 Gen 4 though. Turns out that on Asus Z690 and Z790 boards, if you install an M.2 drive in the top Gen5 slot, it drops the top PCIe slot to x8. This won't matter once Gen5 M.2 drives and GPUs are released, because those run at full speed at x8 Gen5. Moved the M.2 and now I'm over 30,000 in Port Royal with +150 core and +800 mem.

System specs:
13900K / Strix Z690-E / Strix 4090 OC / 128GB Dominator 5200 / HX1500i


----------



## jootn2kx

I'm trying to replicate the issue of VRAM "instability" when it's running cold above +1500MHz.
I don't think it's limited to waterblocks, because I'm having a similar issue and I don't use a waterblock.
I experience random black screen issues once in a while when the memory is running cool, so it seems.
I just had a black screen a couple of minutes ago when the memory was idling at +1900 (in an intro loading screen); the memory had not heated up yet.
The last time I had a similar instability/crash I was just browsing the internet, more than a week ago.

The funny thing is I have been hammering the card for 20 hours with benchmarks and games this week at +1900 memory without any artifacts, instability, or crashes.
It seems to only happen above +1500 memory in Afterburner, when the memory is running cool, and very randomly.

I'm unsure if it's the same issue people here are reporting when using waterblocks; I think it's some kind of bug.
Overclocking on the 4000 series feels different than previous gens lol.

Just my 2 cents


----------



## lawson67

Alemancio said:


> I think this is really true for this gen. There's no point to go to premium models like Strix or Suprim unless you do competitive OC and need the 1000+Amps for core.


You don't even need to buy a Strix or Suprim to do 1000+ Amps; a Zotac AMP Extreme should be able to cope with 1320 Amps (24 phases, 55A x 24), yet some consider it a budget card and waste their money buying a Strix etc. without understanding that, unless you're buying a card with a guaranteed chip bin like an Ultimate edition, it's all luck. The Strix in the UK right now is £2,400 and my Zotac is £1,900; the memory on my Zotac does not error until I pass +1950MHz, the core is happy gaming all day long at 3120MHz, and no, my fans are not noisy and I have zero coil whine.


----------



## Miao

Sheyster said:


> Unfortunately you can't flash the FE cards with third party vBIOS.
> 
> I would say a card that can do 3100+ is a decent card.


noooooooooooooooooooooooooooooooooooo!!!!!


Brads3cents said:


> You can flash the galax bios to the FE?
> News to me…


it's my first FE, I didn't remember (or I didn't know...) that I can't!!!

f*ck!

bah, I still love her anyway

and I paid €700 less for her than for a Strix (that was my first choice, the FE was the second)

sure, if it didn't even have a 600w bios it would have been a much worse problem; in this case I'll try to get over it :-|


btw, why isn't it possible to flash the FE???


----------



## Brads3cents

Because Nvidia doesn't want people messing with their cards in that manner.

You know what though? The FE is such a beast. From what I've been seeing, FE cards often have some of the top bins. I've posted multiple top-20 3DMark scores with the stock air cooler and stock BIOS.


----------



## alasdairvfr

J7SC said:


> Some posts here suggesting that 4090 clocks don't matter, only VRAM 🤣? I don't think so...
> 
> What is certainly true is that the N4 node 76.3 billion transistor 4090s have a massively hungry core that needs to be fed via big onboard cache and plenty of capable VRAM, meaning +1500 (24 Gbps). That said, core clocks most certainly do matter, also.
> 
> Below are some runs for Cyberpunk 2077, starting out with base settings (stock clocks, PL, voltage) and then working my way up. NV Panel settings were on 'optimal power' rather than 'prefer max performance' and on textures set to quality to keep things comparable between runs (Banzai runs at over 90 fps average, but with max power). Max Watt were in the 530W - 590 W range and CyberPunk 2077 settings were the same (4K RTX Ultra, DLSS on Quality). Ambient temp was ~ 25 C.
> 
> View attachment 2591440
> 
> 
> ...the last two w/ VRAM 1500 with increasing core speed (last one with core at 3150):
> View attachment 2591441
> 
> 
> I don't know where max core speed is yet for Cyberpunk 2077 (ie. with VFcurve not used above, I have shown _effective_ core clocks at well over 32xx).
> 
> As mentioned before, I like to just compete mostly against my own previous best results. What is more 3DM HoF (and other benchmark tables) are, sadly, pretty polluted by now. FYI, I mark my results in screenshots re. artifacts, so while some of those below are not subbed yet for other reasons, I can assure you that the Speedway score of 11931 had some 'pretty interesting visuals', disco lights and all
> View attachment 2591442
> 
> 
> 
> 
> 
> ...yes, liquid Helium 🥶





yzonker said:


> I almost pointed this out in my previous post, that the gains will be a little different for each card depending on the relative strengths of the core/mem. If I had a bit more core clock like you do, they would end up being very close to equal I would say, at least in CP.
> 
> I ran my same test on CP2077 to see how it compared and got a similar result to my GotG test.
> 
> Although I don't understand the results entirely as you are showing a much larger increase in fps from just increasing the core. I only saw an increase from 85fps to 87 fps by going from default (~2800) to 3090 on the core with +1700 mem.
> 
> Oh and I also forgot to mention before that everything was run on the Galax bios.
> 
> Default,
> 
> View attachment 2591456
> 
> 
> Core: 3090, mem default
> 
> View attachment 2591457
> 
> 
> Core: default, mem +1700
> View attachment 2591458
> 
> 
> Core: 3090, mem: +1700
> 
> View attachment 2591459



What actual in-game settings are you using? The results screen doesn't tell a great story, plus CP2077 is notorious for not applying settings without a full game restart.

Here's me using the Galax BIOS, +1500 memory, and running at 3045 on the core. I find the effective clock stretching is still bad in this game (despite being much better with this BIOS), so bumping the core is probably a waste of time.


----------



## alasdairvfr

Did some benching on the Galax BIOS and achieved some new best scores. Not all benches saw improvement, but these did:




Spoiler



NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASRock X570 Taichi (3dmark.com)










NVIDIA GeForce RTX 4090 video card benchmark result - AMD Ryzen 9 5950X,ASRock X570 Taichi (3dmark.com)


----------



## X909

FYI, I had 4 cards in the system and none of them could really do >3060 MHz @ 1.1V without pixel failures. Two cards were not able to hold 3 GHz in Heaven 4K at 1.1V without pixel popping.
Just to keep expectations realistic... for sure, benching is possible at 3.1 GHz...


----------



## yzonker

X909 said:


> FYI, I had 4 cards in the system and non of them could really do >3060 Mhz @ 1.1V without pixel failures. Two cards were not able to hold 3 Ghz in Heaven 4k without pixel popping at 1.1V.
> Just to keep expectations realistic... for sure, benching is possible with 3.1 Ghz...


Yeah, this forum tends to bring the best bins of everything to the surface, so people can really start feeling like they got a dud when in reality what they have is pretty good overall. Just not the best.


----------



## KingEngineRevUp

yzonker said:


> Yea this forum tends to bring the best bins of everything to the surface, so people can really start feeling like they got a dud when in reality what they have is pretty good overall. Just not the best.


Yeah, and on top of that, what's actually 100% stable? Certainly not everyone's highest benchmark scores. I'm sure everyone has to dial it back a little, to the point where performance is within 1-2% of one another. It's not like there's a 5-10% gap.


----------



## Andischocke

Hey guys and girls,
got myself an MSI 4090 Trio. Wanted to check how it performs and ran Time Spy on the 3DMark demo version.
Now I am a bit confused as to why there are two peaks. Can someone explain why there are two "normal distributions", and how my GPU performed in general?
Everything is at stock settings, except EXPO for the RAM. I would be grateful for any input.


----------



## J7SC

X909 said:


> FYI, I had 4 cards in the system and non of them could really do >3060 Mhz @ 1.1V without pixel failures. Two cards were not able to hold 3 Ghz in Heaven 4k without pixel popping at 1.1V.
> Just to keep expectations realistic... for sure, benching is possible with 3.1 Ghz...





yzonker said:


> Yea this forum tends to bring the best bins of everything to the surface, so people can really start feeling like they got a dud when in reality what they have is pretty good overall. Just not the best.


...I remember, in the early days of this forum, even before the 4090s were released, there were 'rumours' of some of these cards breaking the 3 GHz barrier - it was met with some (at the time) healthy skepticism. Now folks seem to be depressed when they 'only' get ~3050 MHz.

A forum like this is almost by definition competitive when it comes to posted bench results. That brings with it some online behavior I could do without...but at the same time, it also brings out lots of good tips (i.e. the initial Galax vbios posts; the tips on VRAM temps).

Anyhow, there is a natural, if narrow, spread of peak performance in something as complicated as an N4 chip with 76.3 billion transistors - as long as it meets and usually exceeds the performance numbers published by the card vendor, everything else is gravy. This is nothing new. When doing Quad-SLI in the distant past, I would end up with four cards with ASIC quality ratings ranging from 69% to 93%, all purchased on the same day from the same store (some with sequential serial numbers)... At the end of the day, it is nice to have a card that performs 'above average', but you probably learn more about OCing with the naturally slower cards.


----------



## Krzych04650

KingEngineRevUp said:


> Yeah on top of that, what's actually 100% stable? Certainly not at everyone's highest benchmark scores. I'm sure everyone has to dial it back a little too the point performances are within 1-2% of one another. Not like there's a 5-10% gap.


Things like Port Royal are quite crashy; it is not like you can run massively higher clocks there than in games. There are some games that will crash on Port Royal-stable settings, but that's a ~30 MHz difference. A much bigger problem is that scores in those benchmarks are manipulated with various tricks, so you cannot really compare against them if you just run the benchmarks at defaults.

What makes the biggest difference is VRAM. My card can only do ~+1200 stable (it could do +1350 on air); I run +1100 because +1200 is borderline and can crash if the water is too cold, and I consistently see users getting ~3% higher scores, even in games, if their cards can do +1500 or more.

So the difference between a weak 3000/+1000 bin and a good 3100+/+1500 bin is going to be around 5% at the high end. If you have a middle-of-the-pack 3060/+1200 card like mine, then it is 3%. Basically the difference between 29K Port Royal and almost 30K Port Royal.

Not that big a deal, but the card should have had 24 Gbps, or at least 23 Gbps, memory stock.


----------



## Gking62

KingEngineRevUp said:


> Yeah on top of that, what's actually 100% stable? Certainly not at everyone's highest benchmark scores. I'm sure everyone has to dial it back a little too the point performances are within 1-2% of one another. Not like there's a 5-10% gap.


precisely, well said.


----------



## Blameless

+1620MHz on the memory on my Windforce sample is stable across the entire range of memory junction temps from 30C to 90C. +1728 is stable up to ~88C, but spits out errors immediately much below 40C or so. I'll probably leave it at +1620MHz so I know the card won't be too cold for stability, even if it's been freshly booted.

Anyway, does anyone have a Gigabyte RTX 4090 Gaming (non-OC) BIOS? I can't find dumps for this anywhere and I'd like to have the Gaming OC fan curve without the built-in clock offset. I can't extract the rom from the update listed on Gigabyte's site and the utility doesn't want to flash to a part with a different ID.


----------



## GforceGTXer

Just received my 4090 FE and wondering if I got a dud. I can OC to 3045 core and 11900 mem with all sliders maxed for benchmarks, and for game stability I am at 3000 - 3015 core and 11800 mem.


----------



## StreaMRoLLeR

X909 said:


> FYI, I had 4 cards in the system and non of them could really do >3060 Mhz @ 1.1V without pixel failures. Two cards were not able to hold 3 Ghz in Heaven 4k without pixel popping at 1.1V.
> Just to keep expectations realistic... for sure, benching is possible with 3.1 Ghz...


My dude, please don't be sad about not doing 3060MHz etc. There is practically not even a 0.1 fps difference between 3015MHz and 3060MHz.

If your chip can do 3000MHz stable and +1500 on mem, you are set. Your card is good. Those 3100MHz guys represent 0.01% of the community, so don't worry about it. My chip can do 3030 in PR, 3075 in Time Spy, and 3015 in games, for example.
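
The arithmetic behind that point is easy to sanity-check. Assuming fps scales at best linearly with core clock (in practice it scales worse, since memory bandwidth and the CPU also limit performance), a quick sketch:

```python
# Upper bound on the fps gain from a core clock bump, assuming fps
# scales at most linearly with clock. Real games scale sub-linearly,
# so the actual difference is even smaller than this bound.
def max_fps_gain(base_mhz: float, oc_mhz: float) -> float:
    return (oc_mhz - base_mhz) / base_mhz

print(f"{max_fps_gain(3015, 3060):.1%}")  # 1.5% at best
```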


----------



## Sheyster

Streetsport said:


> system specs are
> 13900k/strix z690-e/strix 4090 oc/128gb dominator 5200/hx1500i


This M.2_1/x8 PCIe issue is known on the Strix Z790-E and Maximus Extreme boards, due to that M.2 slot being Gen5-capable and having to share bandwidth with PCIe slot #1. This is not an issue on the Strix-F, Hero, and Apex, as their M.2_1 slots are Gen4.



Miao said:


> noooooooooooooooooooooooooooooooooooo!!!!!
> 
> it's my first FE, I didn't remember (or i didn't know...) that i can't!!!


FE 4090 is a great card with a 600w BIOS, don't sweat it!



Blameless said:


> Anyway, does anyone have a Gigabyte RTX 4090 Gaming (non-OC) BIOS?


As far as I know only the GB OC version is available, which is a good thing. The ASUS TUF/TUF OC and MSI Trio/Trio X split is a scam; glad to see GB steering clear of that BS money grab.



StreaMRoLLeR said:


> If your chip can do 3000mhz stable and +1500 on MEM you are settled.


My card can do 3120 in everything, but I'm peeved about the +1400 max memory OC. That is until I see some folks posting they can't get more than +600 memory OC with a Strix, then I'm fine with it!


----------



## ttnuagmada

GforceGTXer said:


> Just received my 4090 FE and wondering if I got a dud. I can OC to 3045 core and 11900 mem with all sliders maxed for benchmarks, and for game stability I am at 3000 - 3015 core and 11800 mem.


Not too far off from my FE. I seem to be game stable at 3075/11700. This is watercooled btw.


----------



## GforceGTXer

ttnuagmada said:


> Not too far off from my FE. I seem to be game stable at 3075/11700. This is watercooled btw.


I'm not even close to that. To be game stable across every game, I need to keep the core at 3015 and below. Sucks. Vram maxes out at +1200 for gaming as well.


----------



## yzonker

GforceGTXer said:


> I'm not even close to that. To be game stable across every game, I need to keep the core at 3015 and below. Sucks. Vram maxes out at +1200 for gaming as well.


Probably depends a lot on the games tested too. If a person's game library includes Metro Exodus EE and games similar to it that are very picky on OC, then that will drive down the every game stable OC.


----------



## GforceGTXer

yzonker said:


> Probably depends a lot on the games tested too. If a person's game library includes Metro Exodus EE and games similar to it that are very picky on OC, then that will drive down the every game stable OC.


True, I can run 3030 all day in MW2 and 3045 in Need for Speed Unbound, but can't go over 3015 in Sackboy: A Big Adventure lol


----------



## J7SC

More work-related than gaming/benching, but I started to use Blender (spoiler) and also Octane (incl. RTX)... I haven't fully clocked the 4090 for Blender yet, and only ran 'default' for Octane, as I'm checking heat and peak PL first. Octane in particular lasts a very long time, and both will sniff out 'temporary peak bench settings'. Again, more useful for rendering work, but these will tell you a lot about your GPU's OC stability in general...


Spoiler














 
*...ALSO:* Best Wishes for 2023 - may the Core be cold and the VRAM warm


----------



## Sheyster

GforceGTXer said:


> I'm not even close to that. To be game stable across every game, I need to keep the core at 3015 and below. Sucks. Vram maxes out at +1200 for gaming as well.


3000/+1200 is a good gaming OC. Unless you're a competitive bencher I would not fret about it.


----------



## Panchovix

GforceGTXer said:


> I'm not even close to that. To be game stable across every game, I need to keep the core at 3015 and below. Sucks. Vram maxes out at +1200 for gaming as well.


Same here, probably 3030MHz in all games but +1100 on the VRAM, it sucks  The average VRAM overclock on the forum seems to be +1500MHz or so, and probably 3075-3090MHz on the core.

It is what it is I guess, I was kinda lucky with my past 3080 but my 4090 was not lucky at all.


----------



## bleomycin

Anyone have any experience with TG-NSP50 thermal putty? Most of the popular options listed in this thread are pretty much sold out or only available in huge, expensive quantities. I found 1 pound of this stuff for $60 on Digi-Key, which seems to be the most reasonable price I've been able to track down for a putty that isn't from AliExpress. It's rated for 5.4 W/mK, which isn't great, but I don't feel that automatically means it's crap?

Screenshot from Snarks Domain on youtube, the only source I've found for testing various thermal putties.










The EVGA putty is certainly interesting because it's readily available, but it sounds like you have to spend at least $50 + shipping to get enough of it for a single application on a 4090? My 4090 FE is getting swapped to an EKWB block soon and I want to use a putty so it's a one-and-done deal without any headaches. It seems that basically any putty that isn't complete garbage will outperform the stock (or probably the EKWB-included) thermal pads?
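
For a rough feel of what a 5.4 W/mK rating means in practice, you can estimate the conductive resistance of a pad layer with R = t/(kA). The thickness, package area, and the ~2 W/mK figure for a generic stock pad below are illustrative assumptions, not measured values:

```python
# Rough conductive thermal resistance of a pad/putty layer:
# R = thickness / (k * area). Illustrative numbers: a 1.0 mm layer
# over one ~14 x 12 mm GDDR6X package, comparing the 5.4 W/mK
# TG-NSP50 rating to an assumed ~2 W/mK generic pad.
def layer_resistance(thickness_m: float, k_w_per_mk: float, area_m2: float) -> float:
    return thickness_m / (k_w_per_mk * area_m2)

area = 0.014 * 0.012  # m^2, one memory package
for name, k in (("TG-NSP50 (5.4 W/mK)", 5.4), ("generic pad (2.0 W/mK)", 2.0)):
    print(f"{name}: {layer_resistance(0.001, k, area):.2f} K/W")
```

At a few watts of heat per package, the difference between those two resistances works out to several degrees C per module, which is why pad quality shows up so clearly in memory junction temps.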


----------



## yzonker

bleomycin said:


> Anyone have any experience with TG-NSP50 Thermal Putty? Most of the popular options listed in this thread are pretty much sold out or only available in huge expensive quantities. I found 1 pound of this stuff for $60 on digikey which seems to be the most reasonable price for a putty I've been able to track down that isn't from aliexpress. It's rated for 5.4W/mK which isn't great but I don't feel that automatically means its crap?
> 
> Screenshot from Snarks Domain on youtube, the only source I've found for testing various thermal putties.
> 
> View attachment 2591616
> 
> 
> The EVGA putty is certainly interesting because it's readily available but it sounds like you have to spend at least $50 + shipping to get enough of it for a single application on a 4090? My 4090 FE is getting swapped to an EKWB block soon and I want to use a putty so it's a one and done deal without any headaches. It seems that basically any putty that isn't complete garbage will outperform stock or probably ekwb included thermal pads?


Have you read the last few pages of this thread with all the talk of lower VRAM OC's with good pads/putty because it runs too cold? 

Bottom line, use the supplied pads or cheap putty.


----------



## Smackaroy

What techniques are people using on their 4090s in order to get 29000?

A month ago I managed a score of 28,200, which landed me in the top 100, but that score is long off the leaderboard now. http://www.3dmark.com/pr/1901708


----------



## Nizzen

Smackaroy said:


> What techniques are people doing to their 4090 in order for them to get 29000?
> 
> A month ago I managed to get a score of 28200 which Landed me in the top 100 but that score is long off the leader board now.http://www.3dmark.com/pr/1901708


Running unstable memory 🤣
Pro tip: stop benchmarking Port Royal. Too many results that are flawed.


----------



## yzonker

Smackaroy said:


> What techniques are people doing to their 4090 in order for them to get 29000?
> 
> A month ago I managed to get a score of 28200 which Landed me in the top 100 but that score is long off the leader board now.http://www.3dmark.com/pr/1901708


The bios in my sig is all that has changed recently. Page back through this thread to see the benefits.


----------



## InvictuSZN

Has anyone here flashed the 666W BIOS from Galax on the *Founders Edition*? I don't wanna risk anything since it has no BIOS switch lol


----------



## Blameless

bleomycin said:


> Anyone have any experience with TG-NSP50 Thermal Putty?


I've used TG-NSP80 and TG-PP-10, which perform very similarly in most applications. NSP50 should be similar, though less viscous, and a bit less performant. The non-silicone putties are a pain to clean up, not that the silicone ones are particularly easy.



bleomycin said:


> Screenshot from Snarks Domain on youtube, the only source I've found for testing various thermal putties.
> 
> View attachment 2591616


The use of putty with shims is odd, as putty is not intended for very small gaps, certainly not gaps small enough that the shim is the overwhelming component of total thermal resistance...probably why the GPU temp was elevated with the shims. If I'm using shims on a part, I just use normal thermal paste along with them.

On a side note, that EVGA putty looks pretty impressive...though not for a 4090.


----------



## alasdairvfr

J7SC said:


> More work related than gaming / benching, but I started to use Blender (spoiler), and also Octane (incl. RTX)...haven't clocked the 4090 fully yet for Blender and only 'default' for Octane as I check heat and peak PL first, but Octane in particular lasts a very long time, and both sniff out 'temporary peak bench settings'. Again, more useful for rendering work but those will tell you a lot about your GPU's oc stability in general...
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2591600
> 
> 
> 
> 
> *...ALSO:* Best Wishes for 2023 - may the Core be cold and the VRAM warm


Finally someone beat my score from like a month ago!

Almost caught you but ran out of time. Galax adds about 300 pts, give or take, it seems; my previous highest was about 14005 or so.


----------



## J7SC

alasdairvfr said:


> Finally someone beat my score from like a month ago!
> 
> Almost touched you but ran out of time. Galax adds about 300 pts give or take it seems as my prev highest was about 14005 or so
> 
> View attachment 2591631


...sorry about that, my first time subbing with Blender - getting a baseline with that and Octane to see what the 2x 2080 Ti + 1x 3090 Strix can do (if I figure out the multi-GPU settings in Blender and Octane)


----------



## Blameless

Back on the Galax BIOS. 600w power limit was proving a touch low for my tests and I've been able to work around the fan issues I've been having with MSI AB's firmware control mode.

It looks like I'll be able to use +1728 on the memory, because the card is up to a stable temp by the time it finishes loading the OC on Windows startup, and holding 30% fan speed until 50C ensures it cannot get too cold under any realistic scenario.

Currently I'm targeting 2850MHz core (at 1000mV) and +1728MHz memory for my likely gaming clocks...with a power limit high enough that not even PoE w/GI will trigger it.
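
The hold-then-ramp behaviour described above can be sketched as a simple curve. The 50C knee and the linear ramp to 90C below are my illustrative assumptions, not the actual firmware values:

```python
# Sketch of a fan curve with a cold-memory floor: hold 30% until the
# memory reaches ~50C so GDDR6X never idles too cold, then ramp
# linearly to 100% at 90C. Thresholds are assumptions for
# illustration, not a dump of any vendor's firmware curve.
def fan_pct(mem_temp_c: float, floor: float = 30,
            ramp_start: float = 50, full_speed: float = 90) -> float:
    if mem_temp_c <= ramp_start:
        return floor
    if mem_temp_c >= full_speed:
        return 100
    span = full_speed - ramp_start
    return floor + (100 - floor) * (mem_temp_c - ramp_start) / span

for t in (25, 50, 70, 95):
    print(t, round(fan_pct(t)))
```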


----------



## motivman

yzonker said:


> Have you read the last few pages of this thread with all the talk of lower VRAM OC's with good pads/putty because it runs too cold?
> 
> Bottom line, use the supplied pads or cheap putty.


Going from Arctic TP-3 thermal pads on my RAM to the stock pads meant I could now run +1700 stable on water with temps as low as 10C (I am on a chiller). Before adding my chiller, with the TP-3 pads, I couldn't even run +1500... go figure. Use the worst pads possible for your VRAM this gen.


----------



## KedarWolf

alasdairvfr said:


> Finally someone beat my score from like a month ago!
> 
> Almost touched you but ran out of time. Galax adds about 300 pts give or take it seems as my prev highest was about 14005 or so
> 
> View attachment 2591631








Blender - Open Data Benchmark (opendata.blender.org)

14843


----------



## alasdairvfr

J7SC said:


> ...sorry about that, my first time subbing with Blender - getting a baseline with that and Octane to see what the 2x 2080 Ti + 1x 3090 Strix can do (if I figure out the multi-GPU settings in Blender and Octane)


Never apologize for besting my score! Benching is one of the great things about having high end kit!

I ran another couple, I think this version is higher than my previous high score










This is on version 3.3.0, my score is the 14006 one (dozens of pages back on this thread):









Looks like more people woke up to Blender recently since 3.4.0:


----------



## alasdairvfr

KedarWolf said:


> Blender - Open Data Benchmark
> 
> 
> Blender Open Data is a platform to collect, display and query the results of hardware and software performance tests - provided by the public.
> 
> 
> 
> 
> opendata.blender.org
> 
> 
> 
> 
> 
> 14843


Damn and I was just looking at your #1!

Amazing, what were your GPU settings? I was +1650 on mem and +245 on core which was about 3145-3150 on core.


----------



## bleomycin

motivman said:


> Going from artic TP-3 thermal pads on my ram to stock pads meant I could now run +1700 stable on water with temps as low as 10C (I am on a chiller). Before adding my Chiller and using TP-3 pads, I couldn't even run +1500.. go figure. Use the worst pads possible for your vram this gen.


Wow, very interesting; guess that's good news, I get to save $50-$100 on thermal putty and use the included EKWB pads for once. I plan to use PTM7950 on the GPU die. Considering how expensive that stuff is, and the lead time + shipping cost from ModDIY, I didn't want to risk any issues with thermal pads being the incorrect thickness and causing trouble, but it sounds like I should just send it and hope for the best.


----------



## motivman

bleomycin said:


> Wow very interesting, guess that's good news I get to save $50-$100 on thermal putty and use the included ekwb pads for once. I plan to use ptm7950 on the gpu die. Considering how expensive that stuff is and the lead time + shipping cost from moddiy I didn't want to risk any issues with thermal pads being incorrect thickness and causing trouble but sounds like I should just send it and hope for the best.


If you can, use Thermalright TFX on the core. It's very thick and hard to spread, but my GPU hotspot temp has not budged since I installed it; with Gelid GC-Extreme and MX-4, by comparison, hotspot temps slowly increase over time.


----------



## Penguininattack

That's what happens when you LM a card.


----------



## Miao

InvictuSZN said:


> Has anyone here flashed the 666W bios from Galax on the *Founders Edition*? I dont wanna risk anything since it has no bios switch lol


Look at my previous post and the replies; it seems that we can't flash the FE :-|
but I can't tell you exactly why.

edit: probably because of the "board id mismatch"; there were modded versions of nvflash, but not a new one. Is this the reason???


----------



## J7SC

alasdairvfr said:


> Never apologize for besting my score! Benching is one of the great things about having high end kit!
> 
> I ran another couple, I think this version is higher than my previous high score
> 
> View attachment 2591639
> 
> 
> This is on version 3.3.0, my score is the 14006 one (dozens of pages back on this thread):
> View attachment 2591640
> 
> 
> Looks like more people woke up to Blender recently since 3.4.0:
> 
> 
> View attachment 2591642


...for now, I am just trying to get some base data for a multi-GPU render machine, so I have been running a variety of candidates today for the first time to get a feel for the different baselines. I will have to go back and try some higher settings for each - that, and the Octane benchmark, which is a lot nastier (& much longer). Adding the 3090 into one of the loops tomorrow, so no results for that yet.


----------



## Krzych04650

Smackaroy said:


> What techniques are people doing to their 4090 in order for them to get 29000?
> 
> A month ago I managed to get a score of 28200 which Landed me in the top 100 but that score is long off the leader board now.http://www.3dmark.com/pr/1901708


29000 isn't very hard; this is where a well-tuned average 4090 bin lands. You only need like +1200 mem and a bit above 3000 core.

The main component is the GALAX 666W BIOS; it's worth like 600 points more than any other BIOS for me. Second is ReBAR, which adds like 200 points or something. Watercooling also improved the score slightly, despite lowering the VRAM overclock - in games as well, the card just works better. There surely are some other tricks I didn't use.

On the stock 450W Inno3D BIOS I could only do like 27400; with the 630W Neptune I could cross 28K, but I need the Galax, without all the phantom limiters, to reach 29K. I can do up to 29300, but that is not game stable. It feels like a completely different card compared to the stock BIOS and air.


----------



## Blameless

Is there any way to get a zero RPM feature to work on the Galax firmware's fan control?

So far I cannot set a PWM signal below 30% and 30% is almost a thousand RPM on my Windforce...it's not audible, but it is producing some rather chilly GDDR6X temps during long periods of idle, even with 'prefer maximum performance' enabled.


----------



## borant

motivman said:


> Going from artic TP-3 thermal pads on my ram to stock pads meant I could now run +1700 stable on water with temps as low as 10C (I am on a chiller). Before adding my Chiller and using TP-3 pads, I couldn't even run +1500.. go figure. Use the worst pads possible for your vram this gen.


What are idle/load memory temperatures?


----------



## elbramso

Still think it's kinda weird to use terrible mem thermal pads for OC but well... That's the world we live in today^^
Wondering if 2.5 W/mK is bad enough to be good.


----------



## tomerturbo

Hello everyone, I have an Asus Strix RTX 4090 OC. I overclocked the card; in games the hot spot gets to 75C, and in 3DMark Time Spy and Port Royal the hot spot gets to 82C. Is that risky? What is the safe temp for the hot spot?


----------



## leonman44

Lol my 4090 TUF won't lock into the Z690 Hero motherboard's PCIe slot no matter how much I try; it hits the motherboard's cover.


----------



## yzonker

I'm guessing I'm at the bottom of the Kedarwolf runs. lol 

I ran that at 3105 core and +1820 VRAM.


----------



## leonman44

Uninstalled the cover, but now it hits the fittings on the distro plate, RIP. I can't fit it; I must wait for the water block.

I will manually lock the card by raising the motherboard's clip. If it works, it works.


----------



## yzonker

leonman44 said:


> Uninstalled the cover but now it hits on the fittings of the distro plate RIP , I can’t fit it. I must wait for the water block.


Temporary vertical mount or another machine just to test it out before disassembly?


----------



## yzonker

BenchOS for the almost win...


----------



## rahkmae

I don't know what's wrong with my EKW Strix waterblock. I can't play Elite Dangerous VR anymore; the temp goes up to 80C in a station, and this is not normal. I've installed the block 3 times and nothing got better. Is it possible this block is bad? Air cooled, the temps were far better. I have 3 rads and a new pump; the pump has to run at its max setting of 4500 and all fans at max as well. I also changed the liquid in the system and nothing helps. The temp also swings up and down 10 degrees very quickly.


----------



## yzonker

rahkmae said:


> I don't know what's wrong with my EKW Strix waterblock. I can't play Elite Dangerous VR anymore; the temp goes up to 80C in a station, and this is not normal. I've installed the block 3 times and nothing got better. Is it possible this block is bad? Air cooled, the temps were far better. I have 3 rads and a new pump; the pump has to run at its max setting of 4500 and all fans at max as well. I also changed the liquid in the system and nothing helps. The temp also swings up and down 10 degrees very quickly.


Gotta be poor core contact/pressure. What pads did you use? Was it bad when you first mounted it or did it degrade over time?


----------



## leonman44

Nnnnope, and there's no way to vertical-mount it, as my bottom rad is so fat in push/pull config that I lose the bottom PCIe slot, so it simply can't fit.

I will raise the clip manually, and as I see it, it should make full contact with the board. I know this is not the correct way, but I have no other choice; the water block will arrive in 20-30 days at best. It's simply unacceptable that Asus products are not fully compatible with each other.


----------



## yzonker

leonman44 said:


> Nnnnope and no way to vertical mount it as my bottom rad is so fat in push/pull config that I lose the bottom pci express slot so it simply can’t fit.
> 
> I will raise the clip manually and as I see it it should make full contact with the board , I know that this is not the correct way but I have no other choice the water block will arrive in 20-30 days at best. It’s simply unacceptable that the Asus products are not fully compatible between them.


Well I was thinking something much more janky like just set it up outside the case if there's a riser cable long enough.


----------



## leonman44

yzonker said:


> Well I was thinking something much more janky like just set it up outside the case if there's a riser cable long enough.


That could do it, but sadly I don't own something like that. I always believed that risers may result in a minimal perf loss, so I never bought one. Never seen such issues before.


----------



## yzonker

leonman44 said:


> That could do it but sadly I don’t own something like that , I always believed that risers may result in a minimal perf loss so I never bought one. Never seen such issues before.


If I were in your situation, I would at least want to test the card stock. So I would likely just buy a riser cable to use temporarily.


----------



## Blameless

elbramso said:


> Wondering if 2,5W/mK is bad enough to be good.


You should be able to buy a 30*30cm sheet of 1W/mK Laird green foam for less than 20 eurodollarpounds in most of the western world.


----------



## andreiga76

Using my Gigabyte 4090 Master, I got this 11011 on Speedway, default bios, max voltage/power, +190 core (Master bios has higher default clock curve), +1850 mem, air stock cooler.
I can go higher on memory and clocks, but starting to have some crashes in RT games like Metro EE / Control / Quake 2 RTX.
Is this a good card and should I expect more if I watercool it or leave it as it is?
Speedway link

Tried the Galax 666 bios, I get more but with a lot more noise and power usage, I don't like it because I cannot get fans below 30% with Afterburner and it is quite loud, it seems this bios has issues with Master's fans.


----------



## Carillo

andreiga76 said:


> Using my Gigabyte 4090 Master, I got this 11011 on Speedway, default bios, max voltage/power, +190 core (Master bios has higher default clock curve), +1850 mem, air stock cooler.
> I can go higher on memory and clocks, but starting to have some crashes in RT games like Metro EE / Control / Quake 2 RTX.
> Is this a good card and should I expect more if I watercool it or leave it as it is?
> Speedway link
> 
> Tried the Galax 666 bios, I get more but with a lot more noise and power usage, I don't like it because I cannot get fans below 30% with Afterburner and it is quite loud, it seems this bios has issues with Master's fans.


Have you checked that your memory scales with 1850 offset? I bet you get higher scores with +1500


----------



## J7SC

What to do on New Year's Day? A spot of benching, of course... I did a few more runs with Blender, up to just under 14,600. Then I noticed that I was right in the sweet range for VRAM and core temps, so time to take some of the sand out of the bag 

...as usual, I run stock Win 10 OS, regular water-cooled setup with ~24 C ambient and the trusty 5950X...










@Blameless ...I find the Galax vbios to be quite different from the G-G-OC...not so much in outright clocks, but in stability (PLL?), and VRAM runs 2 C warmer. Anyway, below are the bench settings (top) and then the game settings I use. With no extra voltage, the core will hit up to 3045, a 'bit more' with the full meal deal package. Are you using 'curve' or the nvidia-smi.exe utility for your gaming control ?










@alasdairvfr ...noticed that you're nibbling on my FS Ultra a few days ago! That bench scares me re. power envelope w/Galax vbios...also Speedway is potentially badly-written code; needs system reboot to clear things out of memory.


----------



## rahkmae

yzonker said:


> Gotta be poor core contact/pressure. What pads did you use? Was it bad when you first mounted it or did it degrade over time?


Before there were Grizzly pads, but I changed to the EKW original pads; the pads are not the problem. I think these little screws do not give proper contact. There have always been screws with springs, and now there aren't...


----------



## motivman

borant said:


> What are idle/load memory temperatures?


result with 16C water.


----------



## Blameless

J7SC said:


> @Blameless ...I find the Galax vbios to be quite different than the G-G-OC...not so much in outright clocks, but stability (PLL?) and VRAM runs 2 C warmer . Anyway, below are the bench settings (top) and then the game settings I use. With no extra voltage, the core will hit up to 3045, a 'bit more' with the full meal deal package. Are you using 'curve' or the nvidia-smi.exe utility for your gaming control ?


I'm using MSI AB's curve editor.

The Galax BIOS likely has a different load-line or VRM PWM frequency, but I'm not noticing a consistent difference in clocks or performance that can't be explained via power limit, and temperatures are within margin of error on my stock cooled Windforce. I didn't play around too much with the Gaming OC BIOS. It did allow for zero rpm mode, but the curve was even stranger than the Galax firmware and it was causing my middle fan to lag behind the others significantly.

Right now the Galax seems to be the best bet as I'm shooting for settings that will run anything I reasonably can without hitting any limiters and is also unconditionally stable. Only the lack of a zero rpm fan mode, and the potential for the GDDR6X to fall to temps lower than is stable in memtest_vulkan, makes it less than ideal.


----------



## bleomycin

motivman said:


> If you can, use Thermalright TFX on the core. its very thick and hard to spread, but my gpu hotspot temp has not budged ever since I installed it, compared to Gelid Extreme and MX4, hotspot temps slowly increase over time.


Thanks for the info! That was my original plan and I actually bought a tube of TFX after much research, glad to hear it's been working out well for you. How did you apply it? Did you spread a very thin layer over the entire core? A thick layer on the entire core? Something else? I used thermal grizzly and mx-4 on my 2080ti and the hotspot temp rose drastically over time. Never again.

With my card going in a hardline build the pain of needing to disassemble it basically ever again makes me extremely motivated to go with whatever product has the highest chance of success on the very first shot. This is why I was considering PTM7950 + thermal putty but it sounds like this stupid memory has thrown a wrench into those plans! I don't fully trust EK to have provided the correct thickness thermal pads out of the box.


----------



## yzonker

Blameless said:


> I'm using MSI AB's curve editor.
> 
> The Galax BIOS likely has a different load-line or VRM PWM frequency, but I'm not noticing a consistent difference in clocks or performance that can't be explained via power limit, and temperatures are within margin of error on my stock cooled Windforce. I didn't play around too much with the Gaming OC BIOS. It did allow for zero rpm mode, but the curve was even stranger than the Galax firmware and it was causing my middle fan to lag behind the others significantly.
> 
> Right now the Galax seems to be the best bet as I'm shooting for settings that will run anything I reasonably can without hitting any limiters and is also unconditionally stable. Only the lack of a zero rpm fan mode, and the potential for the GDDR6X to fall to temps lower than is stable in memtest_vulkan, makes it less than ideal.


Of course I don't have any fans connected (waterblock), but GPUZ shows 0% when my card (TUF) is idling with the Galax bios. You don't get that, or is it just when you use the AB custom curve that it won't do fan stop? Just fiddling with the AB custom curve, turning on "Override zero fan speed with hardware curve" got it to go back to 0% at idle.


----------



## J7SC

Blameless said:


> I'm using MSI AB's curve editor.
> 
> The Galax BIOS likely has a different load-line or VRM PWM frequency, but I'm not noticing a consistent difference in clocks or performance that can't be explained via power limit, and temperatures are within margin of error on my stock cooled Windforce. I didn't play around too much with the Gaming OC BIOS. It did allow for zero rpm mode, but the curve was even stranger than the Galax firmware and it was causing my middle fan to lag behind the others significantly.
> 
> Right now the Galax seems to be the best bet as I'm shooting for settings that will run anything I reasonably can without hitting any limiters and is also unconditionally stable. Only the lack of a zero rpm fan mode, and the potential for the GDDR6X to fall to temps lower than is stable in memtest_vulkan, makes it less than ideal.


Yeah, Galax clocks are not appreciably different from Gi-G-OC stock, but clearly it is much more stable at higher clocks (load line, PLL, other?), noting that I usually have the NV panel set to 'optimal power' rather than 'prefer maximum performance'.

On VRAM, I mentioned before that there seem to be select 'memory holes' (with either vbios) as there are some values around +1700 which work for my setup (per above), but immediately above or below in terms of slider adjustments for VRAM > no cigar.

Below is a comp run with memtest_vulkan of both Galax and Gigabyte at +1526...I had GPUZ open, and the Galax peaked 2 C higher than the G-G-OC (which is not a bad thing for my VRAM w/TG-10 thermal putty).

FYI, laid down a few very good runs with Octane bench...that bench is loooong, but it's also a good tool to test out VRAM stability


----------



## alasdairvfr

J7SC said:


> @alasdairvfr ...noticed that you're nibbling on my FS Ultra a few days ago! That bench scares me re. power envelope w/Galax vbios...also Speedway is potentially badly-written code; needs system reboot to clear things out of memory.
> View attachment 2591742


i limited power to 110% but it hit that 605w no problem. giga oc bios never really passed 570-580.
Also noticed a failed run in Speedway can really muck up the system. actually i flashed off galax last night as I was getting random reboots at idle. was showing up as an interconnect whea on core 14 but my system had a few weeks uptime prior to flashing, then 3 random reboots in 24h.

12h later still seems okay.


----------



## J7SC

alasdairvfr said:


> i limited power to 110% but it hit that 605w no problem. giga oc bios never really passed 570-580.
> Also noticed a failed run in Speedway can really muck up the system. actually i flashed off galax last night as I was getting random reboots at idle. was showing up as an interconnect whea on core 14 but my system had a few weeks uptime prior to flashing, then 3 random reboots in 24h.
> 
> 12h later still seems okay.


hmm, couldn't you restrict clocks (core, VRAM) w/Galax on boot-up / Windows via profile 1 in MSI AB which should also stick around for 'sleep' mode ?


----------



## alasdairvfr

I'm looking to try MLPerf out in the next few days; I think it will make for a fun, slightly more involved benchmark for high-end systems. Anyone with multi-GPU rigs should be able to get interesting results.

MLCommons 

A guide (a bit older but shows the rough steps):
MLPerf: Getting your feet wet with benchmarking ML workloads | by Ramesh Radhakrishnan | Analytics Vidhya | Medium

I'll be doing it via WSL Ubuntu 20.04, but better performance can be had with native Linux on a spare NVMe, ofc


----------



## Blameless

yzonker said:


> Of course I don't have any fans connected (waterblock), but GPUZ shows 0% when my card (TUF) is idling with the Galax bios. You don't get that, or is it just when you use the AB custom curve that it won't do fan stop? Just fiddling with the AB custom curve, turning on "Override zero fan speed with hardware curve" got it to go back to 0% at idle.


I cannot seem to set below 30%, even with custom curves in MSI AB. I'll play with it some more later, but I don't want MSI AB running in the background, so I'd prefer to keep the firmware control mode, which also doesn't seem to take below 30%.



J7SC said:


> On VRAM, I mentioned before that there seem to be select 'memory holes' (with either vbios) as there are some values around +1700 which work for my setup (per above), but immediately above or below in terms of slider adjustments for VRAM > no cigar.


I'm not sure where the timing straps fall on the 4090 yet, but I generally have success at avoiding VRAM frequency holes on NVIDIA cards when sticking to whole number multipliers of the memory reference clock. In this case that should be +1512MHz, +1620MHz, and +1728MHz.
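Those three values imply a ~108 MHz step (+1512 = 14 × 108, +1620 = 15 × 108, +1728 = 16 × 108). A minimal sketch, assuming that step size is right (the actual reference clock and strap spacing on the 4090 aren't confirmed here), to snap an arbitrary Afterburner offset to the nearest whole multiple:

```python
# Snap a VRAM offset to whole multiples of an assumed 108 MHz step,
# inferred from the +1512/+1620/+1728 values in the post above.
# The true strap spacing on the 4090 is an assumption, not confirmed.
STEP_MHZ = 108

def snap_offset(offset_mhz: int) -> int:
    """Round an arbitrary offset to the nearest whole multiple of STEP_MHZ."""
    return round(offset_mhz / STEP_MHZ) * STEP_MHZ

# Candidates near the offsets people are running in this thread:
candidates = [snap_offset(x) for x in (1500, 1650, 1700, 1800)]
print(candidates)  # [1512, 1620, 1728, 1836]
```

Nothing magic about the helper; it just keeps you on the grid instead of landing between multiples when stepping the slider.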


----------



## alasdairvfr

J7SC said:


> hmm, couldn't you restrict clocks (core, VRAM) w/Galax on boot-up / Windows via profile 1 in MSI AB which should also stick around for 'sleep' mode ?


I'm actually not sure what clock/core combo is safe, or if that was even the issue. If it's a CPU/memory issue (an idle-crash WHEA is a CPU/memory issue under normal circumstances) then I will work to resolve that and flash back to Galax.

It's just weird, since I have been running solid for months and this one BIOS seems to correlate


----------



## motivman

bleomycin said:


> Thanks for the info! That was my original plan and I actually bought a tube of TFX after much research, glad to hear it's been working out well for you. How did you apply it? Did you spread a very thin layer over the entire core? A thick layer on the entire core? Something else? I used thermal grizzly and mx-4 on my 2080ti and the hotspot temp rose drastically over time. Never again.
> 
> With my card going in a hardline build the pain of needing to disassemble it basically ever again makes me extremely motivated to go with whatever product has the highest chance of success on the very first shot. This is why I was considering PTM7950 + thermal putty but it sounds like this stupid memory has thrown a wrench into those plans! I don't fully trust EK to have provided the correct thickness thermal pads out of the box.


Soak the tube in boiling water for about 10-15 minutes, then spread a very thick layer over the GPU die.


----------



## J7SC

alasdairvfr said:


> I'm looking to try MLPerf out in the next few days, i think it will make for a fun, slightly more involved benchmark for highend systems. anyone with multigpu rigs should be able to have interesting results
> 
> MLCommons
> 
> A guide (a bit older but shows the rough steps):
> MLPerf: Getting your feet wet with benchmarking ML workloads | by Ramesh Radhakrishnan | Analytics Vidhya | Medium
> 
> I'll be doing via WSL Ubuntu 20.04 but better performance can be had on a spare nvme with native Linux ofc


That looks *very interesting*. I have only got my toes wet re. machine learning for work-related apps, but I'd like to get much more into it. I've got a Ubuntu setup, though not with the 4090 yet.


----------



## andreiga76

Carillo said:


> Have you checked that your memory scales with 1850 offset? I bet you get higher scores with +1500


First I tried +0, +500, +1000, +1300, +1500, +1600, +1700, +1800, +1900 and +2000 to see how performance scales and when it crashes; every step gained some performance, and at +2000 it crashed after a few runs in Speedway (it puts a +60% load on the mem controller).
I ran successfully at +1900, but it was unstable in some RTX games after playing for some hours over a few days.
For the last week I have used +1850 with no issues; let's hope it stays this way.
What I found strange is what happens after +1800: between +1500 and +1800 it goes up with every step but not by much, yet at +1850 I see a much bigger delta compared to previous steps (+100 vs the usual +50 on Speedway scores).
I also tested with ECC on to see when performance starts to drop: at +1900 I see a very small decrease or it stays the same, which I think is confirmed by the games, because errors start to appear.


----------



## Carillo

andreiga76 said:


> First I tried +0, +500, +1000, 1300, 1500, 1600, 1700, 1800, 1900 and 2000 to see how it goes in performance and when it crashes, every time I gained some performance, at 2000 crashed after a few runs in Speedway (it has a +60% load on mem controller).
> I ran successfully at 1900, but unstable in some RTX games while playing for some hours for a few days.
> For the last week I used 1850 and no issues, let's hope it stays this way.
> What I found strange what happens after +1800, between +1500 and +1800 it goes up with every step but not much, if I set +1850 I see a much bigger delta comparing to previous steps (+50 vs +100 on Speedway scores).
> I tested also with ECC on to see when it starts to go down at perf, at +1900 I see a very little decrease or stays the same, confirmed by games I think because errors start to appear.


Sounds like you got some really good memory silicon then, congrats


----------



## alasdairvfr

J7SC said:


> That looks *very interesting*. I have got only my toes wet re. machine learning for work related apps, but like to get much more into it. I've got a Ubuntu setup, though not with the 4090 yet.


I have been playing around with ML for a few months now. My old rig is now a full-time ML system doing RL (AWS DeepRacer) on the 3080 Ti + my old X370 board and 3900XT.

I also do a little dabbling with it at work, which led me to look into how one would bench/stress an ML system or cluster.

If you don't have Linux on a 4090 system, with WSL + Docker Desktop for Windows you can get the job done easily enough


----------



## J7SC

alasdairvfr said:


> I have been playing around with ML for a few months now. my old rig is bow a FT ML system doing RL (AWS DeepRacer) on the 3080ti + my old x370 board and 3900xt.
> 
> I also do a little dabbling with it at work which caused me to look in to how one would bench/stress an ML system or cluster.
> 
> if you dont have Linux on a 4090 system, WSL + Docker desktop for Windows you can get the job done easily enough


Thanks !  ...as much as I hate Windows, that's where the 4090 sits. But I'll also try one of the 2080 Tis in Ubuntu.


----------



## leonman44

Ok guys, I did it. I pressed the card up to the point where I could see the plastic clip just starting to move, then I manually flipped it with my finger. It looks like it's fully inserted, but there may be a very tiny amount still to go in.









Booted into Windows flawlessly, and the reworked PC tower looks good.








The new cable is very stiff and it was hard to bend it like that, but it's not the adapter, and I made sure the connector is fully seated on all corners.

Currently uninstalling the Nvidia GPU drivers using DDU and installing the latest ones.


----------



## StreaMRoLLeR

andreiga76 said:


> First I tried +0, +500, +1000, 1300, 1500, 1600, 1700, 1800, 1900 and 2000 to see how it goes in performance and when it crashes, every time I gained some performance, at 2000 crashed after a few runs in Speedway (it has a +60% load on mem controller).
> I ran successfully at 1900, but unstable in some RTX games while playing for some hours for a few days.
> For the last week I used 1850 and no issues, let's hope it stays this way.
> What I found strange what happens after +1800, between +1500 and +1800 it goes up with every step but not much, if I set +1850 I see a much bigger delta comparing to previous steps (+50 vs +100 on Speedway scores).
> I tested also with ECC on to see when it starts to go down at perf, at +1900 I see a very little decrease or stays the same, confirmed by games I think because errors start to appear.


Don't recommend going over +1500 for 24/7 gaming. Here, 1 user reported BSODs and crashing after running +1700 mem daily for gaming. He then dialed back to +1500 and voila! No BSOD. You might degrade that golden memory at +1900 

Keep it for benchmarks


----------



## Benni231990

StreaMRoLLeR said:


> Dont recommend going over 1500 for 24/7 gaming. Here 1 user reported BSOD and crashing after daily 1700 mem for gaming. He then dialed back to 1500 and voila ! no bsod. You might degrade that golden memory at 1900
> 
> Keep it for benchmark


I can confirm that I can easily run +1850 for benching, but for 24/7 gaming I run +1600 with no BSOD, flicker, or black screen


----------



## leonman44

Just after ~20 benchmarks, I think this is relatively close to what this card can do: I scored 10 923 in Speed Way

Do you think my card is at least average?

Also, the coil whine is killing me; I launched God of War and with only 200 fps the card screams so hard! I turned up my soundbar but I could still hear it.
Is this normal for these cards? My 2080 was completely silent; only in uncapped menus could I hear something minor.


----------



## yzonker

leonman44 said:


> Just after ~20 benchmarks i think that this is relatively close to what this card can do: I scored 10 923 in Speed Way
> 
> Do you think my card is at least average ?
> 
> Also the coil whine is killing me i launched god of war and with only 200 fps the card screams so hard ! i opened my soundbar but i could still hear it.
> Is this also normal for these cards? My 2080 was completely silenced , only on uncapped menus i could hear something minor.


Yea Asus has coil whine issues this generation. My TUF isn't loud, but I can hear it when the room is quiet.


----------



## Nizzen

yzonker said:


> Yea Asus has coil whine issues this generation. My TUF isn't loud, but I can hear it when the room is quiet.


My 4090 Strix had a bit of coil whine at the start. Now it's almost impossible to hear, even on an open test bench. The same happened with my old 3090 Strix: coil whine from the start, but now it's gone. I don't know why.


----------



## J7SC

For gaming rather than benching, I typically use the stock bios (yay for dual-bios cards) with no extra PL or voltage, core clocks at ~3030 and VRAM at +1500 (24 Gbps). Runs nice and cool while handling all the games a-ok at ultra settings.
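The "+1500 (24 Gbps)" pairing implies the Afterburner offset adds twice its value in Mbps to the stock data rate. A quick sketch of that conversion, assuming a 21 Gbps stock 4090 (the doubling is inferred from the numbers in this thread, not from NVIDIA documentation):

```python
# Convert an Afterburner VRAM offset (MHz) to effective GDDR6X data rate,
# assuming a 21 Gbps stock 4090 and that the offset applies to the
# double-data-rate clock (+1000 MHz offset -> +2 Gbps effective).
# Assumption inferred from "+1500 (24 Gbps)" above, not official docs.
STOCK_GBPS = 21.0

def effective_gbps(offset_mhz: float) -> float:
    return STOCK_GBPS + 2 * offset_mhz / 1000

print(effective_gbps(1500))  # 24.0, matching the "+1500 (24 Gbps)" above
print(effective_gbps(1200))  # ~23.4, a typical well-tuned daily offset
```

Handy for sanity-checking what a given slider value actually means in memory bandwidth terms.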


----------



## Talon2016

leonman44 said:


> Lol my 4090 tuf won’t lock inside the Z690 Hero motherboard pci express no matter how much I try it hits the motherboards cover.
> View attachment 2591680
> 
> View attachment 2591681


I had this exact issue with my Z690 Hero board and a TUF 4080 and a Zotac 4080. Neither would lock in the slot; the latch would never close. I actually thought I was going to break the slot or card trying to force it in.

I think the manager at MC thought I was making it up. Glad to see I'm not alone and it wasn't just me.

My FE 4090 slotted in with no issue.


----------



## StreaMRoLLeR

Nizzen said:


> My 4090 strix had a bit coil whine at the start. Now it's almost impossible to hear even with open testbench. Same happen with my old 3090 strix. Coilwhine from the start, but now it's gone. I don't know why.


For me, MSI's HCI coils whine like crazy when cold booted > in a game menu > first 2-3 mins into a game > the whine is reduced to a minimum. Must be the temps?


----------



## leonman44

Talon2016 said:


> I had this exact issue with my Z690 Hero board and a TUF 4080 and Zotac 4080. Neither would lock in the slot. Latch would never close. I actually thought I was going to break the slot or card trying. To force it in.
> 
> I think the manager at MC thought I was making it up. Glad to see I’m not alone and it wasnt just me.
> 
> My FE 4090 slotted no issue.


At the end of the day it fits just fine; it's just that the latch needs to be pushed back with your finger. I believe it's not the motherboard's fault, but rather the backplates of these cards being wider than the previous gen. I even uninstalled the motherboard's chip cooler that hit the card, and then it started to hit my distro plate's fittings. When I install my waterblock I expect it to fully insert into the slot.



yzonker said:


> Yea Asus has coil whine issues this generation. My TUF isn't loud, but I can hear it when the room is quiet.





Nizzen said:


> My 4090 strix had a bit coil whine at the start. Now it's almost impossible to hear even with open testbench. Same happen with my old 3090 strix. Coilwhine from the start, but now it's gone. I don't know why.


Guys, I recorded it. Did anyone have something like that during normal gameplay?

My 4090’s Coil Whine 

If it goes away I will keep it, but daaamn this card really screams. I should send it to The Voice.


----------



## StreaMRoLLeR

leonman44 said:


> In the end of the day it fits just fine it just that the latch needs to be pushed with your finger back. I believe it’s not a motherboards fault but rather the backplates of these cards being wider than the previous gen. I even uninstalled the motherboards chip cooler that hit with the card and then it started to hit on my distro plates fittings. When I install my waterblock I expect it to 100% fully insert the slot.
> 
> 
> 
> 
> Guys I recorded it , did anyone had something like that during normal gameplay?
> 
> My 4090’s Coil Whine
> 
> if it goes away I will keep it , but daaamn this card really screams. I should send it to the voice.


It will never "go away"; it's been observed to lessen over time, or you can do UV + OC to keep the card below 350W to reduce coil whine.

Honestly, your audio clip seems the worst so far. My Suprim sounded like that in the first week


----------



## Mozart23

Guys I need your help to decide. Would you return a card with max OC 190core and 1400mem? It's the Suprim X. Overall I like it a lot and it has pretty quiet coil whine which is nice. But I'm little bugged that it can't do 1500mem :/. Am I spoiled or smtng xd? I still have 2 days to return it and try another one, but I just don't know, if it is worth the hassle.


----------



## elbramso

Mozart23 said:


> Guys I need your help to decide. Would you return a card with max OC 190core and 1400mem? It's the Suprim X. Overall I like it a lot and it has pretty quiet coil whine which is nice. But I'm little bugged that it can't do 1500mem :/. Am I spoiled or smtng xd? I still have 2 days to return it and try another one, but I just don't know, if it is worth the hassle.


So u'd like to return a card because it's lagging +100 on the mem? Wow, hardware needs to be pretty perfect these days...
My advice: keep it - the next card u get might only hit +150 core and +750 mem...


----------



## StreaMRoLLeR

Mozart23 said:


> Guys I need your help to decide. Would you return a card with max OC 190core and 1400mem? It's the Suprim X. Overall I like it a lot and it has pretty quiet coil whine which is nice. But I'm little bugged that it can't do 1500mem :/. Am I spoiled or smtng xd? I still have 2 days to return it and try another one, but I just don't know, if it is worth the hassle.


lol

You have a no-coil-whine Suprim and you are sad? Please man, don't be ridiculous  You have the best PCB + build quality + the most silent GPU, with no coil whine. Go thank RNG Jesus for it.

For +1400 mem, it's above average


----------



## Nico67

rahkmae said:


> I don't know what's wrong with my EKW Strix waterblock. I can't play Elite Dangerous VR anymore; the temp goes up to 80C in a station, and this is not normal. I've installed the block 3 times and nothing got better. Is it possible this block is bad? Air cooled, the temps were far better. I have 3 rads and a new pump; the pump has to run at its max setting of 4500 and all fans at max as well. I also changed the liquid in the system and nothing helps. The temp also swings up and down 10 degrees very quickly.


Put a razor blade across the gpu contact pad on the block and see if you can see light under the edges. Mine was bad as they usually are, so I lapped it until it was good. Check the four standoffs around the gpu are tight too, and may even need to lap those a little.
EK are pretty bad when it comes down to gpu pad finish, the rest doesn't matter so much.


----------



## elbramso

Nico67 said:


> Put a razor blade across the gpu contact pad on the block and see if you can see light under the edges. Mine was bad as they usually are, so I lapped it until it was good. Check the four standoffs around the gpu are tight too, and may even need to lap those a little.
> EK are pretty bad when it comes down to gpu pad finish, the rest doesn't matter so much.


Ouch, u really had to lap an already overpriced block? That sounds awful tbh


----------



## Mozart23

StreaMRoLLeR said:


> lol
> 
> You have a no coil whine Suprim and you are sad ? Please man dont be ridiculous  You have the best pcb + build quality +most silent gpu and no coil whine card.Go thank RNG Jesus for it.
> 
> For 1400 MEM its above average


Thank you. I know I must sound like an idiot. It's just the card was pretty expensive and I wanted it to be perfect in every way, because I want to keep that card for some time. I have to stop browsing these forums for some time. It's like everybody can hit 1800mem and over 3100MHz on core and it makes me feel like I lost the lottery hard, which I would not actually mind if the card was not that expensive.


----------



## Nico67

elbramso said:


> Ouch, u really had to lap an already overpriced block? That sounds awful tbh


I'd say very few gpu blocks are good, outside those starting to use cold plate inserts. EK is still the most accessible and their water channelling is good at least.


----------



## elbramso

Mozart23 said:


> Thank you. I know I must sound like an idiot. It's just the card was pretty expensive and I wanted it to be perfect in every way, because I want to keep that card for some time. I have to stop browsing these forums for some time. It's like everybody can hit 1800mem and over 3100MHz on core and it makes me feel like I lost the lottery hard, which I would not actually mind if the card was not that expensive.


I absolutely feel this^^

My system is always great until someone else comes online and smacks my benchmark scores or whatever 🤣
My Suprim is great but kinda noisy and I know it's gonna annoy me once I start gaming again


----------



## mirkendargen

Nico67 said:


> I'd say very few gpu block are good, out side those starting to use cold plate inserts. EK is still the most accessible and there water channelling is good at least


So far there have been multiple reports of messed-up EK blocks, one or two (I forget which) reports of not-the-greatest-performing Phanteks blocks (unclear how popular these are, to decide if it's systemic), and 0 reports of bad Bykski blocks (for 4090s).

Seems to me everyone should just save $100+ and go with Bykski.


----------



## leonman44

Mozart23 said:


> Thank you. I know I must sound like an idiot. It's just the card was pretty expensive and I wanted it to be perfect in every way, because I want to keep that card for some time. I have to stop browsing these forums for some time. It's like everybody can hit 1800mem and over 3100MHz on core and it makes me feel like I lost the lottery hard, which I would not actually mind if the card was not that expensive.


And what is the real difference in gaming anyway? If you want to break benchmark records, it won't do it; in your position, if gaming is all you want, I wouldn't even bother. My card can't even do 3000 on core gaming-stable and I just don't care - the card's performance is massive. It's the coil whine that destroys me.


----------



## StreaMRoLLeR

Mozart23 said:


> Thank you. I know I must sound like an idiot. It's just the card was pretty expensive and I wanted it to be perfect in every way, because I want to keep that card for some time. I have to stop browsing these forums for some time. It's like everybody can hit 1800mem and over 3100MHz on core and it makes me feel like I lost the lottery hard, which I would not actually mind if the card was not that expensive.


A memory overclock over +1500mhz is actually risky, so in theory, if you want the best results you would stay at +1500 max (for gaming, not benchmarks). Your mem can do +1400, so it's only a 100mhz loss. Keep in mind one user here reported getting BSODs and crashes after gaming at +1700 for two weeks; he then had to dial back to +1500. Also, +190 core and no CW is a real winner.


----------



## pantsoftime

Hi guys. I just got a new Gigabyte Gaming OC up and running yesterday. So far I've had some reasonable luck - I can bench stable up to +2000 on mem but closer to +1500 for games (still getting this dialed in). The core currently maxes out around 3045MHz - I've got a waterblock on the way so I'll work on this more when I can cool it properly. 

Anyway - I did want to report that I flashed to the Aorus Master BIOS and saw a small but measurable improvement in benchmark scores. For anyone else using Gigabyte BIOSes, I recommend the Aorus Master BIOS over the Gaming OC BIOS. Others here have reported similar results, but I wanted to confirm that I see the same improvements.


----------



## Mozart23

leonman44 said:


> And what is the real difference in gaming anyway ? If you want to break benchmark records then it won’t do it , in your position If gaming is all you want I wouldn’t even bother. My card can’t even do 3000 on core gaming stable and I just don’t care the cards performance is massive it’s the coil whine that destroys me


Sorry to hear that. I had massive coil whine on my 2080Ti and so I know how frustrating it is


----------



## Mozart23

StreaMRoLLeR said:


> Over +1500 mhz ram overclock is actually dangerous. So in theory if you want to achieve best results you would max stay at 1500mhz ( gaming purpose not benchmark). Your mem can do 1400mhz. So its only 100mhz loss. Keep in mind there is 1 user here reported getting Bsod and crashes after playing at +1700mhz for 2 week. Then he had to dial back for 1500mhz. Also +190 core and no cw is real winner


Thank you again. I really appreciate it. I don't feel that bad about my card anymore.


----------



## rahkmae

Nico67 said:


> Put a razor blade across the gpu contact pad on the block and see if you can see light under the edges. Mine was bad as they usually are, so I lapped it until it was good. Check the four standoffs around the gpu are tight too, and may even need to lap those a little.
> EK are pretty bad when it comes down to gpu pad finish, the rest doesn't matter so much.


What temperatures do you get in the Port Royal stress test at default settings?


----------



## Krzych04650

mirkendargen said:


> So far there have been multiple reports of messed up EK blocks, one or two (i forget which) reports of not-the-greatest performing Phanteks blocks (unclear how popular these are to decide if it's systemic) and 0 reports of bad Bykski blocks (for 4090's).
> 
> Seems to me everyone should just save $100+ and go with Bykski.


Yea I just watched one video yesterday where Phanteks block had 29C delta at just 500W and it had some missing/misaligned standoffs for the backplate so he could only use half the screws  As good as this block looks, 30C delta is approaching AIO territory.

I am very satisfied with my Alphacool ES block. Very nice and unique design, it comes with single slot IO bracket, nice quality, everything fits perfectly, 18C delta at 500W and 23C delta at 620W which is the highest I can get in any game, basically no problems. Maybe except that the ports are too close together so you cannot connect two regular 10/16 compression fittings next to each other, you have to use something else for at least one port, which is a strange oversight and will surely catch some people off guard.


----------



## elbramso

Krzych04650 said:


> Yea I just watched one video yesterday where Phanteks block had 29C delta at just 500W and it had some missing/misaligned standoffs for the backplate so he could only use half the screws  As good as this block looks, 30C delta is approaching AIO territory.
> 
> I am very satisfied with my Alphacool ES block. Very nice and unique design, it comes with single slot IO bracket, nice quality, everything fits perfectly, 18C delta at 500W and 23C delta at 620W which is the highest I can get in any game, basically no problems. Maybe except that the ports are too close together so you cannot connect two regular 10/16 compression fittings next to each other, you have to use something else for at least one port, which is a strange oversight and will surely catch some people off guard.
> View attachment 2591839
> View attachment 2591840
> View attachment 2591841
> View attachment 2591842


PoE


----------



## Artjsalina5

Wow, all this talk about lapping EK blocks and large phanteks coolant core deltas. I have the standard alphacool aurora block, and it performs absolutely stellar with a nice clean look to boot. The only problem I have with it is that the thermal pads that came with it are so good that the memory doesn't break 40C, unless I intentionally heat up my coolant to 37C or so, which I won't do.

This loses me about 300mhz on memory OC, but eh, who cares. Card is fast already. The card now has better core OC headroom due to the cooler core temps, so that's nice, I guess... But then there's a faint coil whine, so undervolted OC with 80% PL is proper for daily use.

Interested in trying that Thermaltake XTX next time. I currently have a 7C core-to-hotspot delta, which is great in my opinion, but it will probably go up over time due to pump-out. Put the tube in boiling water? Really?


----------



## Blameless

Did some coil whine testing on this Windforce.

Utterly inaudible below ~600 fps on an open test bench at ~1m. I can hear it in Night Raid (2k fps), but only barely. Still mild at 6-7k fps (ShaderMark 2.1, from 2004). Probably the least coil whine I've had on a GPU in a decade.


----------



## motivman

pantsoftime said:


> Hi guys. I just got a new Gigabyte Gaming OC up and running yesterday. So far I've had some reasonable luck - I can bench stable up to +2000 on mem but closer to +1500 for games (still getting this dialed in). The core currently maxes out around 3045MHz - I've got a waterblock on the way so I'll work on this more when I can cool it properly.
> 
> Anyways - I did want to report that I flashed to the Aorus Master BIOS and saw a small but measurable improvement in benchmark scores. For anyone else using Gigabyte BIOSes I do recommend the Aorus Master BIOS over the Gaming OC BIOS. Others here have reported similar results but I wanted to confirm I that see the same improvements.


Flash the galax 666w bios and call it a day once you get it on a waterblock. Not sure how the galax bios plays with the gigabyte stock fans, though.
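If you do go the cross-flash route, this is roughly the sequence people script around TechPowerUp's nvflash64. The flags here (`--save`, `--protectoff`, `-6` for the subsystem-ID override) are the commonly documented ones, but treat this as a sketch and double-check them against the nvflash version you actually download; the ROM filename is just an example:

```python
# Hedged sketch of the usual nvflash64 backup -> unlock -> flash sequence.
# Flag names follow common nvflash documentation; verify against your copy.
import subprocess

def nvflash_steps(new_bios, backup="backup.rom", dry_run=True):
    """Build (and optionally run) the flash sequence. dry_run=True only returns it."""
    steps = [
        ["nvflash64", "--save", backup],  # back up the current BIOS first
        ["nvflash64", "--protectoff"],    # disable the EEPROM write protect
        ["nvflash64", "-6", new_bios],    # -6 overrides the PCI subsystem ID mismatch
    ]
    if not dry_run:
        for cmd in steps:
            subprocess.run(cmd, check=True)
    return steps

steps = nvflash_steps("galax_666w.rom")  # example filename, not a real download
```

Keep `dry_run=True` until you've read up on how the donor BIOS drives your card's fans - a backup ROM is the only way back.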


----------



## Pk1

Krzych04650 said:


> Yea I just watched one video yesterday where Phanteks block had 29C delta at just 500W and it had some missing/misaligned standoffs for the backplate so he could only use half the screws  As good as this block looks, 30C delta is approaching AIO territory.
> 
> I am very satisfied with my Alphacool ES block. Very nice and unique design, it comes with single slot IO bracket, nice quality, everything fits perfectly, 18C delta at 500W and 23C delta at 620W which is the highest I can get in any game, basically no problems. Maybe except that the ports are too close together so you cannot connect two regular 10/16 compression fittings next to each other, you have to use something else for at least one port, which is a strange oversight and will surely catch some people off guard.
> View attachment 2591839
> View attachment 2591840
> View attachment 2591841
> View attachment 2591842


That block looks gorgeous! Is that the same as this? (Just not MSI version). Alphacool Eisblock Aurora Acryl GPX-N RTX 4090 Suprim with Backplate


----------



## Artjsalina5

Pk1 said:


> That block looks gorgeous! Is that the same as this? (Just not MSI version). Alphacool Eisblock Aurora Acryl GPX-N RTX 4090 Suprim with Backplate


No it is this one. ES Block


----------



## J7SC

Krzych04650 said:


> Yea I just watched one video yesterday where Phanteks block had 29C delta at just 500W and it had some missing/misaligned standoffs for the backplate so he could only use half the screws  As good as this block looks, 30C delta is approaching AIO territory.
> 
> I am very satisfied with my Alphacool ES block. Very nice and unique design, it comes with single slot IO bracket, nice quality, everything fits perfectly, 18C delta at 500W and 23C delta at 620W which is the highest I can get in any game, basically no problems. Maybe except that the ports are too close together so you cannot connect two regular 10/16 compression fittings next to each other, you have to use something else for at least one port, which is a strange oversight and will surely catch some people off guard.
> View attachment 2591839
> View attachment 2591840
> View attachment 2591841
> View attachment 2591842


That Alphacool ES block looks and appears to work great ! 

As to the ports being a bit too close together, I have run into that on multiple occasions with other water-cooling gear, i.e. certain CPU blocks and also mini-reservoirs (e.g. the Swiftech MCRES Rev 2)... very annoying during assembly.


----------



## Pk1

Artjsalina5 said:


> No it is this one. ES Block


Thanks for the info! I have the Bykski block coming but I'm not in love with the look of it. Kinda disappointed with the waterblocks available for MSI. The Alphacools look great though.


----------



## Artjsalina5

Pk1 said:


> Thanks for the info! I have the Byski block coming but I'm not in love with the look of it. Kinda disappointed with the waterblocks available for MSI. The Alphacools look great though.


Yes, Alphacool really matured in design this generation. Their 30-series blocks were chock-full of o-rings and channels, like a distro plate. They have good performance too. By all accounts I've seen, they outperform the EK block for a fraction of the price... That being said, I still ordered the HK Pro.

Quite disappointed with the Granzon block for the FE, however. The design was slick, but the problems with the 12VHPWR connector, the flimsy IO shield, the incorrect installation image, and the misaligned mounting holes make it a hard pass. That's not to mention the stupid 1.8mm pads Bykski equips it with.


----------



## J7SC

....I kind of like the look of the Bykski block (here on my Giga G-OC), though I probably would have been just as happy with the Alphacool... fyi, performance of the 4090 + Bykski combo is great so far


----------



## KingEngineRevUp

StreaMRoLLeR said:


> Keep in mind there is 1 user here reported getting Bsod and crashes after playing at +1700mhz for 2 week. Then he had to dial back for 1500mhz.


This could have been anything, even a driver update or his house being colder now than it was before.


----------



## StreaMRoLLeR

KingEngineRevUp said:


> This could have been anything, even a driver update or his house being colder now than it was before.


Maybe, but he reported back and said that after dialing back to +1500 all of his problems were gone. I would link it but it's at least 30 pages back.


----------



## KingEngineRevUp

StreaMRoLLeR said:


> No. He reported back and said after dialing to 1500mhz, all of his problems were gone.


Again, an updated driver can undo a stable overclock, or even your house being colder can cause an unstable OC on the memory.

I'm sure there are people here who have been doing +1700 since launch who probably don't have a problem.

I just don't think one person experiencing this makes it 100% fact right now. It just means it's a topic of interest to look out for, to see if more users experience the same or not.


----------



## Nd4spdvn

Mozart23 said:


> Guys I need your help to decide. Would you return a card with max OC 190core and 1400mem? It's the Suprim X.


Man, a Suprim X with +190 core is like 3075-3090 requested clocks. +1400 mem is also good. Overall I'd say a balanced card where it matters -> games! My Suprim X was also expensive (sic) and good for +165 core and JUST +972 mem. Reading this forum I felt the same kind of depression as you did, but in the end I started to see the light: a very quiet card (still aircooled), built to last, no coil whine that I can hear in my open-bench setup, and still a very good performer in games, especially after flashing it with the Galax bios (which gave me an extra core clock bin of stability, so now it hits 3075-3090 in benches and games do not go lower than 3060). My 2c: keep it and enjoy life.


----------



## Panchovix

Guys, just as a reminder, how much hotspot above core temperature is normal (on air at least)? Is 18-20°C good, or is there something wrong with my mounting pressure? I can get for example 65°C core but 83-85°C hotspot. (TUF 4090)


----------



## Nd4spdvn

Panchovix said:


> Guys, just as reminder, how much hotspot above core temperature is normal? (On air at least), is 18°C-20°C good or there is something wrong with my mounting pressure? I can get for example 65°C core but 83-85°C hotspot. (TUF 4090)


I can tell you in my case (Suprim X 4090, on air) I've seen a variation of 4-12C, with the max of 12C in games stressing all the GPU sections (raster, tensor, RT, MC), like Cyberpunk. The minimum of 4C I've seen in raster-only scenarios like the Heaven bench, despite the core being loaded 100%. On my previous card, an EVGA FTW3 3080 Ti, I did not see more than a 7C delta.


----------



## yzonker

Panchovix said:


> Guys, just as reminder, how much hotspot above core temperature is normal? (On air at least), is 18°C-20°C good or there is something wrong with my mounting pressure? I can get for example 65°C core but 83-85°C hotspot. (TUF 4090)


Yea that's on the high side but not high enough to hurt anything IMO. Probably could use a repaste though.


----------



## Krzych04650

Panchovix said:


> Guys, just as reminder, how much hotspot above core temperature is normal? (On air at least), is 18°C-20°C good or there is something wrong with my mounting pressure? I can get for example 65°C core but 83-85°C hotspot. (TUF 4090)


It varies depending on load, but a 10-15C delta is expected; 20C is a tad on the high side but possible with a heavy OC and prolonged load. It doesn't really matter.
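If you want an actual number instead of eyeballing the overlay, here's a minimal sketch that computes the core-to-hotspot delta from a logged sensor CSV. The column names are placeholders, not what GPU-Z/HWiNFO necessarily writes, so rename them to match your own log:

```python
# Compute min/max/average core-to-hotspot delta from a sensor log.
# Column names below are assumptions; match them to your logger's output.
import csv, io

LOG = """gpu_temp,hotspot_temp
65,83
66,85
64,82
"""  # stand-in for open("gpu_log.csv").read()

def hotspot_deltas(csv_text):
    rows = csv.DictReader(io.StringIO(csv_text))
    deltas = [float(r["hotspot_temp"]) - float(r["gpu_temp"]) for r in rows]
    return min(deltas), max(deltas), sum(deltas) / len(deltas)

lo, hi, avg = hotspot_deltas(LOG)
print(f"delta min/max/avg: {lo:.0f}/{hi:.0f}/{avg:.1f} C")  # -> 18/19/18.3 C
```

A log over a long gaming session is more telling than a single reading, since the delta grows with sustained load.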


----------



## yzonker

KingEngineRevUp said:


> Again, a updated diver can undo a stable overclock OR even your house being colder can cause a unstable OC on the memory.
> 
> I'm sure there are people doing +1700 here done launch who probably don't have a problem.
> 
> I just don't think one person experiencing this means it's 100% fact right now. It just means it's a topic of interest to look out for, to see if more users experience the same or not.


Yea, I've been running +1600 to +1800 ever since I got the card. The only issue is when it's too cold. I've even had +1600 instantly black screen when I set it right after booting up the machine. I think room temp was only 19C or so.

I've just been using the AB hot keys to apply +1700 after starting a game.
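Afterburner can also be driven from the command line, so the hotkey step can be scripted: launching `MSIAfterburner.exe -ProfileN` applies saved profile N, per the commonly documented behavior. The install path and profile slot below are assumptions for illustration:

```python
# Sketch: apply a saved Afterburner profile (e.g. the +1700 one) from a script.
# Path and profile slot are placeholders for your own setup.
import subprocess

AB_EXE = r"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe"

def apply_profile(n, dry_run=True):
    """Return the command; actually launch it only when dry_run is False."""
    cmd = [AB_EXE, f"-Profile{n}"]
    if not dry_run:
        subprocess.Popen(cmd)  # Afterburner applies the profile and stays resident
    return cmd

cmd = apply_profile(3)  # e.g. run this once the game has reached the menu
```

You could wrap this in a small launcher that starts the game, sleeps past the boot-time cold-memory window, then pushes the profile.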


----------



## KingEngineRevUp

yzonker said:


> I've just been using the AB hot keys to apply +1700 after starting a game.


Yeah I need to do this too


----------



## zzztopzzz

Just like to add a few words about my recent 4090 installation. After scouring IE ads, Craigslist, and eBay since last summer, and more fruitless trips to my local Micro Center than I care to count, I finally was able to grab my 4090 of choice: an MSI RTX 4090 Suprim Liquid X. It was a little more than I wanted to invest, but I did get it at MSRP and not at some scalper's price. Although expensive, at least I don't have to deal with rounding up this and that for liquid cooling.

Fortunately, I have a new Lian Li Lancool III case. What a godsend this thing is for the builder, even though it is a little larger than the norm and slightly heavier. My old EVGA 3080 Ti took up about 2 1/2 slots. The Suprim is smaller than the 3080 Ti at slightly more than 1 slot and a little shorter. Once I had the card installed I was taken aback by the fact that I hadn't fully planned for the card's dual-fan radiator assembly. Fortunately, at the bottom of the case, near where the PSU is installed, there is a metal partition-like mount for 3 internal fans. This metal bracket is held to the chassis with just 2 small machine screws at either end and is easily accessible. After removing this mounting plate there was nothing more to do other than mount the Suprim's fan assembly to the existing mount holes using the 4 screws that came with the card. Upon reinstalling the fan bracket assembly, it looked like the MSI fan kit was part of the Lian Li case. The water hoses tucked nicely between the card and the opposite end of the mounting bracket, away from the fan assembly. Of course, one has to deal with the 12VHPWR power _connector_ kludge, but without bending the wiring I was able to secure the 2 water lines next to the power connector with a zip-tie.

Upon firing it up for the first time, it's always a welcome relief to see the ASRock boot logo pop up and finally get into the desktop. I watched the card's fan assembly for a good while, and the fans never came on. All of the wiring is integrated into the card itself and there are no external connections. I went ahead and ran Time Spy, and after a minute or so the fans finally kicked in. The hoses were only slightly warm and the power connector never got warm - nothing was ever hot to the touch. I haven't noticed any coil whine, and all 4 case fans and the dual card fans are as quiet as can be.

Nice build if I do say so myself.


----------



## Panchovix

Nd4spdvn said:


> Can tell you in my case, Suprim X 4090, on air, I've seen a variation of 4-12C max, with max of 12C seen in games stressing the entire GPU sections (raster, tensor, RT, MC), like Cyberpunk. Minimum of 4C I've seen in raster only scenarios like Heaven bench despite the core being loaded 100%. On my previous card, EVGA FTW3 3080ti did not see more than 7C delta.





yzonker said:


> Yea that's on the high side but not high enough to hurt anything IMO. Probably could use a repaste though.





Krzych04650 said:


> It varies depending on load but 10 to 15C delta is expected, 20C is a tad on the high side but possible with heavy OC and prolonged load. It doesn't matter really.


Thanks guys. Sadly I've already repasted and it's still the same, though way better than the stock 25-30°C delta I had. I bought a Carbonaut pad and tomorrow I will see how it goes - the last thing to try before watercooling, I guess.

I can't do an RMA, since opening the card voids the warranty here in Chile (and I got denied when I tried before opening it, because the temps were "fine"), so my only options are to keep testing or change to watercooling. I suspect my cooler is faulty.


----------



## man from atlantis

https://www.reddit.com/r/nvidia/comments/100gxul


----------



## lawson67

Panchovix said:


> Thanks guys, sadly I've already repasted and still the same, though way better than the stock 25-30°C delta I had. Bought a carbonaut pad and tomorrow will see how it goes, last thing to do before watercooling I guess.
> 
> Can't do RMA since opening the card voids warranty here in Chile (and got denied when I try before opening because the temps were "fine"), so my only ways is to keep testing or change to watercooling. I suspect my cooler is faulty.


Just put some LM on it, then you don't need to worry about repasting it again, as it never dries up. I've always used LM for the last 4 years on my cards; I put LM on my Zotac AMP Extreme and my hotspot is never 10C over my edge temp. I've opened all my cards, some I've had to RMA, and never had any problems. If there are any security stickers, just use a hairdryer and a very small screwdriver to lift them off, then put them back on before the RMA. I even take a picture of the stickers' orientation before I take them off so I put them back in exactly the right place lol


----------



## KingEngineRevUp

Panchovix said:


> Thanks guys, sadly I've already repasted and still the same, though way better than the stock 25-30°C delta I had. Bought a carbonaut pad and tomorrow will see how it goes, last thing to do before watercooling I guess.
> 
> Can't do RMA since opening the card voids warranty here in Chile (and got denied when I try before opening because the temps were "fine"), so my only ways is to keep testing or change to watercooling. I suspect my cooler is faulty.


Before you do any of that, run a benchmark on the card for 10-15 minutes to get it warm. Take it out, then try torquing down all of the screws more, especially the 4 around the GPU die.

I bet you can compress the pads better and get a little better contact.


----------



## Panchovix

lawson67 said:


> Just put some LM on it then you don't need to worry about repasting again it as it never dries up, ive always used Lm for the last 4 years on my cards and i put LM on my Zotac Amp Extreme and my Hotspot is never 10c over my Edge temp, ive opened all my cards some ive had to RMA but never had any problems, if there any security stickers just use a hairdryer and a very small screwdriver and lift them off then put them back on before RMA, i even take a picture of the stickers orientation before i take them off so i put them back exactly in the right place lol


I'm not sure if I can use bare LM on my 4090 TUF, since it can oxidize (if I'm not wrong). I think the 4090 TUF cooler is aluminum, so that may happen; if it were copper there wouldn't be issues. (Correct me if I'm wrong though.)



KingEngineRevUp said:


> Before you do any of that, run a benchmark on the card for 10-15 minutes, get it warm. Get it out, then try torquing down all of the screws more, specially the 4 around the GPU die.
> 
> I bet you can compress the pads better and get a little better contact.


Thanks, I've already opened and closed the card, but it doesn't hurt to try again to be honest.


----------



## Miao

Mozart23 said:


> Thank you. I know I must sound like an idiot. It's just the card was pretty expensive and I wanted it to be perfect in every way, because I want to keep that card for some time. I have to stop browsing these forums for some time. It's like everybody can hit 1800mem and over 3100MHz on core and it makes me feel like I lost the lottery hard, which I would not actually mind if the card was not that expensive.


No, not at all - you just look at some of the best silicon in this forum and focus only on the best examples, ignoring the rest. It's human.

My 4090 does 3060/+1200 and my 13900K has a 97 SP rating - what should I do, suicide?

However, if it's just for benching, you can't set any real records at above-ambient temps anyway, so... keep it 
And then always remember that you could have found an even worse one


----------



## GforceGTXer

Anyone else notice that the Port Royal stress test always crashes when overclocking this card, but any other game or stress test I run with higher clocks is perfectly fine and always passes? I even did the Speed Way stress test and FurMark without issue, and those use way more juice.


----------



## KingEngineRevUp

Panchovix said:


> I'm not sure if I can use bare LM on my 4090 TUF, since it can oxidize? (If I'm not wrong). I think the 4090 TUF cooler is aluminum so that may happen, but if it was cooper there wouldn't be issues. (Correct me if I'm wrong though)
> 
> 
> Thanks, I've already opened and closed the card, but it doesn't hurt to try again to be honest.


For this you don't need to open the card up. Just warm up the thermal pads and see if you can torque the screws down tighter. At least it's an easy thing to try.


----------



## elbramso

lawson67 said:


> Just put some LM on it then you don't need to worry about repasting again it as it never dries up, ive always used Lm for the last 4 years on my cards and i put LM on my Zotac Amp Extreme and my Hotspot is never 10c over my Edge temp, ive opened all my cards some ive had to RMA but never had any problems, if there any security stickers just use a hairdryer and a very small screwdriver and lift them off then put them back on before RMA, i even take a picture of the stickers orientation before i take them off so i put them back exactly in the right place lol


That is actually bad advice if u ask me. LM needs a bit more preparation than just putting it on the die and you're good to go^^ Furthermore, LM won't suddenly make bad die contact good...
I'm not saying LM is bad, but it's a total waste on an air-cooled card tbh. And don't think that Asus (or any other company) won't find out u opened the card just because the sticker is ok... they know which paste they used ;-)


----------



## leonman44

GforceGTXer said:


> Anyone else notice that port royal stress tests always crash when overclocking this card, but any other game or stress test I run with higher clocks is perfectly fine and always passes? I even did the speedway stress test and furmark without issue and those use way more juice.


For me it’s the other way around: I can run 3000 on core in Port Royal, and a higher mem frequency, but in God of War I had to decrease to 2955. And it's not only this card - with every card I've owned I could bench higher than I could game; there are games out there that are more stressful than any benchmark. Even if you only crash in a game every 2 hours or so, you are still unstable.


----------



## J7SC

Per my earlier post re. rendering performance, I finally integrated the 3090 Strix into the older (Dec 2018) Threadripper build that already has 2x 2080 Ti, to compare it to the RTX 4090. For now, the single 4090 is just a little faster (but not by much) in OctaneBench compared to the twins and their new 3090 pal. Once I figure out how the heat and power hold up (dual 1300W HPC PSUs, 3x D5s, etc.), I can OC the 3x older-gen cards as well... 1700W+, I reckon. All that said, it also shows how much progress the Ada Lovelace RTX 4090 actually represents, given its power habits vs. performance - including in work apps.


----------



## leonman44

I just tested to see:

1) if i can get more mem oc with warmer vram temps
2) if the card will crash on desktop when the vram gets cold while still keeping that highest benchable vram frequency.










Highest temps during stress testing + very low rpm: 70C gpu, 82.8C hotspot, 76C memory temp.

The card gained nothing. 1500mhz is what I can do stable in games, 1550mhz in benchmarks; at 1575 it crashed.

Then I kept the 1550mhz on the desktop and blasted the cooler at 100% to cool the vram as much as possible. 26C is as low as it got - well into the known <30C desktop crash territory - but still nothing happened.

Therefore it seems that not all cards are affected by this; mine, at least, is not.


----------



## andreiga76

motivman said:


> flash galax 666w bios and call it a day once you get in on a waterblock. Not sure how the galax bios plays with the gigabyte stock fans.


I have the Gigabyte Master, and the Galax bios does not work well with the fans: they cannot get below 30% at idle and are very loud - 1500 rpm, which is too much for these fans. Otherwise, the Galax bios is much better than the Master bios at the same power level.


----------



## 2080tiowner

Is it possible to flash the Galax bios on a Gainward RTX 4090 Phantom? Thanks!


----------



## xAD3r1ty

From my own experience, my Gainward 4090 can't even hit 3000mhz, so I guess I lost the core lottery - it starts crashing around 2900+. I wanted to undervolt anyway, so I'm at 2715mhz at 975mv, and 2460mhz at 875mv for really low power. At least my memory is stable at +1650. I can game and bench at +1900 and the scores surprisingly increase, but memtest_vulkan throws errors at anything above +1675 - so don't forget to check stability even if you're only going to game.
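Rather than guess-and-check offsets one at a time, the same search can be done as a bisection over "passes memtest_vulkan". The `is_stable` stub below is where you'd actually launch the test at that offset; here it just models a card that errors above +1675, like the one described:

```python
# Bisect for the highest error-free memory offset on a 25 MHz grid.
# is_stable() is a stand-in for "run memtest_vulkan at this offset and check".
def max_stable_offset(lo, hi, is_stable, step=25):
    """Highest multiple-of-step offset in [lo, hi] for which is_stable() holds.
    Falls back to lo (i.e. stock, assumed stable) if nothing passes."""
    best = lo
    while lo <= hi:
        mid = (lo + hi) // 2 // step * step  # snap the midpoint to the grid
        if is_stable(mid):
            best, lo = mid, mid + step       # stable: search higher
        else:
            hi = mid - step                  # errors: search lower
    return best

result = max_stable_offset(0, 1900, lambda off: off <= 1675)
print(result)  # -> 1675
```

About eight test runs cover a 0-1900 range at 25 MHz resolution, versus dozens if you walk down linearly.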


----------



## xAD3r1ty

2080tiowner said:


> is it possible to flash Galax bios on a Gainward rtx 4090 phantom ? thanks !


I wouldn't bother; these aren't made to run at or near 600w. I think they share the same pcb as the Palit one.


----------



## jediblr

andreiga76 said:


> I have the Gigabyte Master and Galax bios is not working fine with the fans, cannot get below 30% at idle and fans are very loud


same thing on my MSI 4090, so its not GB only


----------



## 2080tiowner

xAD3r1ty said:


> I wouldn't bother, these aren't made to run at or near 600w, i think they share same pcb with the palit one


Yes, it's the same PCB as the Palit GameRock ;-)


----------



## Krzych04650

2080tiowner said:


> is it possible to flash Galax bios on a Gainward rtx 4090 phantom ? thanks !


You can flash any BIOS to any custom card this time around. The only potential concern is with Asus, since it has an additional HDMI port.


----------



## lawson67

elbramso said:


> That is actually a bad advice if u ask me. LM needs a bit more preparation than just put it on the DIE and ur good to go^^ Further LM wouldn't make a bad DIE contact good out of sudden...
> I'm not saying LM is bad but it's a total waste on an air-cooled card tbh. And don't thin that Asus (or any other company) would not find out u opened the card just because the sticker is ok... they know which paste they used ;-)


I believe people on this forum would know the dangers of not protecting the semiconductors around the GPU die before applying LM; I use three coats of clear nail polish over them before applying it, and I never said just bang LM on and you're good to go. I presumed, rightly or wrongly, that he would have done some research before using it. As for Asus etc. not knowing you had taken the card apart: all I can say is that I've RMA'd two Asus cards in the past after removing them from water blocks, doing a stellar job of putting back the correct-size thermal pads (or the originals, if they didn't tear when removing the air cooler), repasting the GPU, then reinstalling the air cooler, and I never had any problems with the RMA. Yes, taking a card apart is a risk you take, but many will have to do it eventually, as thermal paste will soon dry up pushing 450-600 W through these cards; users on this forum were reporting exactly that within weeks of owning their 4090s. That's why I always use LM: it has significantly better heat transfer and does not dry up.


----------



## lawson67

Panchovix said:


> I'm not sure if I can use bare LM on my 4090 TUF, since it can oxidize? (If I'm not wrong). I think the 4090 TUF cooler is aluminum so that may happen, but if it was cooper there wouldn't be issues. (Correct me if I'm wrong though)
> 
> 
> Thanks, I've already opened and closed the card, but it doesn't hurt to try again to be honest.


Of course LM won't help if you have bad mounting issues; I didn't read your post properly, apologies. And *don't use liquid metal with aluminum*: it will embrittle the aluminum and form an alloy with the cold plate. I try to buy cards that have a nickel-plated copper cold plate / vapor chamber in anticipation of using LM.

Edit: I've just looked at some reviews of the Asus 4090 TUF and it does indeed use a nickel-plated vapor chamber, so LM should not be a problem if you can sort out the mounting issues first. But please look on YouTube for how to correctly apply LM if you go that route; der8auer, I believe, has a good video showing how.


----------



## man from atlantis

How long do you guys run memtest_vulkan? Is the standard 5-minute test enough, or do you leave it running for minutes/hours?


----------



## 7empe

man from atlantis said:


> How long do you guys run memtest vulkan? Is standard 5-minute test enough or leave it for XX minutes/hours?


BTW, how does it avoid running into ECC?


----------



## 7empe

7empe said:


> btw. how does it avoid running into ecc?


Nvm, it's obvious when ECC kicks in because of the bandwidth drop.
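That heuristic is easy to script if you log bandwidth samples. A minimal sketch (my own illustration; `ecc_engaged`, the baseline figure, and the 10% cutoff are assumptions, not anything memtest_vulkan reports directly):

```python
# Flag likely ECC engagement from a bandwidth sample.
# Heuristic: when on-die error correction kicks in, measured copy
# bandwidth drops noticeably below the card's known-good baseline.

def ecc_engaged(sample_gbps: float, baseline_gbps: float,
                drop_threshold: float = 0.10) -> bool:
    """True if the sample sits more than drop_threshold (as a
    fraction) below baseline, suggesting ECC retries."""
    return sample_gbps < baseline_gbps * (1.0 - drop_threshold)

# Assumed baseline of ~1000 GB/s for a 4090-class card:
print(ecc_engaged(980.0, 1000.0))  # 2% off baseline  -> False
print(ecc_engaged(850.0, 1000.0))  # 15% off baseline -> True
```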


----------



## 2080tiowner

Krzych04650 said:


> You can flash any BIOS to any card this time around. Only concern could potentially be with Asus since it has additional HDMI port.


Perfect, thanks !


----------



## Xdrqgol

Krzych04650 said:


> You can flash any BIOS to any card this time around. Only concern could potentially be with Asus since it has additional HDMI port.


Hmm - FE cannot be flashed …right? Or something changed?


----------



## Krzych04650

Xdrqgol said:


> Hmm - FE cannot be flashed …right? Or something changed?


The FE hasn't been flashable for generations now; that goes without saying. But I've added emphasis on "custom" in the post just to be clear.


----------



## man from atlantis

+1950MHz MCLK, no error over 10mins


----------



## sanchitdang

Hi, does anyone have the BIOS from the Gigabyte 4090 Xtreme Waterforce rev 1.1?


----------



## rahkmae

One hour of playing ED in VR with the Pimax 8KX; room temp was 24°C when I started, now 26°C. These aren't normal temps, I think?


----------



## yzonker

rahkmae said:


> View attachment 2591992
> 
> One hour played ED VR with Pimax 8KX, room temp was 24 when started now 26, this is not normal temp I thing?


Yea same problem as others. Hotspot is too high.


----------



## yzonker

And no 4090 Ti /Titan announced this morning. Guess everyone will have to struggle along with their 4090's for now. Probably next chance for anything above the 4090 is this summer.


----------



## rahkmae

Memory is OK; does this mean bad contact with the GPU?


----------



## alasdairvfr

J7SC said:


> Per earlier post re. rendering performance, I finally integrated the 3090 Strix into the older (Dec 2018) Threadripper build that already has 2x 2080 Ti to compare it to the RTX 4090. For now, the single 4090 is just a little faster (but not by much) in Octane bench compared to the twins and their new 3090 pal. Once I figure out how the heat and power holds up (dual 1300W HPC PSUs, 3x D5s etc), I can oc the 3x 'older gen' as well...1700W+, I reckon . All that said, it also does show how much progress the AdaL RTX4K actually represents given its power habits vs performance - including in work apps
> 
> View attachment 2591935


I did 2 runs while _really_ multitasking, in the middle of work. They are pretty close, but the higher score came from the _lower OC_, so it's probably my zillion or so other running things affecting it.

+150 core, 1200 mem

+255 core, 1500 mem


----------



## yzonker

rahkmae said:


> Memory is ok, this mean bad contact with gpu?


Most likely yes. Repaste would hopefully help unless there's some other issue like the cooler not being flat.
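For anyone comparing their own numbers: the usual tell is the hotspot-to-edge delta rather than the absolute temperature. A quick sketch of that check (the ~20°C cutoff is a commonly repeated rule of thumb, not an NVIDIA spec, and `mount_suspect` is just an illustrative name):

```python
def mount_suspect(edge_c: float, hotspot_c: float,
                  max_delta_c: float = 20.0) -> bool:
    """Flag a likely paste/mount problem when the hotspot runs more
    than max_delta_c above the edge (core) temperature."""
    return (hotspot_c - edge_c) > max_delta_c

print(mount_suspect(70.0, 83.0))  # 13°C delta -> False (looks normal)
print(mount_suspect(60.0, 95.0))  # 35°C delta -> True (repaste/remount)
```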


----------



## alasdairvfr

Here is some info on OctaneBench, with a distribution curve of about 270 bench samples:

I was hitting 1340; I'll try to push it a little harder when I have less stuff running on the system.


----------



## KingEngineRevUp

yzonker said:


> And no 4090 Ti /Titan announced this morning. Guess everyone will have to struggle along with their 4090's for now. Probably next chance for anything above the 4090 is this summer.


I imagine the 4090 Ti will be the last card they release in the 40 series. So the 4050 through 4070 Ti and the 4080 Ti will all be released first. NVIDIA will probably bin some 4090 chips every now and then and save them to turn into 4090 Tis. Finally, they'll release the 4090 Ti six months before the 5080 and 5090, in a last-ditch effort to get rid of as many 4090 dies as quickly as possible.


----------



## STR3T

rahkmae said:


> View attachment 2591992
> 
> One hour played ED VR with Pimax 8KX, room temp was 24 when started now 26, this is not normal temp I thing?


Thinking max fan 38% at those temps is the problem. Not sure why fan speed wouldn't be ramped up to at/near 100%.


----------



## rahkmae

STR3T said:


> Thinking max fan 38% at those temps is the problem. Not sure why fan speed wouldn't be ramped up to at/near 100%.


The card is water-cooled with an EK Asus Strix block.


----------



## STR3T

Ahhh. Sure that pump's pumping?


----------



## Panchovix

lawson67 said:


> Of course LM wont help if you have bad mounting issues i didn't read your post properly , apologies, and *don't use liquid metal with aluminum* – it will embrittle the aluminum and form an alloy with the coldplate , i try to buy cards that have a nickel plated copper cold plate / vaper chamber in anticipation of using LM
> 
> Edit :- ive just looked at some reviews of the Asus 4090 TUF and it does indeed use a nickel plated Vaper chamber so using LM should not be a problem if you can sort out the mounting issues before hand, but please look on you tube on how to correctly apply LM if you go that route, der8auer i belive has a good you tube video showing how to apply LM


Thanks man, gonna check how it goes with the pad (still hasn't arrived yet though lol), and if it's worse, I'll consider LM. (But I guess it's probably safer to go water cooling with just paste, right?)

Here are some temps I got while running AI workloads overnight at 374 W (training some ML models); I can already get near 80°C on the hotspot with high fan speed. My cooler is definitely not good / faulty. (Ambient temp is about 32°C / 89.6°F.)


----------



## J7SC

rahkmae said:


> card is water-cooled EK asus strix block





STR3T said:


> Ahhh. Sure that pump's pumping?


...something going on that shouldn't be with a full water block... Mount issue? Not enough rad space? As mentioned already, pump issues?



alasdairvfr said:


> I did 2 runs while _really_ multitasking, middle of work. They are pretty close but the higher score was the _lower OC_ so probably my zillion or so other running things affecting
> 
> View attachment 2591993
> 
> +150 core, 1200 mem
> 
> 
> View attachment 2591994
> 
> +255 core, 1500 mem


Nice ! OctaneBench is great fun if you're into rendering, isn't it ? 100% GPU usage, lots of heat... The 'threes' are now well deep into the 1500 score range while the 4090 'lingers' somewhere in the mid to high 1400s. More to come once I finished testing with full oc on both setups. That said, check the folks with *8x 4090*s below - and they're positively conservative when compared to the builds used for the top results...but will they play Crysis ?


----------



## Brads3cents

Super disappointed with CES and the complete lack of 4K OLEDs.

I guess I will buy the new 49" QD-OLED, since it has the new Samsung panel with a supposed 2000 nits peak brightness and a new processor.

The LG C3 is a complete disappointment: basically the same panel as last year's C2, with no brightness booster and no new sizes.
Additionally, it's still only 120 Hz.

I figure 240 Hz > 120 Hz,
same PPI basically,
better OLED panel overall,
and width > height works better for our eyes.

The only issue is the insane price for something that is effectively going to be a 1440p display. I can't believe that after 4 years of gaming at 4K I'm considering a downgrade.


----------



## alasdairvfr

J7SC said:


> Nice ! OctaneBench is great fun if you're into rendering, isn't it ? 100% GPU usage, lots of heat... The 'threes' are now well deep into the 1500 score range while the 4090 'lingers' somewhere in the mid to high 1400s. More to come once I finished testing with full oc on both setups. That said, check the folks with *8x 4090*s below - and they're positively conservative when compared to the builds used for the top results...but will they play Crysis ?
> View attachment 2592000


I suspect anyone running 8x 4090s will have to dial the power back a bit; power delivery for 8x 450 W cards plus heat dissipation would be a huge challenge!

Managed to hit 1471, topping the charts till others take the challenge

(forgot to screenshot, here's the csv instead)

#1 single card on the chart (for now)


----------



## dboom

man from atlantis said:


> +1950MHz MCLK, no error over 10mins
> View attachment 2591978


Run some heavy RT; mine crashes even when memtest is fine for an hour.
This memtest is great for the 3090.


----------



## J7SC

alasdairvfr said:


> I suspect anyone running 8x 4090s will have to dial in power a bit, power delivery for 8x 450w cards plus heat dissipation would be a huge challenge!
> 
> Managed to hit 1471, topping the charts till others take the challenge
> 
> (forgot to screenshot, here's the csv instead)
> View attachment 2592003
> 
> 
> View attachment 2592004
> 
> 
> #1 single card on the chart (for now)


...very noice ! 
FYI, I am not posting results yet on purpose - it often seems to derail technical discussions.

Cooling multi-GPUs running at 100% usage is s.th. I learned during my HWBot years. Fortunately, I still have a lot of cooling equipment around from those years gone past. On the PSU front, the Antec HPC 1300W have a nifty 'OC link' which quickly connects two of them together for up to 2600W. That said, running 20+ high-po GPUs like some of the folks in Octane is a whole different ballgame - probably an industrial setup.


----------



## STR3T

Been running a 4090 FE for about a month now. I like it, though I was initially going for a TUF until reading those have a high incidence of coil whine. Correction: I actually *wanted an EVGA!*

That said, I sourced a Gigabyte Windforce by chance.

The FE at 1.1 V, 133% PL, +215 core, +1700 mem on an 11700K (I know!) and 32 GB of 3600 DDR4:
Port Royal best out of ~7 runs scored 27,130, good for 50th out of the top 100 with the same 11700K today. That's the hardest I've pushed it; 100% fan kept temps at 62°C/72°C.

I'm not looking to hit the top 20; as I said, I've been pretty darn happy with the FE. Looks OK, although it's the "same" as the 3080/3090: very little whine if any on her, modest white RGB lighting. At 100% fan it does get a bit loud.

I thought the AIBs, even the base Gigabyte with 3 fans and the option of a BIOS update, would be the most in demand. But looking at sales on eBay, the FE sells for a decent amount more on average. Is that simply NVIDIA taking the best silicon for the FEs overall, or...??

Given the FE's performance above, is it even worth opening up and testing the Windforce?

Will probably _limp along_ with the 11700K for a year or so until the next round of CPUs comes out, then figure out which way to go. I'm not playing any games currently that any 4090 wouldn't handle exceedingly well.

Next purchase will be a 4K monitor and/or the CPU upgrade. I'm holding out for a ~27"-28" version of the Alienware OLED, or something else solid across the board including HDR10+, at a solid price point.

So yeah, the 4090 is more than enough currently, even "stock", with the 11700K and a QHD monitor (Dell S2721DGF; also just got a 27" NZXT QHD but haven't messed with it much yet). But long term, I'm wondering if I should take a harder look at the Windforce.


----------



## jnagel777

motivman said:


> I beg to differ. 600W bios on my 4090 trio, card draws over 570W using 3X8 cable
> 
> View attachment 2579319


which bios did u flash to get the 600 w results?


----------



## motivman

jnagel777 said:


> which bios did u flash to get the 600 w results?


MSI suprim X 600W bios.


----------



## man from atlantis

dboom said:


> Run some heavy RT, mine is crashing even if memtest is fine for 1h.
> This memtest is great for 3090.


Yeah, it seems memtest_vulkan can't punish the card enough: it passed even at +2000, but I already had artifacts in a Metro EE benchmark run. Maybe ethash mining would be the ultimate test, but I'm not gonna try it. A nice round +1500 has passed my Quake II RTX stress test and I'm OK with it.
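That matches the pattern in this thread: a single passing test proves little. If you script your testing, it's worth treating an offset as stable only when every workload passes; a sketch with made-up workload names and results:

```python
# An OC offset only counts as "stable" if it survives every workload.
# The workload names and pass/fail values below are placeholders; in
# practice each entry would come from actually running that test.

def offset_stable(results: dict[str, bool]) -> bool:
    """results maps workload name -> whether it passed."""
    return all(results.values())

runs = {
    "memtest_vulkan_1h": True,   # passes even at an unstable offset
    "metro_ee_bench":    False,  # artifacts -> fail
    "quake2_rtx_loop":   True,
}
print(offset_stable(runs))  # one failure -> False, back the offset off
```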


----------



## Xdrqgol

STR3T said:


> Been running a 4090 FE for ~month now. I like it though was initially going for a TUF but reading those are high incident for coil whine. Correction, actually *wanted an EVGA!*
> 
> That said, sourced a Gigabyte Windforce by chance.
> 
> The FE at 1.1mV, 133% PL, +215 Core, +1700 Mem on a 11700k (I know!) and 32GB 3600 DDR4:
> Port Royal best out of ~7 runs scored 27,130. So 50th out of top 100 w/ same 11700k today. That's the hardest I've pushed it...100% fan kept temps at 62c/72c.
> 
> I'm not looking to hit Top 20, as I said, been/am pretty darn happy w/ the FE. Looks ok altho "same" as 3080/3090...very little whine if any on her, modest white RGB lighting. At 100% fan does get a bit loud
> 
> I thought the AIB's, even the base Gigabyte with 3-fans and which could get a BIOS update would be the most 'in demand'. But looking at sales on eBay, the FE sells for a decent amount more on average. Is that simply nVidia taking best silicon for the FE's overall or...??
> 
> Given the FE's performance above...is it even worth it to open up/test out the Windforce?
> 
> Will probably _limp along_ w/ the 11700k for a year or so until next round of CPU's come out...then figure out which way to go. I'm not playing any games currently that any 4090 wouldn't do exceedingly well on.
> 
> Next purchase will be 4k monitor and/or the CPU upgrade. I'm holding out for a ~27" - 28" version of the Alienware OLED or something else solid across the board including10+ HDR but at a solid price point.
> 
> So yeah, 4090 more than enough currently even "stock" w/ 11700k and 2k monitor (Dell S2721DGF and also just got a 27" NZXT QHD but haven't messed w/ it much yet). But long term, wondering if I should take a harder look at the Windforce.


I get 28,780 in Port Royal with my FE…

This FE is one of the best cards Nvidia has ever released! I would not even look at anything else; I'm not sure why you mention the Windforce for the long term when the FE is a great product overall…!?


----------



## STR3T

Xdrqgol said:


> This FE edition is one of the best Nvidia ever released! I would not even look at anything else, not sure why you mention Windforce for long term when this FE is a great product overall…. !?


Well, 1. I got the Windforce 10% off, whereas the FE was full MSRP, and 2. all the 4090s seem within spitting distance performance-wise, and generally speaking I'm thinking the 3-fan coolers will do better long-term (no proof whatsoever). I mean, they do all seem to run at really solid temps.

Other than that, it's a toss-up on looks/aesthetics, at least to me. Seemed like, given the 400+ pages here, there'd be some opinions either way, and I'm open to listening.

I'm not a Giga fan (I know they have their own issues), but I think I can get 1+ year by registering the GPU with them? So those are the pros/cons I know of; I was wanting to hear others' thoughts as well. Not much on YouTube or even Google search specific to the Windforce that I've seen; it's mostly centered on the OC and Aorus models.


----------



## Xdrqgol

STR3T said:


> Well, 1. I got the Windforce 10% off , whereas the FE was full MSRP, and 2. Seems all the 4090's are within spitting distance performance wise and generally speaking, I'm thinking the 3x fan coolers will do better long-term (no proof whatsoever). I mean, they do all seem to operate at really solid temps.
> 
> Other than that, it's a toss-up on looks/asthetics, at least to me. Seemed like given the 400+ pages here, there'd be some opinions either way and I'm open to listening.
> 
> I'm not a Giga fan, I know they have their own issues...but think I can get 1+ year by registering that GPU w/ them? So pro's/con's that I know of, was wanting to hear other's thoughts as well. Not much on youtube or even Google search specific to the Windforce that I've seen...mostly centered on the OC and Aorus models.


Well, 10% off is not much considering you have a card running +200 core and +1700 mem; the majority of people here cannot reach +1700 mem.
Coming back to the long term: the FE cooler (even if it looks the same as the 3000 series', IT IS NOT THE SAME) is one of the best ever made by Nvidia. It runs quiet and cool.

I would not trade it, if I were you, for another card where:

- you might not reach +200 core / +1700 mem (probability pretty high)
- coil whine, like crazy

If it were any past generation I would not have commented, but this generation the FE is very, very good!

But I guess in the end, "whatever floats your boat".


----------



## keikei

RTX Video Super Resolution Launches Next Month with 4K Upscaling of Footage Viewed on Chrome or Edge Browsers


NVIDIA announced RTX Video Super Resolution, a new technology that allows GeForce RTX 40 & 30 GPUs to upscale video content through AI.




wccftech.com


----------



## EarlZ

Has anyone had the chance to compare Arctic's new MX6 on the GPU core vs say Thermalright's TFX paste?


----------



## tubs2x4

newegg.ca has 4090s in stock right now if any cad people are interested...


----------



## bmagnien

Dlss3 improvements coming: NVIDIA's FPS-Increasing DLSS 3 Tech Is About To Get Even Better, Major Improvements To Image Quality In Games

Funny that they’re now promoting an ‘improved’ version of DLSS3 in CP2077 compared to the original version of DLSS3 in CP2077, which hasn’t even been released or given a ballpark release date since being announced at the 4000-series launch (in addition to the beyond-Psycho RT settings). CDPR can’t stop tripping over themselves; I'm still waiting on the Witcher 3 update to become playable as well. If I were Nvidia, I’d hitch my wagon to another dev to showcase my new tech.


----------



## jediblr

motivman said:


> MSI suprim X 600W bios.


That's the SUPRIM LIQUID X BIOS, not the SUPRIM X one.


----------



## yzonker

bmagnien said:


> Dlss3 improvements coming: NVIDIA's FPS-Increasing DLSS 3 Tech Is About To Get Even Better, Major Improvements To Image Quality In Games
> 
> Funny that they’re now promoting an ‘improved’ version of dlss3 in cp2077 in comparison to the original version of dlss3 in cp2077, which hasn’t even been released or given a ballpark release date ever since being announced at the release of the 4000 series (in addition to the beyond psycho RT settings). CDPR can’t stop tripping over themselves, still waiting on the Witcher 3 update to become playable as well. If I was Nvidia I’d hitch my wagon to another dev to showcase my new tech.


What's broken for you in Witcher? Seems to run really smooth with frame gen. Terrible without it though.


----------



## KingEngineRevUp

EarlZ said:


> Has anyone had the chance to compare Arctic's new MX6 on the GPU core vs say Thermalright's TFX paste?


I'm using MX6. I probably would have gotten 1-2°C better with Kryonaut, but the reason I'm not using Kryonaut this gen is that I've noticed it gets dried up and crusty in less than a year.

I'm hoping MX6 lasts as long as Noctua and MX4.

Besides, 1-2°C doesn't make or break it as much as it did on the 30 series.


----------



## EarlZ

KingEngineRevUp said:


> I'm using MX6, i probably would have gotten 1-2C better with Kryonaut, but the reason I'm not using Kryonaut this gen is because I've noticed it gets dried up and crusty in less than a year.
> 
> I'm hoping MX6 lasts as long as Noctua and MX4.
> 
> Besides, 1-2C doesn't make it or break it asuch as it did the 30 series.


I can definitely attest to Kryonaut drying up. I repasted my 3090, which typically ran around 68-71°C depending on the game, and in less than a year I noticed the temperatures had spiked and the paste had totally dried. I was surprised, as a lot of feedback said it could handle 80°C in and out.


----------



## J7SC

I used Kryonaut for a few builds in '21 but also noticed that it dries out fairly easily. My normal go-to is Gelid GC Extreme; it's a bit thicker (helpful with some die-surface / cold-plate issues) and stays moist for years. Temps are within 1°C of fresh Kryonaut and remain great even after years. I also like MX4/5 but haven't tried MX6 yet. FYI, Kryonaut is supposed to have a new formula mix afaik, but I haven't tried that yet either.


----------



## KingEngineRevUp

EarlZ said:


> I can definitely attest to the Kryonaut being dried up. I replaced the paste on my 3090 which typically ran around 68-71c depending on the game and in less than a year I noticed the temperatures have spiked and the paste has totally dried. Was surprised as a lot of feedback said that it can handle 80c in and out.


Yeah, unfortunately I used it on my nephew's PC, and his CPU temperatures have climbed to 95C and it's thermal throttling. I'll have to repaste his computer now.


----------



## Blameless

man from atlantis said:


> How long do you guys run memtest vulkan? Is standard 5-minute test enough or leave it for XX minutes/hours?


Several hours. I'd also recommend testing with fans/pump adjusted to get the memory to both temperature extremes.

Since it doesn't load the GPU particularly heavily, it's possible to get a false sense of stability when testing with just memtest_vulkan on one's normal F/V curve.


----------



## bmagnien

yzonker said:


> What's broken for you in Witcher? Seems to run really smooth with frame gen. Terrible without it though.


It stutters super badly coming out of menus back into the game (from map, inventory, settings), degrades over time (memory leak), crashes randomly, and generally has poor CPU thread optimization. Sometimes I'll boot it up and it'll run flawlessly, sometimes it's a complete mess. Too inconsistent to enjoy in its current state, IMO.


----------



## yannick8403

Hey guys, first post over here; I'll try my best with English as a Brusselois from Brussels who speaks only Bruxellois. I am about to go pick up a Phantom (non-GS) for €1800 (well, I just have to go grab it from a locker down the street). But… I am having buyer's remorse!
I will play (and only play) on a 4K FV43U, and the card cost me €1860 versus around €2200 for the other cards.
Is it a good choice? I can still change, but is it worth it? Coming from a 10900K & RTX 3090.

It will be paired with a 1200 W Corsair Platinum, a 13700K, and DDR5 @ 6000 CL30.

Regards


----------



## Sheyster

KingEngineRevUp said:


> I'm using MX6, i probably would have gotten 1-2C better with Kryonaut, but the reason I'm not using Kryonaut this gen is because I've noticed it gets dried up and crusty in less than a year.
> 
> I'm hoping MX6 lasts as long as Noctua and MX4.
> 
> Besides, 1-2C doesn't make it or break it asuch as it did the 30 series.





EarlZ said:


> I can definitely attest to the Kryonaut being dried up. I replaced the paste on my 3090 which typically ran around 68-71c depending on the game and in less than a year I noticed the temperatures have spiked and the paste has totally dried. Was surprised as a lot of feedback said that it can handle 80c in and out.





J7SC said:


> I have used Kryonaut for a few builds in '21 but also noticed that it dries out fairly easily. My normal go-to is Gelid GC Extreme - it's a bit thicker (helpful w/some die surface / cold-plate issues) and stays moist for years. Temps are within 1 C of fresh Kryonaut and remain great even after years. I also like MX4,5 but haven tried MX6 yet. FYI, Kryonaut is supposed to have a new formula mix afaik, but haven't tried that yet, either.


I used Kryonaut up until about two years ago. I also noticed it would dry out fairly quickly. I switched to Thermalright TF8 and TFX and never looked back; temps were virtually the same, but these two lasted much longer. Thermalright recently introduced TF9 paste, which I'll probably try out. It's supposed to be easier to spread than TFX.


----------



## sugi0lover

sugi0lover said:


> Doojin did some amazing 3DMark jobs with his 13900KF
> 
> 13900KF: SP119, P127, E101, MC88~89
> OC: P cores 6300~6400 MHz, RAM 8800~8868 MT
> 4090 Strix OC @ Galax HOF BIOS / Mora420 out on balcony + liquid thermal / coolant temp 7°C
> 
> Scores (3dmark.com):
> Fire Strike: 74,970
> Fire Strike GS 100K run (P 8 cores, HT & E cores off): 61,755
> Fire Strike Extreme: 50,202
> Fire Strike Ultra: 29,361
> Time Spy: 40,457
> Time Spy Extreme: 21,227
> 
> Original post (coolenjoy.net): "Hello~ I got good results running RTX 4090 3DMark benchmarks. Of course it will only be for a brief moment, but I took #1 in Fire Strike ^^;; 4090 Strix OC @ Galax HOF..."

----------



## J7SC

Miao said:


> No, not at all; you just look at some of the best silicon in this forum and focus only on the best ones, ignoring the rest. It's human.
> 
> My 4090 does 3060/+1200 and my 13900K has 97 SP; what should I do? Suicide?
> 
> However, if it's just for benching, you can't set any real records at above-ambient temps anyway, so... keep it.
> And always remember that you could have found an even worse one.


Well put! I find terms such as 'golden chip', 'dud', and 'silicon lottery' somewhat annoying. I use systems for both work and play, and once you've had 50+ GPUs to work with, you take it all less seriously... it is not like punishment from God if your card doesn't beat clock or fps records. Besides, there is more to be learned from the below-par GPUs (and CPUs, never mind OC'ing RAM and chasing that down a rabbit hole 🤪).

When you get a new card such as a 4090, you want to make sure there are no temp or fan issues (and maybe coil whine); otherwise, most 4090s are fairly closely spaced in terms of gaming performance... and if all else fails, there are those custom vBIOSes (at least for non-FE cards), subject to your VRM/PCB and cooling abilities. As a former HWBot sub-zero competitor, I can say it never really stops once you slide down that slippery slope... best to just enjoy your 4090 and maybe a bit of friendly competition (for science), without it turning into an obsession, or depression about your GPU...




alasdairvfr said:


> I suspect anyone running 8x 4090s will have to dial in power a bit, power delivery for 8x 450w cards plus heat dissipation would be a huge challenge!
> 
> Managed to hit 1471, topping the charts till others take the challenge
> 
> #1 single card on the chart (for now)


...I have not posted anything at the Octane site as I'm really just trying to delineate some things re. RTX 6000 Ada. Next will be the ML benchmark to compare setups below; just not sure yet if I go for the Geekbench ML or some other similar ML options.

Octane w/ 4090 at 1479.94 (...not my best run )










...and here's the OctaneBench at 1566.20 for the 'Frankenstein' / TR HEDT update build I just finished...still a fair amount on the table re.. oc headroom (the 3090 tops out ~2265 MHz core, the 2080 Tis around ~2180). I am satisfied now that the cooling system can handle 1700 Watts max after running various benchies. While this setup handily beats my 4090 in OctaneBench, it does so with far less efficiency...










...a quick note on memtest_vulkan....it is a very good tool, but not fool-proof. Besides, it stresses the built-in IMC a lot; typically the board power draw is just below 400 W with memtest_vulkanon my setup...I wouldn't run it for hours on end, but that's just 'IMO'.

I find that OctaneBench sniffs out unstable core and VRAM when doing a full run. As posted before, I also use some older artifact checkers. While I have benched with well over +1700 on VRAM, I typically leave it at +1500.


----------



## VirtualSmile

Hey all! Does anyone know what thermal pad thickness I should get for the PNY 4090 XLR8 Gaming Verto EPIC-X RGB TF OC (VCG409024TFXXPB1-O)? I believe it is based on the reference PCB. See here:




It's my first time buying separate thermal pads - what sheet size is usually enough to cover an entire GPU?

Thanks all!


----------



## Alemancio

*What seem to be the best tools for testing core and mem stability?* I thought I was +250 and +1200 stable - 3DMark stress tests and other games pass easily - but Wz2 crashes...


----------



## Krzych04650

Alemancio said:


> *What seem to be the best tools for testing core and mem stability?* I thought I was +250 and +1200 stable - 3DMark stress tests and other games pass easily - but Wz2 crashes...


There is always some variance in stability; you cannot just test in one program/game and assume it is stable everywhere.

Just like with any overclocking, the best practice is to push for the absolute max in the stress tests you know, then step back a notch. For example, after passing the 3DMark stress tests, shaving 30-45 MHz off the core and 100-200 MHz off the memory should give enough headroom to account for different workloads.
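That back-off rule is trivial to script if you're dialing in more than one card; a minimal sketch in Python (the default margins are just the ones suggested above, not a guarantee for any particular sample):

```python
# Step back from the maximum offsets that pass stress tests, per the
# rule of thumb above (margins are suggestions, not guarantees).
def safe_offsets(max_core_mhz: int, max_mem_mhz: int,
                 core_margin: int = 45, mem_margin: int = 200) -> tuple[int, int]:
    """Return (core, mem) offsets with stability headroom shaved off."""
    return max_core_mhz - core_margin, max_mem_mhz - mem_margin

# e.g. a card that passes 3DMark at +250 core / +1200 mem:
print(safe_offsets(250, 1200))  # (205, 1000)
```

Save the result as a profile in Afterburner and treat the stress-test max as a bench-only profile.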


----------



## 8472

STR3T said:


> Been running a 4090 FE for ~a month now. I like it, though I was initially going for a TUF before reading those have a high incidence of coil whine. Correction, I actually *wanted an EVGA!*
> 
> That said, sourced a Gigabyte Windforce by chance.
> 
> The FE at 1.100 V, 133% PL, +215 Core, +1700 Mem on an 11700k (I know!) and 32GB 3600 DDR4:
> Port Royal best out of ~7 runs scored 27,130. So 50th out of the top 100 w/ the same 11700k today. That's the hardest I've pushed it...100% fan kept temps at 62c/72c.
> 
> I'm not looking to hit the Top 20; as I said, I've been/am pretty darn happy w/ the FE. Looks OK although the "same" as the 3080/3090...very little whine if any on her, modest white RGB lighting. At 100% fan it does get a bit loud.
> 
> I thought the AIBs, even the base Gigabyte with 3 fans and the option of a BIOS update, would be the most 'in demand'. But looking at sales on eBay, the FE sells for a decent amount more on average. Is that simply Nvidia taking the best silicon for the FEs overall or...??
> 
> Given the FE's performance above...is it even worth it to open up/test out the Windforce?
> 
> Will probably _limp along_ w/ the 11700k for a year or so until the next round of CPUs comes out...then figure out which way to go. I'm not playing any games currently that any 4090 wouldn't do exceedingly well on.
> 
> Next purchase will be a 4k monitor and/or the CPU upgrade. I'm holding out for a ~27"-28" version of the Alienware OLED or something else solid across the board, including 10+ HDR, but at a solid price point.
> 
> So yeah, the 4090 is more than enough currently, even "stock" w/ the 11700k and a 2k monitor (Dell S2721DGF; also just got a 27" NZXT QHD but haven't messed w/ it much yet). But long term, wondering if I should take a harder look at the Windforce.


Outside of price and case fitment I really don't understand the level of demand for the FE. The difference in overclocked performance between models is not noticeable in actual games so Nvidia saving the best chips for themselves won't mean much in reality. 

Other models are more readily available, run quieter and cooler, have dual bios, RGB for those that like it, multiple HDMI ports for Asus, LEDs to let you know that the cable isn't plugged in all the way, etc. 



tubs2x4 said:


> newegg.ca has 4090s in stock right now if any cad people are interested...


The US Newegg site was glitching like crazy, it was showing almost all models in stock, then they would go out of stock only to be back in stock seconds later. Then they'd apparently add to cart, but your cart would be empty. 

Basically the cards were never actually available even though the add to cart button was showing. The same thing probably happened to their Canadian site as well. 

The stock trackers were alerting every few seconds.


----------



## th3illusiveman

Newegg is one of the worst places to buy tech - doesn't help that they cater to scalping so much on their website. I'll never give them a penny. Thankful for my local hardware stores.


----------



## StreaMRoLLeR

Buy 4090 Gaming OC from caseking.de

For 2040 eur

@yannick8403


----------



## Blameless

yannick8403 said:


> Hey guys, first post over here - trying my best with English as a Bruxellois from Brussels who speaks only bruxellois. I am about to go pick up my Phantom (non-GS) for 1800€ - well, I just have to go grab it from a locker down the street - but..... (I am having buyer's remorse)!
> Will play (and only play) on a 4K FV43U, and the card cost me 1860€ vs. around 2200€ for the other cards.
> Is it a good choice? I can still change, but is it worth it? Coming from a 10900K & RTX 3090


The Gainward should be using the same PCB as Palit-branded cards, which is generally regarded as one of the weaker ones.

That said, even the most cost cut RTX 4090s are still plenty sufficient for most uses and I'd think the savings were more than worth it.



J7SC said:


> Besides, it stresses the built-in IMC a lot; the board power draw is typically just below 400 W with memtest_vulkan on my setup...I wouldn't run it for hours on end, but that's just 'IMO'.


400w is nothing. I have gaming loads that easily surpass that. Nothing that exceeds the memory power draw, but as long as temps and current limits aren't being exceeded there, I'm not worried.

Anyway, if any RTX 4090 cannot handle the stock 450w power limit forever (24/7 until the warranty ends, at least), I'd consider it defective. This Windforce has one of the most cut down VRMs (missing ten core phases on a reference board knock-off) and one of the first things I did was put thirty hours of FurMark @ 600w on it to make sure it wasn't exceeding tolerances anywhere.

Assuming thermal cycling doesn't get it, the only area of concern I have with most of these boards is the lifetime of the surface mount capacitors. Some of them are only rated for 2k hours at 105C (which probably explains why the hottest row of them is the _only_ thing in thermal contact with the backplate on this Windforce). However, since none of them get hotter than 70-75C or so in torture test loads, that should still be at least 8k hours if I decided to do nothing but stress test at the highest power limits I could cool (on stock air). In more realistic, but still very heavy loads, they should last at least 30k hours.
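For reference, those lifetime figures follow the usual capacitor rule of thumb that expected life roughly doubles for every 10°C below the rated temperature. A quick back-of-envelope in Python, assuming the 2,000 h @ 105°C rating mentioned above (the actual derating curve from the capacitor datasheet will differ):

```python
# 10°C doubling rule of thumb for capacitor life:
#   L = L_rated * 2 ** ((T_rated - T_actual) / 10)
def cap_life_hours(rated_hours: float, rated_temp_c: float,
                   actual_temp_c: float) -> float:
    """Estimated life at actual_temp_c, given a rating at rated_temp_c."""
    return rated_hours * 2 ** ((rated_temp_c - actual_temp_c) / 10)

print(cap_life_hours(2000, 105, 75))  # 16000.0 h at a ~75C torture-test temp
print(cap_life_hours(2000, 105, 65))  # 32000.0 h at a more typical ~65C load temp
```

Both results land comfortably above the conservative floors quoted above, so the estimates check out.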


----------



## 7empe

Hey guys, I'm waiting for a 4090 TUF to arrive tomorrow and I've proactively ordered the be quiet! adapter. Does anyone have it and can share feedback? Does it enable 600W without the PCIe 5.0 signaling lines from the PSU? Thanks in advance.


----------



## GforceGTXer

Alemancio said:


> *What seem to be the best tools for testing core and mem stability?* I thought I was +250 and +1200 stable - 3DMark stress tests and other games pass easily - but Wz2 crashes...


MW2 is very weird. I can run 3045 core all day in stress tests, but I have to back down to 3015 core for Modern Warfare 2 not to crash, even though it's not taxing on the GPU at all.


----------



## Pk1

Decided to finally throw her in while I wait for waterblock. ****, these things are fast.


----------



## lawson67

Panchovix said:


> Thanks man, gonna check how it goes with the pad (still hasn't arrived yet though lol) and if it's worse, gonna evaluate LM. (But I guess it's probably safer to use watercooling with just paste, right?)
> 
> Here are some temps I got while running AI overnight at 374W (training some ML models); I can already get near 80°C on the hotspot with high fan speeds. My cooler is definitely not good/faulty. (Ambient temp is about 32°C - 89.6°F)
> View attachment 2591997


Sorry for the late reply. Yes, you can use LM on a water-block as long as it's nickel-plated, which most are. I use LM with air coolers and water-blocks.


----------



## bmagnien

lawson67 said:


> Sorry for the late reply. Yes, you can use LM on a water-block as long as it's nickel-plated, which most are. I use LM with air coolers and water-blocks.


It doesn't have to be nickel plated, as long as it's pure copper. LM will blacken a pure copper surface but will not erode it or create pits/unevenness. You can remove the black with a light hydrochloric acid/water mix.


----------



## lawson67

bmagnien said:


> It doesn't have to be nickel plated, as long as it's pure copper. LM will blacken a pure copper surface but will not erode it or create pits/unevenness. You can remove the black with a light hydrochloric acid/water mix.


I would avoid LM on copper: the copper absorbs the gallium, which can pit it, and you'll need to repaste soon after. Once the copper has been pitted, LM won't be effective anymore unless you lap the surface. I've been there and done that.


----------



## J7SC

Blameless said:


> (...)
> 
> 400w is nothing. I have gaming loads that easily surpass that. Nothing that exceeds the memory power draw, but as long as temps and current limits aren't being exceeded there, I'm not worried.
> 
> Anyway, if any RTX 4090 cannot handle the stock 450w power limit forever (24/7 until the warranty ends, at least), I'd consider it defective. This Windforce has one of the most cut down VRMs (missing ten core phases on a reference board knock-off) and one of the first things I did was put thirty hours of FurMark @ 600w on it to make sure it wasn't exceeding tolerances anywhere.
> 
> Assuming thermal cycling doesn't get it, the only area of concern I have with most of these boards is the lifetime of the surface mount capacitors. Some of them are only rated for 2k hours at 105C (which probably explains why the hottest row of them is the _only_ thing in thermal contact with the backplate on this Windforce). However, since none of them get hotter than 70-75C or so in torture test loads, that should still be at least 8k hours if I decided to do nothing but stress test at the highest power limits I could cool (on stock air). In more realistic, but still very heavy loads, they should last at least 30k hours.


...you might have missed the 'IMO' part, and I do recall having a similar exchange in the 6900 XT thread re. stress testing components (1500 - 2000 fps+). In any case, memtest_vulkan and its ~400 W mostly stress just one particular segment of the 76.3 billion N4 transistors (the one related to VRAM, obviously), rather than being a game or bench load of 500 W+ with a more balanced spread, which I run all the time (below). The author of memtest_vulkan also suggests that more than two hours isn't usually necessary (below). I do like it, but within reason, and not as the only VRAM verification tool --- it is not foolproof in that sometimes unstable VRAM can slip by (i.e. in non-ECC settings)...I can and have run my card's VRAM as high as +1702 in some benches, but my 'go-to' is around +1500 or so.

















---

In general terms, to find the 'base oc profile' specific to your card sample, you should run all your typical games and all your typical benches and use the lowest common denominator speed for both core and memory. Just save that as one of the profiles in MSI AB - you can always 'clock up' from there when running something a bit more forgiving. I also recommend OctaneBench as part of the repertoire of oc-checking tools because it stops on error *and* gives you error messages that allow you to deduce whether it was core or memory.


----------



## coelacanth

8472 said:


> Outside of price and case fitment I really don't understand the level of demand for the FE. The difference in overclocked performance between models is not noticeable in actual games so Nvidia saving the best chips for themselves won't mean much in reality.
> 
> Other models are more readily available, run quieter and cooler, have dual bios, RGB for those that like it, multiple HDMI ports for Asus, LEDs to let you know that the cable isn't plugged in all the way, etc.


Not to mention the FE warranties are non-transferable.


----------



## mardon

Just caved and got a 4090 FE for MSRP. Will my 10900k's PCIe 3.0 hold it back much? I'd rather not upgrade if I'm not losing too much performance at this stage. 5.2 GHz with dual-rank 32GB of 4000 MHz CL14.


----------



## Panchovix

Blameless said:


> The Gainward should be using the same PCB as Palit-branded cards, which is generally regarded as one of the weaker ones.
> 
> That said, even the most cost cut RTX 4090s are still plenty sufficient for most uses and I'd think the savings were more than worth it.
> 
> 
> 
> 400w is nothing. I have gaming loads that easily surpass that. Nothing that exceeds the memory power draw, but as long as temps and current limits aren't being exceeded there, I'm not worried.
> 
> Anyway, if any RTX 4090 cannot handle the stock 450w power limit forever (24/7 until the warranty ends, at least), I'd consider it defective. This Windforce has one of the most cut down VRMs (missing ten core phases on a reference board knock-off) and one of the first things I did was put thirty hours of FurMark @ 600w on it to make sure it wasn't exceeding tolerances anywhere.
> 
> Assuming thermal cycling doesn't get it, the only area of concern I have with most of these boards is the lifetime of the surface mount capacitors. Some of them are only rated for 2k hours at 105C (which probably explains why the hottest row of them is the _only_ thing in thermal contact with the backplate on this Windforce). However, since none of them get hotter than 70-75C or so in torture test loads, that should still be at least 8k hours if I decided to do nothing but stress test at the highest power limits I could cool (on stock air). In more realistic, but still very heavy loads, they should last at least 30k hours.


Man, my TUF 4090 at 450W gets like 73°C core and 92°C hotspot; my ambient temps are about 30-32°C (near ~88°F) on air.

I'm pretty sure my cooler is faulty at this point, but here in Chile, if the "sticker" on the screw is touched in any way, the warranty is instantly void, so I have to live with it.

Still haven't got the Carbonaut pad, but not much hope ATM. Is the hotspot the hottest part of the die, or anywhere on the card? If it's the latter, maybe I'm missing a thermal pad or something.



lawson67 said:


> Sorry for the late reply. Yes, you can use LM on a water-block as long as it's nickel-plated, which most are. I use LM with air coolers and water-blocks.


Thanks man, that will probably be my last resort before watercooling, or I'll just WC directly. A hotspot above 90°C on a 4090 is way too much IMO.


----------



## STR3T

@mardon
Yeah, even 13th Gen and 7000-series CPUs are bottlenecking in some games. I'm using an 11700k and gaming is still "great"...unless you find a game you're playing at max resolution/settings that's struggling, holding off on a CPU/RAM upgrade can be done...it's just hard to do!

Saw a review of i7 generations w/ a 4090 recently; the 12-game average summary was 9700k at 168fps, 10700k at 174fps, 12700k at 183fps and 13700k at 191fps. Those were 2k numbers. You'd likely be getting ~15-30fps more w/ a new i7/i9 in various games at 2k, a bit less at 4k.
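For what it's worth, the relative uplifts implied by those averages are easy to check (fps figures are the ones quoted above; quick Python arithmetic):

```python
# Percent uplift over the 9700k baseline from the quoted 12-game averages
baseline = 168  # 9700k fps
avgs = {"10700k": 174, "12700k": 183, "13700k": 191}
uplift = {cpu: round((fps / baseline - 1) * 100, 1) for cpu, fps in avgs.items()}
print(uplift)  # {'10700k': 3.6, '12700k': 8.9, '13700k': 13.7}
```

So roughly a 14% jump going from the 9700k all the way to a 13700k at those settings.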


----------



## yzonker

Panchovix said:


> Man my TUF 4090 at 450W gets like 73°C core and 92°C hotspot, my ambient temps are about 30-32°C (near ~88°F) on air.
> 
> I'm pretty sure my cooler is faulty at this point, but here in Chile, if the "sticker" on the screw is in any way touched, warranty is instantly void, so I have to live with it.
> 
> Still haven't got the carbonaut pad, but not much hopes ATM, hotspot is the hottest part of the die or in any part of the card? If it's the latter, maybe I'm missing a thermal pad or something.
> 
> 
> Thanks man, that will be probably my last resort before watercooling, or just WC directly. Hotspot above 90°C on a 4090 is way too much IMO


If it's the same as 30 series, the hotspot includes the core and VRM.


----------



## Roacoe717

Anyone know where to get QC stickers that match Asus, MSI gigabyte etc?


----------



## mirkendargen

Roacoe717 said:


> Anyone know where to get QC stickers that match Asus, MSI gigabyte etc?


I want the Colorful one that says "NO TEARING UP"


----------



## Shoggoth

yannick8403 said:


> Hey guys, first post over here - trying my best with English as a Bruxellois from Brussels who speaks only bruxellois. I am about to go pick up my Phantom (non-GS) for 1800€ - well, I just have to go grab it from a locker down the street - but..... (I am having buyer's remorse)!
> Will play (and only play) on a 4K FV43U, and the card cost me 1860€ vs. around 2200€ for the other cards.
> Is it a good choice? I can still change, but is it worth it? Coming from a 10900K & RTX 3090
> 
> Will be paired with a 1200W Corsair Plat, 13700K, DDR5 @6000 CL30



To continue in Blameless's vein, I have the Phantom GS, which has "weaker" specs than all the big hitters (Strix, Suprim X, many others). It has 18-phase voltage regulation, vs 20, 24 and 26 phases for most top AIB models, and also "just" 3 memory voltage phases, vs 4 for many others.

Nevertheless, I get a +1700 stable memory OC and so far +250 core, for 3000 MHz. Quite possibly higher than that on the core, since +250 is just where I had to stop due to time constraints. Evidently, 18 phases are quite sufficient to put my results near the top of the stack. Also, for some reason the memory modules on the Phantom GS are Micron MT61K512M32KPA-24, specced for 24Gbps effective, whereas models like the Strix OC, Suprim X and Liquid X use modules specced for 21Gbps effective. Another excellent reason not to spend more.

Just like Blameless, I would very much suggest that you hold on to the Phantom, as any gains from opting for a different, more expensive model are quite unlikely to be worth the additional cost - and said gains may very well not materialise at all.

Edit: the GS bios is limited to 500W, and the highest power draw I've seen so far in stress tests or games is ~455W, which appears quite modest given the OC results.


----------



## STR3T

mardon said:


> Just caved and got a 4090Fe for MSRP. Will my 10900k's PCI3.0 hold it back much? I'd rather not upgrade if I'm not loosing too much performance at this stage. 5.2ghz with dual rank 32Gb of 4000mhz CL14.


This just hit, think the 5800x is right on top of 10900k performance so enjoy: 4090: 5800x vs. 13700kf


----------



## GraphicsWhore

Just bit on a Gigabyte Gaming. Any consensus on blocks for this thing? I see there's a good selection but I've only ever had experience with EK and Optimus.


----------



## mirkendargen

GraphicsWhore said:


> Just bit on a Gigabyte Gaming. Any consensus on blocks for this thing? I see there's a good selection but I've only ever had experience with EK and Optimus.


Most people have Bykski (myself included), none have complained.


----------



## Panchovix

yzonker said:


> If it's the same as 30 series, the hotspot includes the core and VRM.


Nice, thanks. Well, I guess the cooler is faulty; I used MX-6 and the temps got better, but still 90°C+ at 450W with 33°C ambient.

Gonna get watercooling, probably in the next few months. Unlucky since I have no WC system at all, but I guess that's better than the card burning itself with really high hotspot temps. Gonna limit the card to 350-400W in the meanwhile.

Damn, I was happy with my TUF 3080, but this TUF 4090 has been a complete disappointment. At least it was MSRP, I guess.


----------



## yzonker

Panchovix said:


> Nice thanks, well I guess the cooler is faulty, used MX6 and the temps got better, but still 90°C+ with 450W with 33°C ambient.
> 
> Gonna get a watercooling probably in the next months. Unlucky since I have no WC system at all, but I guess that's better than the card burning itself by really high hotspot temps. Gonna limit the card to 350-400W in the meanwhile.
> 
> Damn I was happy with my TUF 3080, but this TUF 4090 have been a completely deception. At least it was MSRP I guess.


 IMO a 90-95C hotspot temp won't hurt the card at all. The limit is 105-110C, so you're well under that. 

Alphacool may eventually make their AIO for the TUF. That would be a lot cheaper if you don't want to go full custom.


----------



## J7SC

GraphicsWhore said:


> Just bit on a Gigabyte Gaming. Any consensus on blocks for this thing? I see there's a good selection but I've only ever had experience with EK and Optimus.





mirkendargen said:


> Most people have Bykski (myself included), none have complained.


...yeah, same here - zero issues with Giga-G-OC & Bykski in my setup


----------



## tubs2x4

STR3T said:


> This just hit, think the 5800x is right on top of 10900k performance so enjoy: 4090: 5800x vs. 13900kf


13700k


----------



## GRABibus

KingEngineRevUp said:


> I'm sure there are people here doing +1700 since launch who probably don't have a problem.


Confirmed.

I don’t imagine VRAM degrading after 2 weeks, especially if temps are under control.
Except if he was on the razor’s edge of stability with very cold ambient.


----------



## GRABibus

motivman said:


> Not sure how the galax bios plays with the gigabyte stock fans.


At 100% fan speed on my Giga OC with the Galax « The Number of the Beast » bios, there are some strange noises, like the fans touching some part of the card, when they turn at 3200rpm full speed.
I cap the speed at 70% now.
No more issues.


----------



## J7SC

yzonker said:


> IMO a 90-95C hotspot temp won't hurt the card at all. The limit is 105-110C, so you're well under that.
> 
> Alphacool may eventually make their AIO for the TUF. That would be a lot cheaper if you don't want to go full custom.





Panchovix said:


> Nice thanks, well I guess the cooler is faulty, used MX6 and the temps got better, but still 90°C+ with 450W with 33°C ambient.
> 
> Gonna get a watercooling probably in the next months. Unlucky since I have no WC system at all, but I guess that's better than the card burning itself by really high hotspot temps. Gonna limit the card to 350-400W in the meanwhile.
> 
> Damn I was happy with my TUF 3080, but this TUF 4090 have been a completely deception. At least it was MSRP I guess.


...just check your card's (stock) vbios in GPUZ/Advanced/bios; my Giga-G-OC lists 88 C as max, which may (or may not) include GPU Hotspot. Anyway, lower Hotspot temp is better if you can get there.


----------



## yzonker

J7SC said:


> ...just check your card's (stock) vbios in GPUZ/Advanced/bios; my Giga-G-OC lists 88 C as max, which may (or may not) include GPU Hotspot. Anyway, lower Hotspot temp is better if you can get there.
> View attachment 2592192


It does not include hotspot. That's just max core temp. It's around 105C for start of throttling for all of them AFAIK. Similar to VRAM junction temp (110C). 

And yes lower is always better, I just don't think there is a risk of degradation given the throttle point is 100C+.


----------



## dboom

Panchovix said:


> Man my TUF 4090 at 450W gets like 73°C core and 92°C hotspot, my ambient temps are about 30-32°C (near ~88°F) on air.
> 
> I'm pretty sure my cooler is faulty at this point, but here in Chile, if the "sticker" on the screw is in any way touched, warranty is instantly void, so I have to live with it.
> 
> Still haven't got the carbonaut pad, but not much hopes ATM, hotspot is the hottest part of the die or in any part of the card? If it's the latter, maybe I'm missing a thermal pad or something.
> 
> 
> Thanks man, that will be probably my last resort before watercooling, or just WC directly. Hotspot above 90°C on a 4090 is way too much IMO


Your temps are fine. I had those on the TUF OC too with 23-24°C ambient. A WB changed everything.


----------



## J7SC

yzonker said:


> It does not include hotspot. That's just max core temp. It's around 105C for start of throttling for all of them AFAIK. Similar to VRAM junction temp (110C).
> 
> And yes lower is always better, I just don't think there is a risk of degradation given the throttle point is 100C+.


I wouldn't worry about it for the shorter term, but when the manufacturer states 'Thermal Limits / Maximum / 88 C', it is worth keeping that in mind. Anyway, @Panchovix suggested he'll water-cool it in the coming months - problem solved.

---

Elsewhere, any bets on the max core count to be officially announced at CES '23 for Zen4 w/ 3D V-cache? 8 cores seems a no-brainer, but I wouldn't mind if they announce a full 16 cores...


----------



## Blameless

Got my 4090 off my test bench and into my real system, which is connected to my Samsung G7 rather than a 4k TV.

It looks like the warning about DSC not allowing DSR, DLDSR, or NIS does *not* apply to Lovelace after all.

I'm running 1440p, 240Hz, 10-bits per channel color over DP 1.4a (which needs DSC)...DLDSR is working fine.


----------



## tubs2x4

8472 said:


> Outside of price and case fitment I really don't understand the level of demand for the FE. The difference in overclocked performance between models is not noticeable in actual games so Nvidia saving the best chips for themselves won't mean much in reality.
> 
> Other models are more readily available, run quieter and cooler, have dual bios, RGB for those that like it, multiple HDMI ports for Asus, LEDs to let you know that the cable isn't plugged in all the way, etc.
> 
> 
> 
> The US Newegg site was glitching like crazy, it was showing almost all models in stock, then they would go out of stock only to be back in stock seconds later. Then they'd apparently add to cart, but your cart would be empty.
> 
> Basically the cards were never actually available even though the add to cart button was showing. The same thing probably happened to their Canadian site as well.
> 
> The stock trackers were alerting every few seconds.


Looked again and it's still showing 4090s for sale. Newegg is selling them, but at like $200 above MSRP. Seems like they're trying to grab more profit.


----------



## Blameless

Panchovix said:


> Man my TUF 4090 at 450W gets like 73°C core and 92°C hotspot, my ambient temps are about 30-32°C (near ~88°F) on air.
> 
> I'm pretty sure my cooler is faulty at this point, but here in Chile, if the "sticker" on the screw is in any way touched, warranty is instantly void, so I have to live with it.
> 
> Still haven't got the carbonaut pad, but not much hopes ATM, hotspot is the hottest part of the die or in any part of the card? If it's the latter, maybe I'm missing a thermal pad or something.


Did you inspect the card edge on to see if anything was amiss? What fan speed is this at?

30C ambient is pretty warm and if the fan isn't ramping up those temps would not be that out of line.



J7SC said:


> The author of memtest_vulkan also suggests that more than two hours isn't usually necessary (below).


I've encountered errors past two hours, especially at borderline temperatures, which is why I recommend much longer test intervals.



J7SC said:


> I do like it, but within reason, and not as the only VRAM verification tool --- it is not foolproof


Nothing is foolproof, and I'm not advocating memtest_vulkan as one's only memory test. However, it's one that does things others do not and can fail where others will not. That's its niche.


----------



## yzonker

J7SC said:


> I wouldn't worry about it for the shorter term, but when the manufacturer states 'Thermal Limits / Maximum / 88 C', it is worth keeping that in mind. Anyway, @Panchovix suggested he'll water-cool it in the coming months - problem solved.
> 
> ---
> 
> Elsewhere, any bets on max number of cores to be officially announced at CES '23 for Zen4 / 3DV cache ? 8 cores seems a no-brainer, but I wouldn't mind if they announce a full 16 core...


I'll go with more than 8 cores. Just seems like the 5800x3D was a trial run for the real deal in Zen4.


----------



## Panchovix

yzonker said:


> IMO a 90-95C hotspot temp won't hurt the card at all. The limit is 105-110C, so you're well under that.
> 
> Alphacool may eventually make their AIO for the TUF. That would be a lot cheaper if you don't want to go full custom.


Nice to know; I hope they release the AIO soon. I'm not sure if it will be damaged or not - I guess I'll find out in a few months lol.



Blameless said:


> Did you inspect the card edge on to see if anything was amiss? What fan speed is this at?
> 
> 30C ambient is pretty warm and if the fan isn't ramping up those temps would not be that out of line.


Today I was testing with the fans on a custom curve at very high speeds; for example, I reached 67°C core and 84°C hotspot at 2500RPM, ~34°C (93.2°F) ambient and 400W. I feel that's pretty hot, unless ambient temps >30°C really matter that much.


----------



## bmagnien

J7SC said:


> Elsewhere, any bets on max number of cores to be officially announced at CES '23 for Zen4 / 3DV cache ? 8 cores seems a no-brainer, but I wouldn't mind if they announce a full 16 core...


already leaked


----------



## yzonker

bmagnien said:


> already leaked
> View attachment 2592247
> 
> View attachment 2592246


I'm not sure that's all that good given it's compared back to the 5800x3D.


----------



## J7SC

bmagnien said:


> already leaked
> View attachment 2592247
> 
> View attachment 2592246


...saw that, but I want Lisa Su to make it official since I heard that not everything on the internet is actually true


----------



## bmagnien

J7SC said:


> ...saw that, but I want Lisa Su to make it official since I heard that not everything on the internet is actually true


Who has the power to create a 7XXX3D Owners thread???


----------



## bmagnien

@J7SC a few pages ago you proposed pairing the eventual 7XXX3D with 64gb of ram - could you expand on the reasoning behind that? Wasn’t aware of more ram providing speed gains with increased l2/l3 cache. Why would 64 vs 32 be beneficial for these cpus specifically? Or what’s the hypothesis?


----------



## bmagnien

yzonker said:


> I'm not sure that's all that good given it's compared back to the 5800x3D.


I dunno, +21% @ 1080p for a single gen performance jump is pretty good in the CPU world. 5950x to 7950x is 16%, 12900k to 13900k is 14%.


----------



## Brads3cents

AMD just said 15% over the 5800X3D, so that graph is misleading.
Straight from Lisa's mouth.

Also, I don't trust AMD anymore after the last conference, so I'll wait for reviews.

I'm not sure that particular CPU is of any interest to people in this thread, since it will be slower than the 13900k. I can understand it for people seeking simplicity (but this is overclock.net) and for people trying to save money (but this is the 4090 thread, so ultra-high-luxury GPU tier $), so it doesn't really make sense for us.

Now, the 7950X3D could be really interesting.
I'd really like to see the reviews.
It will probably outperform a stock 13900K/KS.

I'm more interested, though, in the 1% lows, and also vs a memory-tuned 13900k.


----------



## Panchovix

@yzonker followed your suggestion and repasted. 

This is what it looked like before reapplying the paste, and I thought I had used a good amount.










After repasting, the hotspot is at ~70°C at 350W and ~79°C at 500W, so a pretty good improvement (using MX-6).

I used a **** ton of paste though, so I'm not sure if that was the way to go.


----------



## Shraf2k

So what's the consensus on which 4090 to buy besides "whatever one you can get your hands on"?


----------



## J7SC

bmagnien said:


> @J7SC a few pages ago you proposed pairing the eventual 7XXX3D with 64gb of ram - could you expand on the reasoning behind that? Wasn’t aware of more ram providing speed gains with increased l2/l3 cache. Why would 64 vs 32 be beneficial for these cpus specifically? Or what’s the hypothesis?


...a couple of specific-to-me and also some general reasons. This is severely abbreviated and some bits very much generalized...

I tend to buy a lot of the same type of sticks and use them in various configs and with interchangeability for work-and-play...thus with apps getting bigger, the minimum amount per DDR5 RAM stick for me will likely be 32 GB, and 64 GB minimum overall for any given new build. Next, Ryzen 7K seems to have issues, at least for now, with top speed / tight timings when running four sticks rather than two, unlike my 5950X w/four sticks, so 2x 32 GB will meet my minimum requirements.

...in more general terms, the age-old 'pipeline' issue in computing has been moving data from the slowest part of a system (persistent storage such as HDDs and SSDs these days) to the fastest (CPU). With some CPUs now near or at 6 GHz, it is still all about removing bottlenecks: more and faster cache (L1-L3/L4) and more and faster RAM. With apps growing in size, that again means more RAM, because even the upcoming PCIe 5.0 SSDs still only operate at a fraction of the speed of RAM - on current setups, RAM reads are roughly eight to ten times as fast as a fast NVMe drive. That's also why some folks still play around with RAM-disks...and why the latest AMD Epyc Genoa (up to 96 cores / 192 threads) has introduced 12-channel DDR5 RAM capability.

To keep everything pressurized between the CPU and RAM, modern systems use the fastest cascading cache setups...as an example, look at the different speeds of RAM and L1-L3 cache in an older AIDA64 run for my 5950X, as well as CrystalDiskMark for a WD SN850 2TB, below.

Finally, the Ryzen 7K X3D V-Cache parts by definition have much more cache and thus will benefit from more DDR5 RAM, IMO...32 GB dual channel might still be enough, but 64 GB makes much more sense to me to maximize the 'pipeline benefits' and meet my other specific requirements.
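To put some rough numbers on the pipeline argument, here is a small sketch. The bandwidth figures are ballpark sequential-read assumptions for illustration only, not measurements from any specific system:

```python
# Illustrative only: rough transfer times for loading a working set,
# using ballpark sequential-read figures (assumed, not measured).
GIB = 1024**3

bandwidth_gib_s = {
    "SATA SSD": 0.5,
    "PCIe 4.0 NVMe (e.g. SN850)": 7.0,
    "PCIe 5.0 NVMe": 12.0,
    "DDR5 dual channel": 70.0,
}

def transfer_time_s(size_gib: float, gib_per_s: float) -> float:
    """Seconds to move size_gib GiB at a sustained rate of gib_per_s GiB/s."""
    return size_gib / gib_per_s

working_set = 32  # GiB, e.g. a large app or dataset
for name, bw in bandwidth_gib_s.items():
    print(f"{name:>27}: {transfer_time_s(working_set, bw):6.2f} s")

# DDR5 vs fast NVMe ratio, matching the 'roughly eight to ten times' claim:
print(bandwidth_gib_s["DDR5 dual channel"] / bandwidth_gib_s["PCIe 4.0 NVMe (e.g. SN850)"])
```

With these assumed rates, RAM moves the same working set in a fraction of the time of even a PCIe 5.0 drive, which is the whole case for keeping more of it resident.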


----------



## Krzych04650

yzonker said:


> I'm not sure that's all that good given it's compared back to the 5800x3D.





Brads3cents said:


> Amd just said 15% over the 5800x3d
> So that graph is misleading.
> Straight from Lisa’s mouth
> 
> Also I don’t trust amd anymore after the last conference so I’ll wait for reviews
> 
> I’m not sure that particular cpu is of any interest to people in this thread since it will be slower than the 13900k. I can understand for people seeking simplicity( but this is overclock.net) and for people trying to save money (this is the 4090 thread so ultra high luxury gpu tier $) so it doesn’t really make sense for us
> 
> Now, the 7950x3d could be really interesting
> I’d really like to see the reviews
> It will probably outperform a stock 13900K/KS
> 
> I’m more interested though in 1% lows and also vs a memory tuned 13900k


Yea it is really nothing special. TPU's 50 game 13900K vs 5800X3D benchmark has 13900K 6.2% faster and that is stock 13900K with DDR5-6000 XMP and performance penalty from 32 threads, so this is absolute best case scenario for X3D, and even in reviews like that 7000X3D is only going to be ~8% faster based on that 15% number from AMD. It is not going to beat tuned 13900K/KS.
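That ~8% figure just follows from stacking the two relative results (TPU's 6.2% and AMD's claimed 15%, both taken from the post above):

```python
# Stacking the two relative results quoted above:
# TPU 50-game avg: 13900K is 6.2% faster than the 5800X3D.
# AMD claim: 7000X3D is 15% faster than the 5800X3D.
i13900k_vs_5800x3d = 1.062
x3d7000_vs_5800x3d = 1.15

# 7000X3D relative to a stock 13900K with XMP:
x3d7000_vs_13900k = x3d7000_vs_5800x3d / i13900k_vs_5800x3d
print(f"{(x3d7000_vs_13900k - 1) * 100:.1f}%")  # ~8.3%, i.e. the '~8%' above
```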


----------



## KingEngineRevUp

Krzych04650 said:


> Yea it is really nothing special. TPU's 50 game 13900K vs 5800X3D benchmark has 13900K 6.2% faster and that is stock 13900K with DDR5-6000 XMP and performance penalty from 32 threads, so this is absolute best case scenario for X3D, and even in reviews like that 7000X3D is only going to be ~8% faster based on that 15% number from AMD. It is not going to beat tuned 13900K/KS.


Don't we already have a slide where AMD compared it to the 13900K?


----------



## 8472

Shraf2k said:


> So what's the consensus on which 4090 to buy besides "whatever one you can get your hands on"?


Which features matter to you?


----------



## Nico67

J7SC said:


> ...a couple of specific-to-me and also some general reasons. This is severely abbreviated and some bits very much generalized...
> 
> I tend to buy a lot of the same type of sticks and use them in various configs and with interchangeability for work-and-play...thus with apps getting bigger, the minimum amount per DDR5 RAM stick for me will likely be 32 GB, and 64 GB minimum overall for any given new build. Next, Ryzen 7K's seems to have issues, at least for now, with top speed / tight timings when running four sticks rather than two, unlike my 5950X w/four sticks, so 2x 32 GB will meet my minimum requirements.
> 
> ...in more general terms, the age-old 'pipeline' issue in computing has been moving data from the slowest part of a system (fixed power-off storage such as HDD, SSD these days etc) to the fastest (CPU). With some CPUs now near or at 6 GHz, it is still all about removing bottlenecks: More and faster cache L1-L3(4) and more and faster RAM. With an increasing size of stored apps, it again means more RAM because even with the upcoming PCIe 5.0 SSDs, they still only operate at a fraction of the speed of RAM - RAM can read roughly eight to ten times as fast (current setups) as fast NVMEs. That's also why some folks still play around with RAM-disks...and why the latest AMD Epyc Genoa (up to 96 cores / 192 threads) have introduced 12-channel DDR5 RAM capability.
> 
> To keep everything pressurized in-between the CPU and RAM modern system use the fastest cascading cache setups....as an example, look at the different speeds of RAM, L1 - L3 cache of an older Aida for my 5950X as well as CrystalDisk (WD SN850 2TB) below.
> 
> Finally, the Ryzen7K..X3DVcache has by definition much more cache and thus will benefit from more DDR5 RAM, IMO...32 GB dual channel might still be enough, but 64 GB makes much more sense to me to maximize the 'pipeline benefits' and meet my other specific requirements.
> View attachment 2592271


2x32 gives you dual rank, and the higher bandwidth from that is probably more advantageous than fast timings for the X3D cache



Krzych04650 said:


> Yea it is really nothing special. TPU's 50 game 13900K vs 5800X3D benchmark has 13900K 6.2% faster and that is stock 13900K with DDR5-6000 XMP and performance penalty from 32 threads, so this is absolute best case scenario for X3D, and even in reviews like that 7000X3D is only going to be ~8% faster based on that 15% number from AMD. It is not going to beat tuned 13900K/KS.


Like the 5800x3d, it will demolish the 13900ks in some games and lose to the 7700x in others. What will be interesting is how much better overall it is, compared to the 5800x3d vs 12900ks. Also whether a 7950x3d is any better, which it should be in the games that don't utilise cache well (where the 7800x3d is worse than the 7700x due to the clock speed drop)
What I am personally interested in seeing is how preferential cores are going to work. How does it determine whether an app likes the 5.0 GHz x3d CCD or the 5.7 GHz non-x3d one? I'm assuming at this point that this is how the ccds will be configured, as it seems unlikely the x3d and non-x3d both do 5.7 given the lower tdp. Time will tell though


----------



## bada55

yzonker said:


> It does not include hotspot. That's just max core temp. It's around 105C for start of throttling for all of them AFAIK. Similar to VRAM junction temp (110C). And yes lower is always better, I just don't think there is a risk of degradation given the throttle point is 100C+.


 lol


----------



## alasdairvfr

J7SC said:


> Well put ! I find terms such as 'golden chip', 'dud' and 'silicon lottery' somewhat annoying. I use systems for both work and play, and once you had 50+ GPUs to work with, you take it all less seriously...it is not like punishment from God if you card doesn't beat clock or fps records. In addition, there is more to be learned from the below-par GPUs (and CPUs, never mind oc'ing RAM and chasing that down a rabbit hole 🤪).
> 
> 
> 
> 
> 
> 
> 
> When you get a new card such as a 4090, you want to make sure there are no temp or fan issues (and may be coil-whine), otherwise, most 4090s are fairly closely spaced in terms of gaming performance...and if all else fails, there are those custom vbios (at least for non-FE cards), subject to your VRM/PCB and cooling abilities. As a former HWBot sub-zero competitor, it never really stops once one slides down that slippery slope...best to just enjoy your 4090 and may be a bit of friendly competition (for science), without it turning into an obsession - or depression about your GPU...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...I have not posted anything at the Octane site as I'm really just trying to delineate some things re. RTX 6000 Ada. Next will be the ML benchmark to compare setups below; just not sure yet if I go for the Geekbench ML or some other similar ML options.
> 
> 
> 
> 
> 
> 
> 
> Octane w/ 4090 at 1479.94 (...not my best run)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...and here's the OctaneBench at 1566.20 for the 'Frankenstein' / TR HEDT update build I just finished...still a fair amount on the table re. oc headroom (the 3090 tops out ~2265 MHz core, the 2080 Tis around ~2180). I am satisfied now that the cooling system can handle 1700 Watts max after running various benchies. While this setup handily beats my 4090 in OctaneBench, it does so with far less efficiency...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...a quick note on memtest_vulkan....it is a very good tool, but not fool-proof. Besides, it stresses the built-in IMC a lot; typically the board power draw is just below 400 W with memtest_vulkan on my setup...I wouldn't run it for hours on end, but that's just 'IMO'.
> 
> 
> 
> 
> 
> 
> 
> I find that OctaneBench sniffs out unstable core and VRAM when doing a full run. As posted before, I also use some older artifact checkers. While I have benched with well over +1700 on VRAM, I typically leave it at +1500.



I'm on vacation for a week but will follow your findings upon return. The MLPerf for sure I'm going to dive into headfirst; didn't have a chance beyond the prelim setup. I am lucky that work and hobby overlap a fair bit!



Would be fascinating to see the difference across CPU/GPU combos/archs as well. I bet a TR build with more memory bandwidth would excel in some areas! Much ML is not CUDA-specific; some AMD compute cards are absolute units in the ML space as well


----------



## alasdairvfr

Panchovix said:


> Nice thanks, well I guess the cooler is faulty, used MX6 and the temps got better, but still 90°C+ with 450W with 33°C ambient.
> 
> Gonna get a watercooling probably in the next months. Unlucky since I have no WC system at all, but I guess that's better than the card burning itself by really high hotspot temps. Gonna limit the card to 350-400W in the meanwhile.
> 
> Damn, I was happy with my TUF 3080, but this TUF 4090 has been a complete disappointment. At least it was MSRP I guess.


33C ambient is HOT. Try sticking a thermometer in your case and bringing your GPU up to temperature to see what your case ambient is. Your cooler may be okay but suffocating in a 45-50C case temp


----------



## GRABibus

J7SC said:


> I wouldn't worry about it for the shorter term, but when the manufacturer states 'Thermal Limits / Maximum / 88 C', it is worth keeping that in mind. Anyway, @Panchovix suggested he'll water-cool it in the coming months - problem solved.
> 
> ---
> 
> Elsewhere, any bets on max number of cores to be officially announced at CES '23 for Zen4 / 3DV cache ? 8 cores seems a no-brainer, but I wouldn't mind if they announce a full 16 core...


So, 7950X3D or 13900KS ? 😜


----------



## Nizzen

GRABibus said:


> So, 7950X3D or 13900KS ? 😜


Depends on game and memory oc. If you game


----------



## KingEngineRevUp

GRABibus said:


> So, 7950X3D or 13900KS ? 😜


7000X3D, because you'll have a motherboard ready for 8000X3D, 9000X3D and possibly 10000X3D


----------



## Nizzen

KingEngineRevUp said:


> 7000X3D, because you'll have a motherboard ready for 8000X3D, 9000X3D and possibly 10000X3D


Always sell the CPU and MB together.  Then buy a new CPU and MB.

Selling the CPU only is sometimes harder than selling CPU+MB.


----------



## xrb936

Hi guys, finally got my Strix OC. Which BIOS should I go first? Thanks!


----------



## GRABibus

Nizzen said:


> Allways sell Cpu and MB together.  Then buy new cpu and MB.
> 
> Selling cpu only is sometimes harder than selling cpu+MB.


Yes,
I sold easily my x570 Crosshair VIII Hero + 5950X 😉


----------



## zzztopzzz

Shraf2k said:


> So what's the consensus on which 4090 to buy besides "whatever one you can get your hands on"?


Don't know about "consensus", but for a few dollars more I bought the MSI RTX 4090 Suprim Liquid X, which comes with the cooler attached. Dropped right into my Lian Li Lancool like they were made for each other. The fan connections are integrated, so no external plugins. I'm not OCing this card as it hits over 40,000 on the Endgame bench and it did 10 on 3DMark Speedway. CP77, FC6, RDR2, AC Valhalla and Witcher are as fast and smooth as anything I ever played. Also, MSI managed to keep the card size down to about that of a 1080 and it only takes up a slot and a half. I got mine at my local MicroCenter for MSRP.


----------



## yzonker

GRABibus said:


> So, 7950X3D or 13900KS ? 😜


Buy both and let us know which is better.


----------



## Sheyster

bmagnien said:


> already leaked
> View attachment 2592247


What happened to the rumored 200MB cache for the 7950x3D??

Based on this I would just go for the 7900x3D for gaming, without a doubt.


----------



## Sheyster

Nizzen said:


> Allways sell Cpu and MB together.  Then buy new cpu and MB.
> 
> Selling cpu only is sometimes harder than selling cpu+MB.


I actually found a local buyer for my old CPU, mobo, RAM, NVMe SSD, CPU cooler and power supply! All he needed was a case to have a working computer. I agree though, at a minimum try to sell the mobo with the CPU.


----------



## bmagnien

Sheyster said:


> What happened to the rumored 200MB cache for the 7950x3D??
> 
> Based on this I would just go for the 7900x3D for gaming, without a doubt.


only 1 ccd has the stacked cache. one ccd clocks high, one ccd utilizes the cache. the cache ccd might be binned slightly higher on the 7900/7950x3ds but will still run significantly slower than the non-cache ccd. no idea how they're gonna handle scheduling when a game demands both high single-thread performance and large cache - it won't be possible with this arrangement.


----------



## coelacanth

bmagnien said:


> only 1 ccd has the stacked cache. one ccd clocks high, one ccd utilizes cache. the cache ccd might be binned slightly higher on the 7900/7950x3ds but will still run significantly slower than the un-chache ccd. no idea how they're gonna handle scheduling when a game demands high single thread performance and large cache - it won't be possible with this arrangement.


Yes it will be really interesting to see real world results, especially with the 7900 and 7950.


----------



## Sheyster

bmagnien said:


> only 1 ccd has the stacked cache. one ccd clocks high, one ccd utilizes cache. the cache ccd might be binned slightly higher on the 7900/7950x3ds but will still run significantly slower than the un-chache ccd. no idea how they're gonna handle scheduling when a game demands high single thread performance and large cache - it won't be possible with this arrangement.


Wow, so I assume Thread Director in Win11 will need an update to handle this architecture adequately..


----------



## Nd4spdvn

Sheyster said:


> Wow, so I assume Thread Director in Win11 will need an update to handle this architecture adequately..


... Or a 7800X3D will actually be AMD's top gaming performer, especially if it will have PBO working from the get go and if they bin these CCDs properly like they did with the 5800X3Ds.


----------



## Carillo

Anyone want to share the secret sauce to get 14800-15000 cpu score in Time Spy Extreme with 13900K ? So frustrating only achieving 13800 point with higher core clocks 🤓


----------



## Sheyster

Nd4spdvn said:


> ... Or a 7800X3D will actually be AMD's top gaming performer, especially if it will have PBO working from the get go and if they bin these CCDs properly like they did with the 5800X3Ds.


Well, anxiously awaiting the reviews and end-user feedback I guess..


----------



## Spiriva

xrb936 said:


> Hi guys, finally got my Strix OC. Which BIOS should I go first? Thanks!











GALAX RTX 4090 VBIOS


24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory




www.techpowerup.com





Galax 666w


----------



## 7empe

Hey guys, got my TUF (non-OC) today and it does a +1550 MHz VRAM offset (1506 MHz), which is 24.1 GT/s, and +225 MHz on the core (3015 MHz bench/gaming at 1.1 V). With core voltage at 0% it is 2975 at 1050 mV. Is that average silicon? Hotspot is 72C under load.

Waiting on a package from EKWB with a water block and an active backplate.


----------



## J7SC

Sheyster said:


> Well, anxiously awaiting the reviews and end-user feedback I guess..


Same here...no need to rush into purchasing, as much as I am likely going to go for a 7950X3DV; I like to see lots of real-world benchmarks and user comments first. Besides, early bios and Win updates for a new CPU variant tend to be, ahem, questionable sometimes.

As to the Windows scheduler: if only some of the cores have the extra cache (not sure about that), they could handle it through L1/L2 prefetch settings in the bios. Finally, rumours had this thing at up to 200 MB of combined cache (including L1, though that would only be one extra MB for all 16 cores combined). Still, looking at the table below from GN / YT, that's quite a jump over the plain-vanilla 7950X - which has been taking a beating on price versus MSRP anyway, something that will also show up in the used market.

I have never sold any used equipment (mobo, CPU, GPU etc) as it all ends up out on pasture for business use - so no panic on my part to be first in line for a new 7950X3DV, as I still have all my setups running just fine. I also still have to figure out what mobo to get...the Asus X670E Gene would be great for 'fun only', but I need one with four RAM slots (even though initially I will only use 2 slots w/ 2x 32 GB). It should also have as many PCIe (5.0/4.0) slots as possible and come with two Ethernet connections (2.5 G / 10 G), something along the lines of Asus' ProArt maybe.


----------



## KingEngineRevUp

Nizzen said:


> Selling cpu only is sometimes harder than selling cpu+MB


I've never had a problem selling a CPU, always gone in a few days of listing.

Then again, we in North America have Reddit Hardwareswap, a great place to sell hardware.


----------



## mirkendargen

Sheyster said:


> Well, anxiously awaiting the reviews and end-user feedback I guess..


I'm super wary of this setup with one CCD having more cache and one CCD being clocked higher, with per-game scheduling hacks in the AMD chipset driver to manage where things run. We'll see though.


----------



## Blameless

bmagnien said:


> the cache ccd might be binned slightly higher on the 7900/7950x3ds but will still run significantly slower than the un-chache ccd.


Probably binning the v-cache CCD for low leakage and the standard CCD for clock speed. Leakier silicon tends to clock higher, but run hotter, which is sometimes desirable, but the opposite of what you'd want with Raphael-X.



J7SC said:


> As to the Windows scheduler if only some of the cores have the extra cache (not sure about that), they could accomplish that through L1L2 prefetch settings in the bios.


I'm not sure how L1 or L2 prefetch would work differently based on the amount of victim cache on each CCD.



J7SC said:


> Finally, rumours had this thing at up to 200 MB of combined cache (including L1, but that would only be one extra MB for all 16 cores combined).


The 200MiB comes from the sum of the L2 and L3 on two Raphael-X CCDs and is a rational way to count total available cache because the L2 cache is inclusive (anything in the L1 has to be in the L2 as well, so there is no extra unique data to be had by adding the L1), while the L3 is an exclusive victim cache filled with L2 evictions.

Of course, for most anything with dependencies, summing cache from multiple CCXs is misleading as getting data from another CCX is almost as slow as getting data from main memory.



mirkendargen said:


> I'm super wary of this setup with one CCD having more cache and one CCD being clocked higher, with per-game scheduling hacks in the AMD chipset driver to manage where things run. We'll see though.


I think game scheduling will be a nightmare. This will be AMD's first foray into heterogeneous CPUs and I doubt Windows' scheduler is up to the task (it still doesn't know what to do with Intel's E-cores, or even multiple identical AMD CCXes, in some cases).

So yeah, 7800X3D for gaming, unless one is willing to set their own per-game affinities, or knows they need more than eight physical cores for a particular title.
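For anyone who does want to set per-game affinities by hand, here is a minimal sketch using the third-party psutil package. The SMT-sibling numbering (core 0 -> logical CPUs 0 and 1, etc.) and which CCD carries the V-cache are assumptions - verify both on your own system first:

```python
def ccd_logical_cpus(ccd: int, cores_per_ccd: int = 8, smt: bool = True) -> list[int]:
    """Logical CPU indices for one CCD, assuming SMT siblings are numbered
    consecutively per core (core 0 -> CPUs 0 and 1, core 1 -> CPUs 2 and 3, ...).
    Check your actual topology before relying on this layout."""
    width = 2 if smt else 1
    start = ccd * cores_per_ccd * width
    return list(range(start, start + cores_per_ccd * width))

def pin_to_ccd(pid: int, ccd: int = 0) -> None:
    """Pin a process (e.g. a game) to one CCD; needs the third-party psutil package."""
    import psutil  # pip install psutil
    psutil.Process(pid).cpu_affinity(ccd_logical_cpus(ccd))

if __name__ == "__main__":
    # e.g. the cores to pin a game to, assuming the V-cache CCD is CCD0:
    print(ccd_logical_cpus(0))  # logical CPUs 0-15
```

Process Lasso does the same thing with persistence and per-executable rules; this is just the by-hand fallback.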


----------



## J7SC

alasdairvfr said:


> I'm on vacation for a week but will follow your findings upon return. For sure the MLPerf I'm going to dive into headfirst. Didnt have a chance besides the prelim setup. I am lucky that work and hobby overlap a fair bit!
> 
> Would be fascinating to see difference in CPU/GPU combos/arch as well. I bet a TR build with more memory bamdwidth would excel in some areas! Much ML is not CUDA specific, some AMD compute cards are absolute units in the ML space as well


Enjoy your vacation; Canadian winter(s) await upon your return...just hoping you're not vacationing in California right now, what with atmospheric rivers and all.

You're right about the TR and its bandwidth; I don't want to stray too far off the 4090 theme, as much as the 4090 may also one day end up with the old (or new) Threadripper, so in the spoiler, you'll see some bandwidth numbers and a quick peek during leak testing of the 'TR w/ the triples'...once the rebuild is finished (tomorrow ?), I start ML benchmarking on all the setups.


Spoiler






















 
As to 7950X3DV, we obviously have to wait for real-world / third-party testing. I do think AMD will have tested much of this out though in gaming and benching by now.



Blameless said:


> (...) I'm not sure how L1 or L2 prefetch would work differently based on the amount of victim cache on each CCD.(...)


...w/ the 5950X, L1/L2 prefetch settings changed how the cores internally identified as best performers (per microcode) were called into action in games/apps that need fewer cores. It all depends on how they tag the cores with extra cache in the internal performance tables (if indeed there will be cache-amount differences between CCDs).

But as I posted multiple times now, best to wait for real-world third-party testing; AMD engineers have proven to be pretty smart ...if all else fails, there's always the 13900KS (oh, I forgot, P+E cores...)


----------



## Blameless

What sort of power consumption are you guys getting in Cyberpunk 2077 with everything on ultra at 4k, including ray tracing, plus some level of DLSS that still keeps the game GPU limited?

_Edit:_ I'm also using HDR and DLDSR.

I'm running ~2.8GHz effective clock with 1000mV and I'm peaking in the 570w range. This is even more than Path of Exile and a full 100w beyond Port Royal.



J7SC said:


> ...w/ 5950X, L1L2 prefetch settings toggled the way cores internally identified as the best performers per micro-code were called to action in lower-core-count requiring games/apps. It all depends how they tag the cores with extra cache in the internal performance tables (if indeed there will be cache amount differences).


Are you certain about that? That sounds like what CPPC is doing, not the prefetchers, and I don't recall manipulating the prefetchers doing anything to preferred core order.

I had assumed that the prefetchers were simply telling the caches to grab the next adjacent cache line whenever accessing higher level data, as that's all it's supposed to do. Of course, there are plenty of undocumented 'features' in AMD's AGESA, so it may well be doing more...


----------



## bmagnien

I sent a DM to @zhrooms to possibly make a 7XXXx3D Owners Thread to free up this thread. Although I maintain it's still slightly on topic given that the only reason we're all chomping at the bit for faster CPUs is due to the incredible performance of the 4090


----------



## J7SC

bmagnien said:


> I sent a DM to @zhrooms to possibly make a 7XXXx3D Owners Thread to free up this thread. Although I maintain it's still slightly on topic given that the only reason we're all chomping at the bit for faster CPUs is due to the incredible performance of the 4090


'zhrooms' does mostly GPUs afaik, but you can also try DMing 'admin' or 'enterprise'.


----------



## Nd4spdvn

Blameless said:


> What sort of power consumption are you guys getting in Cyberpunk 2077 with everything on ultra at 4k, including ray tracing, plus some level of DLSS that still keeps the game GPU limited?


The most I've seen was 525W, RT psycho, 4K DLSS Quality, HDR and a couple of reshade shaders. This is on a Suprim X on air with Galax bios at 1.1V and 3075/3060 requested clocks (about 3GHz effective) and +972 memory.


----------



## Blameless

Nd4spdvn said:


> The most I've seen was 525W, RT psycho, 4K DLSS Quality, HDR and a couple of reshade shaders. This is on a Suprim X on air with Galax bios at 1.1V and 3075/3060 requested clocks (about 3GHz effective) and +972 memory.


Are you running/reaching a power limit?

Does power consumption go up or down if you drop RT to 'ultra'?


----------



## bmagnien

Blameless said:


> _Edit:_ I'm also using HDR and DLDSR.


I’d imagine it’s DLDSR that is causing increased power usage, similar to if you run furmark at 4K vs 8k.


----------



## yzonker

I can't get over 500w no matter how I configure CP2077 on my machine. Even 1100mv, max settings with DLSS off is only about 500w. This is on the Strix bios.


----------



## J7SC

FYI, Galax vbios on my GPU hits between 530 W and 540 W w/ Cyberpunk '77.


----------



## Blameless

I'm not sure if it's my combination of settings, my saved game (which has ~300 hours on it), a leaky GPU sample, a core VRM being run above its ideal efficiency, or some combination of these factors.



yzonker said:


> I can't get over 500w no matter how I configure CP2077 on my machine. Even 1100mv, max settings with DLSS off is only about 500w. This is on the Strix bios.


DLSS and DLDSR both increase load, as long as the game remains GPU limited, in my experience.



J7SC said:


> FYI, Galax vbios on my GPU hits between 530 W and 540 W w/ Cyberpunk '77.


What settings?

Just being under water could be responsible for much of that. At 570w my hotspot is almost 90C and hotter parts are less efficient parts. Still 570w at 1v feels like a lot for a game.


----------



## inedenimadam

mirkendargen said:


> I'm super wary of this setup with one CCD having more cache and one CCD being clocked higher, with per-game scheduling hacks in the AMD chipset driver to manage where things run. We'll see though.


I have a sneaking suspicion that Process Lasso will get an update pretty quickly after release, if they don't already have a sample in hand. I have been using it since the launch of Intel's big.LITTLE. Great for managing core affinity and priority. I too have my doubts as to how MS will handle the asymmetric chips, but I will manage it by hand if need be.


----------



## J7SC

inedenimadam said:


> I have a sneaking suspicion that process lasso will get an update pretty quick after release, if they don't already have a sample in hand. I have been using it since the launch of intels big.little. Great for managing core affinity, and priority. I too have my doubts as to how MS will handle the asymetic chips, but i will manage it by hand if need be.


...process lasso is great, have been using it on a Threadripper for years for certain games and apps. Regarding 7950X3d differential chiplets, oc'ing etc, "FYI" > here

---

...have been looking at various reviews of the 4080 and the '4080 mini' 4070 Ti; turns out the 4090 was a smart buying decision on top of everything else, especially at the original price!


----------



## Blameless

Alright I think I figured out what was going on with my Cyberpunk readings...










I'm not entirely sure what the first NVVDD Output Power reading is, but it's either an incorrect reading or a transient spike. Actual "GPU Power" peaked at ~420w, which seems more in line with my settings.


----------



## andreiga76

On A Plague Tale: Requiem, with DLAA enabled at 4K, power usage is 540-545W at 1.09/1.095v, using the Gigabyte Master bios.


----------



## Xdrqgol

inedenimadam said:


> I have a sneaking suspicion that process lasso will get an update pretty quick after release, if they don't already have a sample in hand. I have been using it since the launch of intels big.little. Great for managing core affinity, and priority. I too have my doubts as to how MS will handle the asymetic chips, but i will manage it by hand if need be.


More info here on how the 7950X3D will handle workloads. This needs to work once it is released.
*Hence the reason we do not have an exact release date


----------



## Helmbo

Anyone used LM and got good results? - watercooling, that is


----------



## Shoggoth

Helmbo said:


> Anyone used LM and got good results? - waterccoling that is


I used liquid metal (Conductonaut) when I mounted my Alphacool water block. I don't really know how good the results are in relation to anything or anyone, but a 62C hotspot after 1.5h+ of looped Superposition seems good to me.


----------



## 7empe

Carillo said:


> Anyone want to share the secret sauce to get 14800-15000 cpu score in Time Spy Extreme with 13900K ? So frustrating only achieving 13800 point with higher core clocks 🤓


Below your requirements, but thought you could be interested.


----------



## Helmbo

Shoggoth said:


> I used liquid metal (Conductonaut) when I mounted my Alphacool water block. I don't really know how good the results are in relation to anything or anyone, but a 62C hotspot after 1.5h+ of looped Superposition seems good to me.


I'm picking up the exact same block tonight  - did you take temps before and after?


----------



## Carillo

7empe said:


> Below your requirements, but thought you could be intrested.
> 
> View attachment 2592506


Thanks. Picked up a 4090 Phantom on sale. Not bad at all ☺


----------



## ttnuagmada

Helmbo said:


> Anyone used LM and got good results? - waterccoling that is


Used it on several cards over a period of years. Works great. Just be sure to either coat the surrounding caps on the GPU PCB in clear nail polish, or cover them with slivers of electrical tape (have done both on 3 cards each and have had 0 problems). I used Conductonaut on all of them. Never saw any degradation in temps on my 1080 Tis or anything like that over a 3-year period. They also cleaned up fine when I put the air coolers back on after I upgraded to a 3090.


----------



## Helmbo

ttnuagmada said:


> Used it on several cards over a period of years. Works great. Just be sure to either coat the surrounding caps on the GPU PCB in clear nail polish, or cover them with slivers of electrical tape (have done both on 3 cards each and have had 0 problems). I used Conductonaut on all of them. Never saw any degredation in temps on my 1080 Ti's or anything like that over a 3 year period. They also cleaned up fine when I put the air coolers back on after I upgraded to a 3090.


LM and nail polish it is, then


----------



## Panchovix

Blameless said:


> Alright I think I figured out what was going on with my Cyberpunk readings...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not entirely sure what the first NVVDD Output Power reading is, but it's either an incorrect reading or a transient spike. Actual "GPU Power" peaked at ~420w, which seems more in line with my settings.


Which AIB card do you have, and which VBIOS? First time I've seen 4,000 RPM on a GPU fan


----------



## Shraf2k



Panchovix said:


> Which card AIB do you have, or VBIOS? First time I see 4000RPM on a GPU fan


Probably a glitch like my million rpm reading (ignore the highlight, that was for something else)


----------



## BRE1979

I've had my 4090 for a couple of months now. I wasn't sure how I'd like it compared to the 3090 Ti I had, but it's a solid 4K card and temps are way cooler than I thought they'd be. My old 3090 was a heater at times, which prompted me to get a 3090 Ti; that was better temp-wise, but the 4090 FE I have now is very nice on temps with solid 4K gameplay. Looking forward to getting a waterblock on it sometime this year as I'm doing a custom loop build.


----------



## Van Doorn

sanchitdang said:


> Hi, Does anyone has the bios from the Gigabyte 4090 Xtreme Waterforce rev 1.1?


I just submitted my VBIOS using GPU-Z: VGA Bios Collection: Gigabyte RTX 4090 24 GB | TechPowerUp
Card was bought Dec 27 and sticker on side of box said Rev1.1.


----------



## MadPharma

Just received my Alphacool waterblock for my Gigabyte Gaming OC. Has anyone been able to remove the Gigabyte warranty sticker on the chip screw without destroying it? If so, how?

Thx!


----------



## Blameless

Panchovix said:


> Which card AIB do you have, or VBIOS? First time I see 4000RPM on a GPU fan


Gigabyte Windforce with the GALAX BIOS. The fan speed is a reporting error; I get occasional spikes to nearly double actual RPM.
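Since a few of us are logging these sensors anyway, here's a quick sketch of how you could scrub those phantom RPM spikes out of a sensor log before graphing it. Everything here (the function name, the 1.5x spike threshold, the sample values) is made up for illustration, not from any particular monitoring tool:

```python
# Hypothetical sketch: drop spurious fan-RPM spikes from a monitoring log.
# A reading well above its neighbors (like the near-double glitch described
# above) is replaced by the median of a small surrounding window.

from statistics import median

def clean_rpm(samples, window=5, spike_ratio=1.5):
    """Replace readings that exceed spike_ratio x the local median."""
    cleaned = []
    for i, rpm in enumerate(samples):
        lo = max(0, i - window // 2)
        hi = min(len(samples), i + window // 2 + 1)
        local = median(samples[lo:hi])
        cleaned.append(local if rpm > spike_ratio * local else rpm)
    return cleaned

log = [2100, 2120, 4180, 2110, 2095]   # one doubled glitch reading
print(clean_rpm(log))                  # the 4180 spike gets replaced
```

Crude, but it keeps a single glitched sample from wrecking a fan-curve graph.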


----------



## Sheyster

Blameless said:


> Gigabyte Windforce with the GALAX BIOS. The fan speed is a reporting error; I get occasional spikes to nearly double actual RPM.


I've gotten this glitch on one fan as well, GB-G-OC with the Galax BIOS. I also encountered it with the ASUS Strix BIOS.


----------



## J7SC

Shraf2k said:


> 
> Probably a glitch like my million rpm reading (ignore the highlight, that was for something else)
> View attachment 2592532


...My CPU_OPT fan 'only' turns at 45,344 RPM...then again, my super-duper water-cooling system allegedly flows over 1,000 l/min ...


----------



## Blameless

MadPharma said:


> Juste received my Alphacool waterblock for my Gigabyte Gaming OC. Anyone was able to remove the Gigabyte warranty sticker on the chip screw without destroying it? If so, how?
> 
> Thx!


My Windforce's sticker wasn't scored to tear upon removal, so all I had to do was lift an edge with a dental probe and pull it off. From what I can see in teardown pictures, the sticker is similar on the Gaming OC.

For other parts from brands known to be sticklers about those silly anti-tamper stickers, sometimes I'll use a pair of thin pliers with grip tape on the tips to pull the whole screw and replace it with one of my own.


----------



## MadPharma

Ok, ty Blameless.


----------



## Shoggoth

Helmbo said:


> Im picking up the exact same block tonight  - did you do temps, before and after?



Had I had room for the card in its original form I would have done so, but no.


----------



## J7SC

Helmbo said:


> Im picking up the exact same block tonight  - did you do temps, before and after?


...this may provide some reference points though this is for a Giga-G-OC with a Bykski block vs stock air


----------



## d3v0

Flashed my Gigabyte Windforce 4090 (480W) to the Gigabyte Gaming OC 4090 bios (600W) and ran a bunch of benchmarks/overclocking - AMA


----------



## Helmbo

J7SC said:


> ...this may provide some reference points though this is for a Giga-G-OC with a Bykski block vs stock air
> View attachment 2592600


Is this with liquid metal?


----------



## pantsoftime

Has anyone got an Alphacool block going with a Gigabyte Gaming OC or Aorus Master? I can't get a good mount on mine. I've remounted twice and I'm getting bad hot spots (~95C). When I remounted the first time I noticed very poor contact. My current thinking is that one of the gap pads is a bit too thick and causing the GPU contact to suffer, but I'm not sure which one(s). I haven't been able to see any problems visually. I tried a couple of different torque patterns for the spring-loaded screws but no luck yet.

Edit: Third time was a charm. I massaged the thermal pads to squeeze them as small as I could get them on the side with poor contact. I also replaced a couple with slightly thinner pads. Temps are phenomenal now. Hot spot is less than 10 degrees from avg GPU temp at peak loads.
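For anyone else chasing mount issues, the edge-vs-hotspot delta check above can be sketched in a few lines. The thresholds are just the rules of thumb people throw around for water-cooled cards, not official Alphacool or NVIDIA numbers:

```python
# Rough sketch of a mount-quality check: compare GPU edge temp vs hot spot
# over a logged run. Thresholds are rules of thumb only: a delta under
# ~12 C on water usually means a good mount, while ~20 C+ suggests poor
# contact or pad-thickness problems like the ones described above.

def mount_quality(samples, good=12.0, bad=20.0):
    """samples: list of (gpu_temp_c, hotspot_c) pairs from a log."""
    worst_delta = max(hot - gpu for gpu, hot in samples)
    if worst_delta <= good:
        return "good", worst_delta
    if worst_delta >= bad:
        return "remount", worst_delta
    return "borderline", worst_delta

print(mount_quality([(45.0, 52.5), (48.0, 57.0), (50.0, 58.5)]))
```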


----------



## kx11

How do I flash the BIOS? There's no guide for this on YT at all


----------



## J7SC

Helmbo said:


> Is this with Liquid metal ?


Nope, Gelid GC Extreme


----------



## STR3T

kx11 said:


> How do i flash the bios?? no guide to do this on YT at all


BIOS Flash Utilities
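For reference, the usual NVFlash routine people in this thread follow looks roughly like this. Commands are recalled from memory and the ROM filename is just a placeholder, so check the TechPowerUp NVFlash page for your exact version, always save a backup first, and flash at your own risk:

```shell
# Hedged sketch of the typical NVFlash workflow (Windows admin prompt).
# Verify flags against the TechPowerUp NVFlash docs before running anything.

nvflash64 --list                 # confirm the card is detected
nvflash64 --save original.rom    # ALWAYS back up the current VBIOS first
nvflash64 -6 target_bios.rom     # flash; -6 overrides the board-ID mismatch
# reboot, then sanity-check the power limit and clocks in GPU-Z
```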


----------



## kx11

Flashed my Xtreme Waterforce Rev 1.0 with the Rev 1.1 BIOS; the power target now maxes out at 133% (600 W TDP)


----------



## sanchitdang

Van Doorn said:


> I just submitted my VBIOS using GPU-Z: VGA Bios Collection: Gigabyte RTX 4090 24 GB | TechPowerUp
> Card was bought Dec 27 and sticker on side of box said Rev1.1.


Thanks a lot, that's what I was looking for


----------



## pantsoftime

I see a few of you are running the 600W waterforce BIOS - please report back if you see better perf than the Aorus Master. There was a measurable performance bump from Gaming OC -> Aorus Master, so I'd be curious if you see further improvements on the waterforce.


----------



## Nizzen

pantsoftime said:


> I see a few of you are running the 600W waterforce BIOS - please report back if you see better perf than the Aorus Master. There was a measurable performance bump from Gaming OC -> Aorus Master, so I'd be curious if you see further improvements on the waterforce.


It takes 1 minute to flash the BIOS AND restart, so there's no reason not to test it yourself 
Report back with your findings and make the forums better


----------



## pantsoftime

Nizzen said:


> It takes 1 minute to flash bios AND restart, so there are no reason to not test yourself
> Repport back with your findings, and make the forums better


Thanks. I do have plans to try it later today

Edit: I tried this BIOS - it's very good. It can hold slightly higher clocks (1-2 speed bins) during benches. I think for normal usage it's comparable to Aorus Master but maybe ever so slightly better. I set several new personal bests in my benching. It's definitely my new default.
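For anyone counting "speed bins": on recent GeForce cards one bin is the commonly cited 15 MHz step on the boost table (that step size is my assumption here, not something measured in this thread), so translating a clock delta into bins is just:

```python
# Sketch: convert an effective-clock difference into NVIDIA "speed bins".
# Assumes the commonly cited 15 MHz step per bin on recent GeForce cards.

BIN_MHZ = 15

def bins_between(clock_a_mhz, clock_b_mhz):
    return round(abs(clock_b_mhz - clock_a_mhz) / BIN_MHZ)

# e.g. holding 2880 MHz instead of 2850 MHz is a 2-bin improvement
print(bins_between(2850, 2880))  # -> 2
```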


----------



## kx11

Van Doorn said:


> I just submitted my VBIOS using GPU-Z: VGA Bios Collection: Gigabyte RTX 4090 24 GB | TechPowerUp
> Card was bought Dec 27 and sticker on side of box said Rev1.1.


Using your Vbios now on my Rev 1.0

so far so good


----------



## Pepillo

I have verified that if the hot spot exceeds 88°C, thermal throttling does not kick in; it seems to trigger only when the GPU core temperature exceeds that threshold. What is the maximum hot-spot temperature on these 4090s?

Thanks


----------



## ALSTER868

pantsoftime said:


> There was a measurable performance bump from Gaming OC -> Aorus Master


Do you happen to have a link to Aorus Master bios?


----------



## pantsoftime

ALSTER868 said:


> Do you happen to have a link to Aorus Master bios?


This is the one I was using: Gigabyte RTX 4090 VBIOS


----------



## Sheyster

pantsoftime said:


> Thanks. I do have plans to try it later today
> 
> Edit: I tried this BIOS - it's very good. It can hold slightly higher clocks (1-2 speed bins) during benches. I think for normal usage it's comparable to Aorus Master but maybe ever so slightly better. I set several new personal bests in my benching. It's definitely my new default.


Have you tried the GALAX HOF BIOS? Many of us are using it even for daily gaming.


----------



## Zero989

For Canadians: https://www.bestbuy.ca/en-ca/produc...1-hdmi-2-1-x-1-display-port-1-4a-x-3/16551256


----------



## PLATOON TEKK

Things are about to get interesting.

also sorry for the **** watermark but last gen both sellers and blogs used all my pics.
Wonder how this will hold up against the current suprim/phantek. Also need a block


----------



## ALSTER868

Well, not bad for the obsolete Z590 and G-OC with the Galax BIOS, I think:

I scored 11 119 in Speed Way (Intel Core i9-11900K, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11) - www.3dmark.com

I scored 28 546 in Port Royal (Intel Core i9-11900K, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11) - www.3dmark.com


----------



## J7SC

PLATOON TEKK said:


> View attachment 2592791
> 
> Things are about to get interesting.
> 
> also sorry for the **** watermark but last gen both sellers and blogs used all my pics.
> Wonder how this will hold up against the current suprim/phantek. Also need a block
> 
> View attachment 2592800


...drool ! 😋

---

FYI, I normally use OCCT but also tried out Furmark on both the 4090 and the 3090 Strix last night. Specifically, I am using the Asus ROG version of Furmark w/Vulkan setting at 4K. Not sure if this has been mentioned here yet, but that ROG version has a built-in artifact checker, and too-high VRAM clocks get picked up by it.


----------



## tubs2x4

Zero989 said:


> For Canadians: https://www.bestbuy.ca/en-ca/produc...1-hdmi-2-1-x-1-display-port-1-4a-x-3/16551256


That's the model I'd like to get because of its small size, but they jacked the price up $100, same with Memory Express. It was $2,449 before. I guess if you're already at $2,300 for a Trio X, what's another $250... haha, crazy prices.


----------



## Nizzen

PLATOON TEKK said:


> View attachment 2592791
> 
> Things are about to get interesting.
> 
> also sorry for the **** watermark but last gen both sellers and blogs used all my pics.
> Wonder how this will hold up against the current suprim/phantek. Also need a block
> 
> View attachment 2592800


Soooo sexy


----------



## neteng101

J7SC said:


> FYI, Galax vbios on my GPU hits between 530 W and 540 W w/ Cyberpunk '77.


There's no guarantee the Galax bios will report correctly in monitoring software. Maybe if someone tried it on another Galax 4090 we could see very different results.

@yzonker knows all too well from previous testing - don't trust software, especially with a vbios from another manufacturer.


----------



## d3v0

any benefit to bios swapping instead of just overclocking and leaving it? I flashed back from the gaming OC bios to my windforce bios because it was same OC performance anyway.


----------



## Zero989

tubs2x4 said:


> That’s the model I would like to get cause of its small size but they jacked the price up $100 more same with memory express. It was $2449 before. Guess if your at 2300 already for a trio x what’s another 250$ haha crazy prices.


MSI had to do this, as Best Buy sells their non-Liquid Suprim @ $2,450


----------



## J7SC

neteng101 said:


> There's no guarantee the Galax bios will report correctly in monitoring software. Maybe if someone tried it on another Galax 4090 we could see very different results.
> 
> @yzonker knows all too well from previous testing - don't trust software, especially with a vbios from another manufacturer.


...running it with both stock and Galax vbios, and using more than one piece of monitoring software each time...the _comparative_ numbers seem quite consistent. But I do have a kill-a-watt, just using it elsewhere to finish a work-build with 2x 1300 W HPC PSUs


----------



## SilenMar

pantsoftime said:


> I see a few of you are running the 600W waterforce BIOS - please report back if you see better perf than the Aorus Master. There was a measurable performance bump from Gaming OC -> Aorus Master, so I'd be curious if you see further improvements on the waterforce.


Funny, Gigabyte already stated you need a hardware component swap to properly use the Rev 1.1 BIOS: either reworking the half-covered VRM or replacing the fan controller.

Applying the Waterforce Rev 1.1 BIOS alone won't fix anything regarding fan spikes when the card is loading 500 W+ on average.

And people are hallucinating enough to hail "great" temps of 45°C at only a 200 W load on a water-cooled GPU?


----------



## Zero989

pantsoftime said:


> I see a few of you are running the 600W waterforce BIOS - please report back if you see better perf than the Aorus Master. There was a measurable performance bump from Gaming OC -> Aorus Master, so I'd be curious if you see further improvements on the waterforce.


How much of a difference? I just tested it and there's 0 diff in Speedway.

Just tested them...

Gigabyte OC and Aorus Master perform the same.

Galax = 100 more points.


----------



## yzonker

J7SC said:


> ...running it with both stock and Galax vbios, and using more than one piece of monitoring software each time...the _comparative_ numbers seem quite consistent. But I do have a kill-a-watt, just using it elsewhere to finish a work-build with 2x 1300 W HPC PSUs


Since you've been swapping back and forth between the stock and Galax BIOS:

How much of an increase do you see in effective clocks by going from 1050mv to 1100mv in CP2077?

I just discovered something odd with my card. On either the TUF or Strix BIOS, I see almost no increase in effective clock between 1050 and 1100mv. But of course on the Galax BIOS I see an increase, as would be expected.

I was expecting to see a difference, but not for the Asus BIOS to show virtually no increase.

This doesn't hold true in Port Royal though; it seems to be just some games. It does it in both CP2077 and GotG.


----------



## J7SC

yzonker said:


> Since you've been swapping back and forth between stock and Galax bios.
> 
> How much of in increase do you see in effective clocks by going from 1050mv to 1100mv in CP2077?
> 
> I just discovered something odd with my card. On either the TUF or Strix bios, I see almost no increase in effective clock between 1050 and 1100mv. But of course on the Galax bios I see an increase as would be expected.
> 
> I was expecting to see a difference but not to have the Asus bios show virtually no increase.
> 
> This doesn't hold true in Port Royal though. Seems to just be some games. Does it in both CP2077 and GotG.


...I really don't know as I usually game with 'only' 1.05v / 3045 MHz core / +1500 VRAM / 110%-115% PL with either vbios. From what I recall, the Galax doesn't seem to downclock much at all unless delta temps jump beyond a given threshold.

In other news, here's my latest creation (there's a pun in there, per pics)...TR + 2x2080 Ti + 3090 Strix, dual 1300W PSUs etc...the 3090 was bumped by the 4090 but it found a home in this ML and rendering setup


Spoiler


----------



## Sheyster

J7SC said:


> In other news, here's my latest creation (there's a pun in there, per pics)...TR + 2x2080 Ti + 3090 Strix, dual 1300W PSUs etc...the 3090 was bumped by the 4090 but it found a home in this ML and rendering setup
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2592861


Well done Dr. Frankenstein!


----------



## kx11

J7SC said:


> ...I really don't know as I usually game with 'only' 1.05v / 3045 MHz core / +1500 VRAM / 110%-115% PL with either vbios. From what I recall, the Galax doesn't seem to downclock much at all unless delta temps jump beyond a given threshold.
> 
> In other news, here's my latest creation (there's a pun in there, per pics)...TR + 2x2080 Ti + 3090 Strix, dual 1300W PSUs etc...the 3090 was bumped by the 4090 but it found a home in this ML and rendering setup
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2592861



A bit of wild branding you've got going: Aorus, MSI, and ASUS in the same build


----------



## J7SC

kx11 said:


> A bit of wild branding you got going, Aorus, MSI and ASUS in the same build


...yes, I guess I'm a lousy fan-boy 🥴


----------






## yzonker

J7SC said:


> ...I really don't know as I usually game with 'only' 1.05v / 3045 MHz core / +1500 VRAM / 110%-115% PL with either vbios. From what I recall, the Galax doesn't seem to downclock much at all unless delta temps jump beyond a given threshold.
> 
> In other news, here's my latest creation (there's a pun in there, per pics)...TR + 2x2080 Ti + 3090 Strix, dual 1300W PSUs etc...the 3090 was bumped by the 4090 but it found a home in this ML and rendering setup
> 
> 
> Spoiler
> 
> 
> 
> 
> View attachment 2592861


It's interesting, though, in that this may be why people have found varying results from increasing core clock in games.

In doing more testing: in games that load the GPU less heavily (GTA 5, for one), effective clocks are fairly close to requested (within 30 MHz or so). But hammer it with a modern game like GotG on max settings at 4K and the effective clock drops dramatically; it can be almost 200 MHz lower than requested, with no gain at all compared to 1050mv. Basically, requested goes up but effective stays the same.

I can't find it right now, but I recall someone (probably Falkentyne) talking about how a 3090 will drop effective clocks when locked at the full 1100mv, same as I'm seeing here. I think it possibly had to do with MSVDD being too low.

And that may be what the Galax bios has fixed because it exhibits none of these odd behaviors.

BTW, I'm doing all blue on RGB now too. My Bykski block is stuck on blue right now though as I can't find an adapter that will work correctly with my mobo. I bought this adapter, but only get faint red and no control.









Bykski 5v Addressable RGB (RBW) Adapter Cable (www.bykski.us)





How did you connect yours? 

Sorry, I generally avoid RGB, so probably noob question. But I'm stuck with it on the mobo and ram, so decided to connect my blocks/res.


----------



## mirkendargen

yzonker said:


> BTW, I'm doing all blue on RGB now too. My Bykski block is stuck on blue right now though as I can't find an adapter that will work correctly with my mobo. I bought this adapter, but only get faint red and no control.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bykski 5v Addressable RGB (RBW) Adapter Cable (www.bykski.us)
> 
> 
> 
> 
> 
> How did you connect yours?
> 
> Sorry, I generally avoid RGB, so probably noob question. But I'm stuck with it on the mobo and ram, so decided to connect my blocks/res.


That's a 5V addressable RGB connection; make sure the block you got is 5V addressable, not 12V. Given that you're saying it's faint, I'm guessing it's 12V. 5V addressable is 3 pins (+5V, ground, data) and gets a digital signal on the data pin telling it what colors to be, can be daisy-chained to address different devices, etc. 12V is 4-pin and just a dumb analog connection (one common +12V pin plus separate red, green, and blue channels; the channel voltages are modulated to adjust the color).

The confusion comes when Bykski puts that tiny 4-pin connector on the end of the block regardless of which RGB standard it's using, and you have to just guess which adapter to connect to it.
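If anyone's curious what that "digital signal on the data pin" actually carries: 5V ARGB headers typically drive WS2812-style LEDs, where each LED in the chain consumes one 24-bit frame in GRB (not RGB) order and passes the rest along. A toy encoder, purely illustrative:

```python
# Toy encoder for a WS2812-style 5V ARGB data stream: each LED takes one
# 24-bit frame in GRB (not RGB) order; frames for a chain are simply
# concatenated, which is what lets addressable devices be daisy-chained.

def encode_chain(colors_rgb):
    """colors_rgb: list of (r, g, b) tuples, one per LED in the chain."""
    out = bytearray()
    for r, g, b in colors_rgb:
        out += bytes((g, r, b))   # GRB order on the wire
    return bytes(out)

# Two-LED chain: first red, second blue
print(encode_chain([(255, 0, 0), (0, 0, 255)]).hex())  # -> '00ff000000ff'
```

The 12V analog standard has no data stream at all, which is why the two are electrically incompatible even though Bykski's little 4-pin plug looks the same for both.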


----------



## yzonker

mirkendargen said:


> That's a 5V addressable RGB connection, make sure the block you got is 5V addressable not 12V. Given that you're saying it's faint, I'm guessing it's 12V. 5V addressable is 3 pins (+5V, ground, data) and gets a digital signal on the data pin telling it what colors to be, can be daisy chained and address different devices, etc. 12V is 4pin and just a dumb analog connection (+12Vx3 for a red, blue, and green channels, 1 ground, voltage is modified on the RGB channels to adjust the color).
> 
> The confusion comes when Bykski puts that tiny 4pin connector on the end of the block regardless of which RGB standard it's using, and you have to just guess at which adapter to connect to it.


Yeah, the little adapter box it came with says 12V on it. It has a 4-pin cable that comes off of that, but it doesn't work on the 4-pin header on my mobo either.

This just furthers my belief that RGB is total garbage. It's annoying that everything has it now with no option to opt out if you don't want that crap. I guess I have the reddit posers to thank for that.


----------



## mirkendargen

yzonker said:


> Yea, the little adapter box it came with says 12v on it. It has a 4 pin cable that comes off of that, but it doesn't work either on the 4 pin header on my mobo.
> 
> This just furthers my believe that RGB is total garbage. It's annoying that everything has it now with no option to opt out if you don't want that crap. I guess I have the reddit posers to thank for that.


Try one of the cheapo Bykski RGB controllers. They have direct connections to that tiny 4pin connector and you can skip the mobo, then a little remote where you can set stuff to whatever color and leave it there.


----------



## J7SC

mirkendargen said:


> Try one of the cheapo Bykski RGB controllers. They have direct connections to that tiny 4pin connector and you can skip the mobo, then a little remote where you can set stuff to whatever color and leave it there.


...I was quite amazed when I got my Bykski block box with all the extra things in there (for US$122...), including the RGB variants and the little remote control you mentioned. That said, everything worked fine on the Asus mobo w/o the remote on my setup (4090 = bottom right in pics).


----------



## yzonker

mirkendargen said:


> Try one of the cheapo Bykski RGB controllers. They have direct connections to that tiny 4pin connector and you can skip the mobo, then a little remote where you can set stuff to whatever color and leave it there.


Oh, this I guess. I love how it came with a pull-out strip for the battery but no battery.


----------



## yzonker

J7SC said:


> ...I was quite amazed when I got my Bykski block box with all the extra things in there (for US$122...), including the RGB variants and the little remote control you mentioned. That said, everything worked fine on the Asus mobo w/o the remote on my setup (4090 = bottom right in pics).
> View attachment 2593002


But what cable did you use to connect it to the mobo? Did you use the little controller box it came with and then connect to the mobo from there, or directly?


----------



## J7SC

yzonker said:


> But what cable did you use to connect it to the mobo? Did you use the little controller box it came with and then connect to the mobo from there, or directly?


...directly to a cable-tree/splitter that is connected to the Asus mobo


----------



## Blameless

The more I play with it, the more convinced I am that the main change to the Galax ROM, other than the power limit and fan tables, is the load-line and VRM PWM switching frequency. I'd need to scope it to be sure, but it seems slightly less efficient and more prone to high transient spikes on this cost-cut Windforce VRM. A less droopy load line would also explain the higher effective clocks in some loads.

Anyway, tentative final configuration of my current setup:


This ASUS case is the smallest box I was sure I could fit everything into while still being manageable to work in. I'm not fond of the front-panel connectors or the lack of space behind the board, but the GPU fits without major issue.

Will probably alter the cooling at some point and might replace the velociraptors in the front with SSDs (no sound deadening in this case at all and they can get a bit annoying). Definitely going to cut the silly sleeving off the Noctua 4-pin extension cables as they make cable routing more annoying than it has to be...though maybe I'll just dump all the molex connections and use 4-pin splitters straight off the board. Many of the parts I had laying around weren't originally intended for this build...it was sort of an emergency assembly after I was unexpectedly able to snag the 4090.



yzonker said:


> It's annoying that everything has it now with no option to opt out if you don't want that crap.


You can always disconnect it.


----------



## yzonker

Well there's some unfortunate news. 



Spoiler: Reddit link
https://www.reddit.com/r/nvidia/comments/106lgu3


----------



## stahlhart

yzonker said:


> Well there's some unfortunate news.
> 
> 
> 
> Spoiler: Reddit link
> https://www.reddit.com/r/nvidia/comments/106lgu3


I've heard, though I don't know from direct experience, that he could be a difficult individual to deal with anyway. Which I guess I would have been at times too, if I'd been putting that much time into a non-monetized pursuit and felt it wasn't appreciated.

Are there any outstanding issues with AB / RTSS still unresolved...? I've been using 4.6.4 / 7.3.3 for as long as I can remember without any trouble (that I am aware of).


----------



## yzonker

stahlhart said:


> Heard, though I don't know from direct experience, that he could be a difficult individual to have to deal with anyway. Which I guess I would have been at times too, if I had been putting that much time into a non-monetized pursuit and felt that it wasn't appreciated.
> 
> Are there any outstanding issues with AB / RTSS still unresolved...? I've been using 4.6.4 / 7.3.3 for as long as I can remember without any trouble (that I am aware of).


It works fine now, but it will possibly not work correctly with the 50 series. Slightly older versions didn't work with the 4090 when it was released. So, a problem down the road.


----------



## Frosted racquet

He said RTSS will continue development, and there are "implementations" other than MSI Afterburner that could be used in the future


----------



## neteng101

Frosted racquet said:


> He said RTSS will continue development, and there are "implementations" other than MSI Afterburner that could be used in the future


Guess it's time to install Asus GPU Tweak III and see if it's any good. EVGA Precision X1 has likely seen its last update ever, that being support for the unlaunched next-gen EVGA card.


----------



## cptclutch

Just got a Gigabyte Gaming OC today; running Port Royal I've been getting around 27,200 tops, with an average clock at 3 GHz and memory around 1500 MHz. I noticed that most scores around 28k and above list similar speeds. Is there a specific setting or something holding me back? I also noticed everyone seems to be running Windows 11; does that give the extra 1-2k?


----------



## neteng101

cptclutch said:


> Is there a specific setting or something holding me back? I noticed that everyone seems to be running windows 11, does that give that extra 1-2k?


Possibly slower system memory and CPU. PR seems to love CPU cache and faster RAM.


----------



## bmagnien

Hey all - on the advice of @zhrooms and @ENTERPRISE, I went ahead and created the "Official" Zen 4 X3D thread. Hopefully we can get some traction there from the folks interested here in order to bump this page up in the search rankings to become the source of truth and conversation for all things Ryzen 7000 Series w/ 3D V-Cache. 









[Official] Zen 4 X3D Owner's Club (7800x3D /... | www.overclock.net

For all those chiming in on the topic over the last few pages, apologies in advance for the heads-up tag:
@J7SC @yzonker @Brads3cents @Krzych04650 @KingEngineRevUp @Nico67 @GRABibus @Nizzen @Sheyster @coelacanth @Nd4spdvn @mirkendargen @Blameless @inedenimadam @Xdrqgol


----------



## cptclutch

neteng101 said:


> Possibly slower system memory and CPU. PR seems to love CPU cache and faster RAM.


That's probably it, I'm running DDR4 still with my 13700k.


----------



## mirkendargen

cptclutch said:


> Just got a Gigabyte Gaming OC today and running Port Royal I've been getting around 27200 tops with an average clock at 3ghz and memory around 1500mhz. I noticed that most scores that are around 28k and above have similar speeds listed. Is there a specific setting or something holding me back? I noticed that everyone seems to be running windows 11, does that give that extra 1-2k?


Make sure you don't have Gsync or any kind of frame limiter enabled. That score is about what you get with a 120fps limit.


----------



## kryptonfly

neteng101 said:


> Possibly slower system memory and CPU. PR seems to love CPU cache and faster RAM.


No, PR is mainly dependent on the GPU. For some unknown reason I get my best score with my 13900K (same with the 12900K) at 1P HToff-8E at 4.5GHz/3.9GHz with the ring at 4 GHz, with DDR4 at stock 3600C14.
I scored 29 700 in Port Royal

@cptclutch : you have to force ReBAR in Nvidia Profile Inspector for Port Royal.


----------



## J7SC

With AMD at least, system RAM plays a big role in Port Royal, IMO


----------



## Blameless

cptclutch said:


> Just got a Gigabyte Gaming OC today and running Port Royal I've been getting around 27200 tops with an average clock at 3ghz and memory around 1500mhz. I noticed that most scores that are around 28k and above have similar speeds listed. Is there a specific setting or something holding me back? I noticed that everyone seems to be running windows 11, does that give that extra 1-2k?


I'm borderline CPU-limited in the default Port Royal with my 5800X3D and tuned B-die. I still get ~27,200 without ReBAR, but with HAGS enabled, at ~2.85 GHz core, +1594 memory. Windows Server 2022.


----------



## Manya3084

cptclutch said:


> Just got a Gigabyte Gaming OC today and running Port Royal I've been getting around 27200 tops with an average clock at 3ghz and memory around 1500mhz. I noticed that most scores that are around 28k and above have similar speeds listed. Is there a specific setting or something holding me back? I noticed that everyone seems to be running windows 11, does that give that extra 1-2k?


I'm jealous of that memory OC; I can only hit 1295-1320.

This is my highest result using the Gaming OC with the Galax BIOS, and that's only because I spent days tweaking the DDR5 RAM OC.









I scored 29 082 in Port Royal (AMD Ryzen 9 7950X, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 11) - www.3dmark.com


----------



## menko2

I need advice deciding which 4090 to get here in Spain. My options:

MSI suprim liquid 2200€
MSI suprim x  2200€
MSI gaming trio x. 2050€
Gigabyte Gaming OC 1950€
KFA2 GS 1900€
PNY Verto epic-x(non-oc) 1800€

I like what MSI is doing this generation, but let me know your recommendation given these prices. The FE is impossible to get on the Nvidia website at its €1,849 price.

I'll use it for 4K120 gaming, not so much for benchmarks.


----------



## Zero989

menko2 said:


> I need advice de deciding with 4090 to get here in Spain. My options:
> 
> MSI suprim liquid 2200€
> MSI suprim x 2200€
> MSI gaming trio x. 2050€
> Gigabyte Gaming OC 1950€
> KFA2 GS 1900€
> PNY Verto epic-x(non-oc) 1800€
> 
> I like the way MSI is making this generation but let me know what's your recommendation looking at the prices. The FE imposible to get in Nvidia website at 1849€ price.
> 
> I'll use for gaming 4k120 not for much for benchmarks.


Get the cheapest


----------



## hnizdo

menko2 said:


> I need advice de deciding with 4090 to get here in Spain. My options:
> 
> MSI suprim liquid 2200€
> MSI suprim x 2200€
> MSI gaming trio x. 2050€
> Gigabyte Gaming OC 1950€
> KFA2 GS 1900€
> PNY Verto epic-x(non-oc) 1800€
> 
> I like the way MSI is making this generation but let me know what's your recommendation looking at the prices. The FE imposible to get in Nvidia website at 1849€ price.
> 
> I'll use for gaming 4k120 not for much for benchmarks.


Any "liquid" card has the advantage of dumping heat outside your case instead of inside it. I have the MSI Liquid 4090 for that reason. MSI probably has superior electronics; see the first page of this thread.

From an fps point of view, the differences are nil.


----------



## Zero989

hnizdo said:


> Any "liquid" have an advantage to put heat outside your case, not inside it. I have MSI Liquid 4090 because of that. MSI have probably superior electronics - see first page of this thread.
> 
> From fps point of view, there are nil differences.


The MSI Liquid is a roll of the dice on coil whine. Mine from November was bad with it, and some others report it too. Bigger heatsinks mask it by absorbing vibrations. The MSI Liquid isn't ideal except for size/space constraints.


----------



## Avacado

Zero989 said:


> MSI liquid roll of dice on coil whine. Mine from November was bad with it. Some others report it too.


Just imagine coil whine as your GPU saying "Thanks big daddy" over and over again. If it ain't screaming at you, you ain't doin it right.


----------



## menko2

Zero989 said:


> Get the cheapest


Thank you for the advice.

I know gaming performance at 4K will be the same in all of them.
Reliability and longevity are what I'm looking for in the card.

Which brand and model is "supposed" to give the fewest problems?


----------



## Zero989

menko2 said:


> Thank you for the advice.
> 
> I know that performance will be the same for gaming at 4k in all of them.
> Problems and longevity is what I'm looking for in the card.
> 
> Which brand and model is "supposed" to give less problems?


All of them have overbuilt heatsinks, but some have weaker VRMs than the Nvidia FE. For example, I wouldn't flash a GameRock with a Galax bios; the transient spikes might shorten its life.

Best VRAM temps go to the Gigabyte Gaming OC, but those don't matter too much.

The Nvidia reference had problems at launch due to its vbios.

Personally I'd just get the PNY unless the warranty duration is poor. The KFA2 is also a good pick for the Galax bios, with its 22 VRM stages.


----------



## Blameless

Turns out ReBar was good for almost 700 points:









+150MHz @1025mV, +1594 memory.


----------



## Silkcloth0999

Hi. After watching der8auer's video about AMD reference cards overheating, I decided to check my graphics card's temperatures and noticed it reaches up to 100C on the hotspot; the core peaked at 75C. Removing the side panel made no difference, nor did laying the PC on its side so the graphics card sits vertically. No change whatsoever. So I've concluded the thermal paste is not evenly spread on the GPU. I had a similar problem with a 2060S that thermal throttled due to gaps in the thermal paste. Should I be worried, or leave it as is for now?
This kinda sucks, because after watching reviews I expected a very cool and quiet card. In the reviews this card peaked at 75C hotspot, so mine is much higher.
I would repaste the card myself, but there's a warranty sticker and I would like not to void the warranty.
What is the peak temperature before the card starts to thermal throttle? In AMD's case it's 110C; does the same apply to NVIDIA?
Thanks.
After playing Tiny Tina's Wonderlands for some time:












How it should look, according to the review:


----------



## STR3T

4090s should not be hitting 100+C under load. I'd look to Inno3D (or the vendor you bought it from) for a replacement rather than redoing the thermal pads/paste yourself.


----------



## hnizdo

Zero989 said:


> MSI liquid is a roll of dice on coil whine. Mine from November was bad with it. Some others report it too. Bigger heatsinks mask it by absorbing vibrations. MSI liquid not ideal except for size/space.


Absolutely zero whine yet. Bought in the second half of December 2022. But any coil can sing; there's not much you can do about that. What I criticize about the MSI Liquid is the small water cooler and inevitably worse temps: 400W needs a 360mm AIO or bigger. But it has excellent fan control, a CM AIO, and better components and mechanical quality compared to Gigabyte. My opinion.


----------



## Zero989

hnizdo said:


> Absolutelly zero whine yet. Bught 2nd half of 12-2022. But any coil can sing, there is a not much things to do something about that. What i criticize about MSI Liquid is small water cooler, and inevitably worse temps. 400W needs 360AiO+. But it has excellent fan control, AiO from CM, components and mechanic quality in comparison with Gigabyte. My opinion.


My MSI had amazing temps. I run open air though, which is also why I notice coil whine.


----------



## Blameless

Silkcloth0999 said:


> Hi. After watching der8auer's video about AMD reference cards overheating I decided to check my graphics card temperatures and noticed that the graphics card I own reach up to 100C on hotspot, The core peaked at 75C. Removing side panel made no difference as well as laying PC on the side so graphics card is turned vertically. No changes whatsoever. So I come to a conclusion that thermal paste is not evenly spread on the GPU. I have had similar problem with 2060s that thermal throttled due to thermal paste gaps on the GPU. Should o be worried? Or leave it as is for now?
> This kinda sucks because after watching reviews I expected very good, cold and quiet card. In the reviews this graphics card peaked at 75 hotspot so it's much higher in my case.
> I would repaste the card myself but there's a warranty sticker and I would like to not void the warranty.
> What is the peak temperature before card starts to thermal throttle? In AMDs case it's 110, does it apply the same for NVIDIA?
> Thanks.
> After playing Tiny tina's wonderlands for some time:
> 
> 
> 
> 
> 
> 
> 
> 
> Zrzut-ekranu-20230108-214942
> 
> 
> Image Zrzut-ekranu-20230108-214942 hosted in ImgBB
> 
> 
> 
> 
> ibb.co
> 
> 
> 
> 
> How it should be according to review.


A 25C delta is a bit high for only ~430w. Quite possibly a bad cooler mount.

That said, different loads can result in significantly different deltas.
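
As a rough rule of thumb from this exchange (the ~20C figure is a forum anecdote, not an NVIDIA spec), a core-to-hotspot delta well above ~20C at high power is worth investigating. A minimal sketch; the `check_delta` helper name is mine:

```python
def check_delta(core_c: float, hotspot_c: float, threshold_c: float = 20.0) -> str:
    """Flag a suspiciously large core-to-hotspot temperature gap.

    The ~20C threshold is a rough rule of thumb from this thread,
    not a vendor specification; healthy deltas vary by load and card.
    """
    delta = hotspot_c - core_c
    if delta > threshold_c:
        return f"delta {delta:.0f}C: possible bad cooler mount or paste"
    return f"delta {delta:.0f}C: looks normal"

print(check_delta(75, 100))  # the 25C case discussed above
print(check_delta(62, 75))   # closer to review numbers
```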


----------



## Silkcloth0999

Blameless said:


> A 25C delta is a bit high for only ~430w. Quite possibly a bad cooler mount.
> 
> That said, different loads can result in significantly different deltas.


I'll try tightening the screws on the card; maybe they're not tight enough.
One question for the people who put waterblocks on their cards: do you say goodbye to the warranty, or do you somehow remove the warranty sticker unharmed? I live in the EU.


----------



## KedarWolf

Silkcloth0999 said:


> I'll try to tighten the screws on of the card maybe its not tightened enough.
> One question for the people who put waterblock on their cards. Do you say goodbye to warranty or do you somehow remove the warranty sticker uncharmed? I live in EU.


When I bought my card from Memory Express here in Canada, I paid $200 more for the extended warranty when they assured me it covers throwing a water block on my card.


----------



## neteng101

kryptonfly said:


> No, PR is only dependent to GPU. For unknown reason I get my best score with my 13900K (same with 12900K) at 1P HToff-8E at 4.5ghz/3.9ghz and ring at 4 ghz, with DDR4 at stock 3600C14.
> I scored 29 700 in Port Royal
> 
> @cptclutch : you have to force REbar in Nvidia profile inspector for Port Royal.


That's incorrect - PR is dependent on transfers between the CPU and GPU. That's why forcing Resizable BAR affects PR scores; if everything were loaded into VRAM, ReBAR would not have the impact it does.

As you noted, optimizing those transfers by tweaking your CPU setup can influence your PR score. I don't think anyone knows the full secret sauce, but suffice to say far more than the GPU alone affects PR scores. I suspect that with your setup there's more cache for the process to utilize, decreasing the need for fast RAM if it's all CPU cache to GPU.


----------



## StreaMRoLLeR

menko2 said:


> I need advice de deciding with 4090 to get here in Spain. My options:
> 
> MSI suprim liquid 2200€
> MSI suprim x 2200€
> MSI gaming trio x.  2050€
> Gigabyte Gaming OC 1950€
> KFA2 GS 1900€
> PNY Verto epic-x(non-oc) 1800€
> 
> I like the way MSI is making this generation but let me know what's your recommendation looking at the prices. The FE imposible to get in Nvidia website at 1849€ price.
> 
> I'll use for gaming 4k120 not for much for benchmarks.


Do not get the cheapest!

The best-balanced 4090 is the Gaming OC: cheap, good cooling, and the best memory cooling.

The best PCB, cooling, and build quality is the Suprim Liquid.

Ignore the air Suprim X.

You can reduce coil whine: under 360-370W and 1.000mV, it drops to a tolerable level.
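
As a rough first-order check on why a ~1.00V cap cuts power so much: dynamic CMOS power scales roughly with f·V². This is an approximation (it ignores leakage and memory power), not a measured MSI curve, and the example clock/voltage numbers are illustrative:

```python
def scaled_power(p_ref_w: float, f_ref_mhz: float, v_ref: float,
                 f_new_mhz: float, v_new: float) -> float:
    """First-order dynamic-power estimate: P ~ f * V^2.

    Ignores static leakage and VRAM power, so treat the result
    as a ballpark figure only."""
    return p_ref_w * (f_new_mhz / f_ref_mhz) * (v_new / v_ref) ** 2

# E.g. ~450W at 2850MHz/1.07V capped to 2700MHz/1.00V:
est = scaled_power(450, 2850, 1.07, 2700, 1.00)
print(f"{est:.0f} W")  # lands in the ~370W region mentioned above
```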


----------



## StreaMRoLLeR

hnizdo said:


> Absolutelly zero whine yet. Bught 2nd half of 12-2022. But any coil can sing, there is a not much things to do something about that. What i criticize about MSI Liquid is small water cooler, and inevitably worse temps. 400W needs 360AiO+. But it has excellent fan control, AiO from CM, components and mechanic quality in comparison with Gigabyte. My opinion.





> 400W needs 360AiO+


Sure, a 360 AIO helps, but remember the Aorus is a full-cover block while the Suprim covers the VRAM + core. The fans on the Suprim are 2-3x better than the trash Gigabyte fans. Also, the Cooler Master pump on the Suprim is surprisingly good: zero pump gurgle.


----------



## Silkcloth0999

Imagine that, guys... seriously, the screws around the GPU weren't tightened...


----------



## tubs2x4

Zero989 said:


> My MSI had amazing temps. I run open air though, which is also why I notice coil whine.


Do you remember how much you paid for the Liquid X?


----------



## Blameless

Doing some more retesting with ReBar.

Biggest difference so far is in Superposition 4k...a 2% bump to score, but ~35 fps better minimum frame rates:









Same +150MHz @ 1025mV, +1594MHz memory.



neteng101 said:


> If they loaded everything into VRAM, resizable bar would not have the impact it does for PR scores.


ReBAR is less about the PCI-E traffic itself than about the memory management overhead.



Silkcloth0999 said:


> Do you say goodbye to warranty or do you somehow remove the warranty sticker uncharmed? I live in EU.


I can't imagine consumer protection laws in most of the EU actually allowing warranty void stickers to be enforceable.



Silkcloth0999 said:


> Imagine that guys... Zrzut-ekranu-20230109-171800 ... seriously screws around gpu weren't tightened...


Not surprising.

Can't see image, but I assume temperatures are closer to expected levels now?


----------



## Zero989

tubs2x4 said:


> You remember how much you paid for the liquid x?


2449 cdn plus tax


----------



## Silkcloth0999

Blameless said:


> Not surprising.
> 
> Can't see image, but I assume temperatures are closer to expected levels now?

Yeah, much better


http://imgur.com/SgwySZ8


----------



## yzonker

neteng101 said:


> That's incorrect - PR is dependent on transfers between the CPU and GPU. That's why forcing resizable bar has an effect on the PR scores. If they loaded everything into VRAM, resizable bar would not have the impact it does for PR scores.
> 
> As you noted - optimizing those transfers by tweaking your CPU setup can influence your PR scores. I don't think anyone knows the full secret sauce, but suffice to say its far more than GPU only that affects PR scores. I suspect by doing what you're doing, there's more cache for the process to utilize, thus decreasing the need for fast RAM, if its all CPU cache to GPU.


I've tested this before and not seen a large change in PR scores, for example between XMP timings and OC'ed/tuned RAM. 100pts or less.


----------



## Xavier233

Hi guys, I have a Gigabyte 4090 OC, and a quick question.

In some games, I am getting random FPS drops, even when the GPU is at 40-50% usage. That said, in those instances, my CPU threads (and cores) are not maxed out. So I set the specific game to run in performance mode under the Nvidia Control panel (for the specific game in question). When launching the game, I still see the frequency of the GPU at a low frequency. Any ideas why this is happening?

I prefer not to set the global settings on performance mode, as the GPU will always idle at a higher frequency all the time.

Z790 Hero, 6000 DDR5, 12900K CPU


----------



## J7SC

Port Royal on AMD CPU might be a different animal. I have been running PR ever since it came out, first on a 2950X, then on a 3950X and now 5950X - all showed a score impact when a.) maxing single core performance and b.) maxing the memory 'combo' of bandwidth and timings. Case in point is the TR 2950X; it had an unusual feature (unlike many other TRs of the same series) to switch from NUMA to UMA...that added ~ 400+ points or so and got me up to 15th in HoF / overall back then from low 20s, on 2x water-cooled 2080 Tis.


----------



## neteng101

yzonker said:


> I've tested this before and not seen a large change in PR scores, for example between XMP timings and OC'ed/tuned RAM. 100pts or less.


Still a difference, just not large. I suspect cache matters a lot; the 5800X3D seems to do really well in PR too. The lower Intel SKUs just can't score as high as the i9s, it seems, looking at the leaderboard, which would point to something like larger cache being a factor.


----------



## jootn2kx

Xavier233 said:


> Hi guys, I have a Gigabyte 4090 OC, and a quick question.
> 
> In some games, I am getting random FPS drops, even when the GPU is at 40-50% usage. That said, in those instances, my CPU threads (and cores) are not maxed out. So I set the specific game to run in performance mode under the Nvidia Control panel (for the specific game in question). When launching the game, I still see the frequency of the GPU at a low frequency. Any ideas why this is happening?
> 
> I prefer not to set the global settings on performance mode, as the GPU will always idle at a higher frequency all the time.
> 
> Z790 Hero, 6000 DDR5, 12900K CPU


Yes, what you are experiencing is your CPU not being able to feed your GPU enough data, or the game not being optimized for CPU multithreading.
In a lot of games right now you'll be CPU limited with the RTX 4090, even @ 4K resolution.
Even if your CPU shows only 20% usage, it can still be bottlenecking your GPU.

I have a 5800X3D and see some severe CPU bottlenecks too, where GPU usage drops to 50-60% in The Witcher 3 + RT (only in some areas) and Callisto Protocol, for example.
Nothing you can do about it except hope they update the game with DLSS 3 (frame generation) or wait for next-generation CPUs that can handle the RTX 4090.
I don't think future games will be better optimized for PC, and at the same time ray tracing in modern games eats up a lot of resources.


----------



## Xavier233

jootn2kx said:


> Yes what you are experiencing is your CPU that can't process enough data coming from your GPU.
> In a lot of games right now you'll be CPU limited with this graphics card even @ 4K resolution.
> Even if your CPU doesn't show maxed out it still bottlenecking your GPU most likely.
> 
> I'm having a 5800X3D and have some severe cpu bottlenecks too where gpu usage is going to 50/60% in the witcher 3 + RT and Callisto protcol for example.
> Nothing you can do about it except of hoping they will update it with DLSS3 (frame generation) or wait for the next generation CPU's that can handle the RTX 4090.


That's what I initially thought as well. But looking at the CPU threads individually while gaming, none of them is really at 100%. I'm playing at 1440p and have capped all games to 160 FPS.


----------



## yzonker

neteng101 said:


> Still a difference, just not large. I suspect cache matters a lot - 5800X3D seems to do really well in PR too. The lower Intel SKUs just can't score as high as the i9s seems like looking at the leaderboard, which would point to something like larger cache being a factor.


That's definitely not correct. I have both a 5800X and a 5800X3D; the 5800X always scored higher (50-100 pts). Others like Clavenger (sp?) saw the same thing back when they were released. The X3D just doesn't have the raw clock speed, has slightly higher latency, and the cache does nothing here. I saw a drop in most GPU-bound benchmarks; only in high-framerate benches (200+ fps) did it break even or score a little higher.


----------



## kryptonfly

neteng101 said:


> That's incorrect - PR is dependent on transfers between the CPU and GPU. That's why forcing resizable bar has an effect on the PR scores. If they loaded everything into VRAM, resizable bar would not have the impact it does for PR scores.
> 
> As you noted - optimizing those transfers by tweaking your CPU setup can influence your PR scores. I don't think anyone knows the full secret sauce, but suffice to say its far more than GPU only that affects PR scores. I suspect by doing what you're doing, there's more cache for the process to utilize, thus decreasing the need for fast RAM, if its all CPU cache to GPU.


I honestly don't know; in my case there's not much difference. Here's the OC at 8P (6GHz) / 8E (4.7GHz), ring at 5200MHz, DDR4 at 4300C15, compared with 1P-8E (4.5/3.9GHz), ring at 4000MHz, DDR4 at stock 3600C14: Result

Actually there's no difference; VRAM was a bit higher, +1740 instead of +1700, when OCed... I agree with @yzonker


----------



## yzonker

kryptonfly said:


> I don't know honestly, in my case there's not much difference, here's with OC at 8P (6 Ghz) 8E (4.7 Ghz), ring 5200mhz with DDR4 at 4300C15 compared with 1P-8E (4.5-3.9 Ghz), ring 4000mhz, DDR4 at 3600C14 stock : Result
> 
> Actually there's no difference, vram was a bit higher +1740 instead of +1700 when OC... I agree with @yzonker


Yup, that's what I see in those runs too (VRAM is a bit higher). Although I haven't tested that interesting combination you came up with (1P/8E), I tried several other permutations and kept ending up back at just stock config on the CPU to run PR. Never found anything that scored higher (13900k).

Even on the Ryzen side, I went from some crap C-die to tuned B-die with my 5800x and saw no (significant) increase in score in PR with my 3090. I could see that changing a little bit with the 4090 possibly, but it's still not going to be big IMO.

I also, in my view anyway, debunked the internet wisdom that Intel platforms score higher than AMD in PR when I originally built the 12900k system. Very very close to the same scores between the 5800x and 12900k, both with tuned RAM. In fact, PR, TS, and TSE were all essentially the same. I did set my highest PR/TS scores with the 12900k, but only by like 20 pts. 

Although with the 4090, I'm fairly confident the 13900k will score a bit better (graphics) in TS since fps is so high now (vs Zen 3). But given how well @J7SC has done in some of these, I suspect the difference is still fairly small. Probably have to step up to 1080p FS, or Superposition 4k to really see significant differences.

Oh, and BTW, I also tested the newest Nvidia driver last weekend and saw no improvement in PR.


----------



## Xavier233

Is it "normal" that when I set a specific game to use "High performance" in the Nvidia control panel, the GPU frequency is not at max while gaming?


----------



## yzonker

Xavier233 said:


> Is it "normal" that when I set a specific game to use "High performance" in the nvidia control panel, when gaming the GPU frequency is not a max?


It should be when there is a high load on the GPU. Menus, frame limited, or CPU bound situations may cause it to downclock though.


----------



## Xavier233

yzonker said:


> It should be when there is a high load on the GPU. Menus, frame limited, or CPU bound situations may cause it to downclock though.


I thought it would keep the GPU clock at its highest regardless of load?

Like when you set the global setting to High performance and the GPU runs at full frequency even while sitting idle on the desktop?


----------



## ALSTER868

Is anybody experiencing the same thing in Warzone 2 and MW2? At max settings at 3840x1600, the GPU power draw is only 220-270W while maintaining target clocks. The framerate is 170-200 fps meanwhile, and the CPU is loaded around 30-60%. CPU bottlenecked? Apex Legends, which is not too demanding on the GPU, meanwhile pulls up to 400W at the same max settings.


----------



## Xavier233

ALSTER868 said:


> Is anybody expiriencing the same in Warzone 2 and MW2? When at max.settings @3840*1600 the GPU power draw is only 220-270W while maintaining target clocks? The framerate is 170-200 fps in the meanwhile, the CPU is loaded like 30-60%. CPU bottlenecked? Apex Legends, which is not demanding too much to the GPU, in the same time keeps pulling up to 400W at the same max settigs.


While gaming, open up Task Manager and check the per-thread CPU usage. That will give you a better idea of what is happening. If one thread is maxed out but the others are at 50% or 80%, it means the game is simply not using CPU threads properly. It's not the fault of the CPU per se, but getting a CPU with the highest per-core clocks and performance might actually help in this case.
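
The "one thread pinned, the rest idle" pattern can be checked mechanically. A hypothetical helper (pure Python; the utilization numbers would come from Task Manager or a sampler of your choice, and the thresholds are illustrative guesses, not established cutoffs):

```python
def main_thread_bound(per_thread_pct: list[float],
                      hot: float = 95.0, cool: float = 60.0) -> bool:
    """Heuristic: one thread near 100% while the average stays low
    suggests the game is limited by a single (main/render) thread."""
    if not per_thread_pct:
        return False
    avg = sum(per_thread_pct) / len(per_thread_pct)
    return max(per_thread_pct) >= hot and avg <= cool

# The pattern from the screenshots discussed here: core 0 pinned, rest moderate.
print(main_thread_bound([98, 40, 35, 30, 45, 25, 20, 30]))  # True
print(main_thread_bound([70, 65, 72, 68, 66, 71, 69, 67]))  # False
```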


----------



## neteng101

yzonker said:


> I also, in my view anyway, debunked the internet wisdom that Intel platforms score higher than AMD in PR when I originally built the 12900k system. Very very close to the same scores between the 5800x and 12900k, both with tuned RAM. In fact, PR, TS, and TSE were all essentially the same. I did set my highest PR/TS scores with the 12900k, but only by like 20 pts.


Guess it's time I gave the Galax bios a try then, force Resizable BAR, and depending on what I get for a GPU OC, I should be close to the scores others here are getting. I've always felt that over the generations something was holding back my PR scores... I'm always further behind in PR than in TS/TSE graphics score with the same GPU.


----------



## zzztopzzz

menko2 said:


> I need advice de deciding with 4090 to get here in Spain. My options:
> 
> MSI suprim liquid 2200€
> MSI suprim x 2200€
> MSI gaming trio x. 2050€
> Gigabyte Gaming OC 1950€
> KFA2 GS 1900€
> PNY Verto epic-x(non-oc) 1800€
> 
> I like the way MSI is making this generation but let me know what's your recommendation looking at the prices. The FE imposible to get in Nvidia website at 1849€ price.
> 
> I'll use for gaming 4k120 not for much for benchmarks.


I have the MSI 4090 Suprim Liquid X and heat is not a problem. The unit costs a little more, but when you figure in the cost of a custom cooling system, you'll come out ahead.


----------



## ALSTER868

Xavier233 said:


> ts not the fault of the CPU per-say, but getting a CPU with the highest clock and performance per core might actually help in this case.


I couldn't imagine that a game like Warzone 2, which is quite demanding on the GPU, would behave like this in 4K and be bottlenecked so badly by a not-too-old 11900K with tweaked memory. No other game I've tested so far shows this behavior.


----------



## Xavier233

ALSTER868 said:


> I couldn't imagine that a game like Warzone 2, quite demanding on the GPU would behave like this in 4K and be bottlenecked so badly by a not too old 11900K with tweaked memory. No other games I tested so far showed this picture.


Did you confirm it, though? I was explaining the method, not results specific to a certain game. Do you have a screenshot of CPU/thread usage while gaming?


----------



## Xavier233

ALSTER868 said:


> I couldn't imagine that a game like Warzone 2, quite demanding on the GPU would behave like this in 4K and be bottlenecked so badly by a not too old 11900K with tweaked memory. No other games I tested so far showed this picture.


My 12900K can be a bottleneck in some games with the 4090. You would have to get a 13900K with a good core OC. It's insane, I know.


----------



## menko2

StreaMRoLLeR said:


> Do not get the cheapest !
> 
> Best balanced 4090 is Gaming OC = Cheap, Good cooling and best memory cooling
> 
> Best PCB and Cooling and Quality is = Suprim Liquid
> 
> Ignore air suprim X
> 
> You can reduce CW. Under 370-360w and 1.000mV, CW is reduced to tolerable level.


I went down the list and excluded three of the options (PNY, MSI Trio X and Gigabyte Gaming OC). The Gigabyte looks great overall, like you mention, but I've had problems with their products in the past.

My options:
MSI suprim liquid 2200€
MSI suprim x 2200€
KFA2 GS 1900€

All three have good reviews. I would go for the Liquid, but I don't know how long the AIO will last. I like how the Suprim X (air) is built, and its reviews are great. The KFA2/GALAX is a golden sample, but there are not many reviews and the build quality doesn't look as good as MSI's.

Your thoughts? Important decision...


----------



## kryptonfly

yzonker said:


> Yup, that's what I see in those runs too (VRAM is a bit higher). Although I haven't tested that interesting combination you came up with (1P/8E), I tried several other permutations and kept ending up back at just stock config on the CPU to run PR. Never found anything that scored higher (13900k).
> 
> Even on the Ryzen side, I went from some crap C-die to tuned B-die with my 5800x and saw no (significant) increase in score in PR with my 3090. I could see that changing a little bit with the 4090 possibly, but it's still not going to be big IMO.
> 
> I also, in my view anyway, debunked the internet wisdom that Intel platforms score higher than AMD in PR when I originally built the 12900k system. Very very close to the same scores between the 5800x and 12900k, both with tuned RAM. In fact, PR, TS, and TSE were all essentially the same. I did set my highest PR/TS scores with the 12900k, but only by like 20 pts.
> 
> Although with the 4090, I'm fairly confident the 13900k will score a bit better (graphics) in TS since fps is so high now (vs Zen 3). But given how well @J7SC has done in some of these, I suspect the difference is still fairly small. Probably have to step up to 1080p FS, or Superposition 4k to really see significant differences.
> 
> Oh, and BTW, I also tested the newest Nvidia driver last weekend and saw no improvement in PR.


Indeed, in TS I could improve my graphics score a bit by disabling HT and the 8 e-cores (16 threads total). My new graphics record: I scored 37 259 in Time Spy

This guy took my place, but with e-cores disabled (read the description, interesting): I scored 36 680 in Time Spy
To compare with mine: Result
As for the all-P-core CPU score, I don't think he forced ReBAR, because I do better in CPU score with 8 e-cores instead of 8 HT threads.
But just 16 threads seems to beat even a 7950X...
I still prefer TS, SW and PR, because FS depends a lot on tweaks and I don't want to lose too much on the CPU score.

Try the Fritz Chess Benchmark: my normal all-core OC does ~99, but with 8P (6GHz)/8E (4.7GHz) (16t) it does ~119! It depends a lot on RAM; it's a good indicator of gaming performance.


----------



## joyzao

Hi, guys,

I have an RTX 4090 MSI Gaming Trio. Can I overcome the 500W limitation by changing the bios, or does the board have some limitation? Do you recommend this model?

Thanks.


----------



## dboom

2nd block, applied loctite on the top only, none of the screws were touched.


----------



## yzonker

joyzao said:


> Hi,Guys
> 
> RTX 4090 MSI gaming trio, can i overcome the 500w limitation by changing the bios? Or does the board have some limitation? recommend this model?
> 
> Thanks.


You can flash any bios other than the FE. Best/fastest bios is the one in my sig.


----------



## joyzao

yzonker said:


> You can flash any bios other than the FE. Best/fastest bios is the one in my sig.


Does it work well on this MSI? I read somewhere that the board had a limitation on the PCB.

Here where I live, the MSI warranty works very well, so I'm weighing it against the other brands.


----------



## yzonker

joyzao said:


> Does it work well on this msi? Because I read somewhere that the board had a limitation on the pcb.
> 
> Here where I live, the MSI warranty works very well, so I'm thinking about it vs the other brands.


I personally wouldn't worry about it. Might want to refrain from hitting it with Kombustor and pulling the full 666w. Most games/benches won't go that high (you could always limit it to 550-600w to be safe). But if you're concerned, then one of the other 600w bios will work as well. I'm not sure which ones give the best fan control for air cooling. Might just have to try one or more and see which you prefer.
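
The "limit it to 550-600W" suggestion above is just the Afterburner power-limit slider: the percentage is your target wattage over the BIOS maximum. A quick sanity-check sketch (666W is the Galax BIOS cap discussed in this thread; the helper name is mine):

```python
def power_limit_pct(target_w: float, bios_max_w: float = 666.0) -> int:
    """Percentage to set in Afterburner so the card caps near target_w."""
    return round(100 * target_w / bios_max_w)

for target in (450, 550, 600):
    print(target, "W ->", power_limit_pct(target), "%")
```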


----------



## yzonker

kryptonfly said:


> Indeed in TS I could improve my graphic score just a bit by disabling HT and 8 e-cores (16 threads total), my new graphic record : I scored 37 259 in Time Spy
> 
> This guy took my place but with e-cores disabled (read the description, interesting) : I scored 36 680 in Time Spy
> To compare with me : Result
> As regard the cpu all p-cores, I don't think he forced REbar because I do better in cpu score with 8 e-cores instead of 8 HT.
> But just with 16 threads it seems better than a 7950X too...
> I still prefer TS, SW and PR because FS depends a lot about tweaks and I don't want to lose too much on cpu score.
> 
> Try Friz Chess Benchmark, my normal OC with all cores do ~99 but with 8P (6 Ghz)/8E (4.7 Ghz) (16t) it does ~119 ! It depends a lot of ram, it's a good indicator for gaming performance.


Yeah, FS is a total PITA to run on the 13900K. The CPU/combined tests are inconsistent for me, so I end up running it multiple times to try to get the best total score. Just yesterday I got my highest graphics score, but the friggin' physics score was nerfed for no obvious reason. On the next run, graphics went down a little, but physics came back up. I changed nothing; just hit run again.

So I get my best score partly by the luck of them all lining up in one run, like this one from yesterday:









I scored 48 916 in Fire Strike Extreme


Intel Core i9-13900K Processor, NVIDIA GeForce RTX 4090 x 1, 32768 MB, 64-bit Windows 11




www.3dmark.com





The run right before it was several hundred points lower with no changes to settings.


----------



## ALSTER868

Xavier233 said:


> Did you confirm though? I was explaining the method, not the results specific to a certain game


Here's the thread usage while gaming, and a screenshot from the game with <200W power draw. And temps of 40C or so.


----------



## ALSTER868

Xavier233 said:


> You would have to get a 13900K with a good core-OC. Its insane I know


I thought I could keep this processor for a while, until 14th gen, but now I don't know. BTW, somehow my CPU-bound benchmark figures like PR are not that far from a 13900K's, and the same goes for Speed Way.
And in GPU-bound scenarios it's fine, judging by the GPU load/temps and power draw as well as the framerate. Only MW2 and Warzone look weird.


----------



## Xavier233

ALSTER868 said:


> There it is the thread usage while gaming and a screenshot from the game with <200W power draw. And temp 40C or so.
> 
> View attachment 2593163
> View attachment 2593163
> 
> View attachment 2593162
> View attachment 2593162


All cores are utilized, which is good, but you can see Core 0 (thread #2) is constantly at or near 100%. That may or may not be the reason for the FPS drops; I couldn't tell you. But a 13900K or 13900KS has much higher frequency and per-core performance, so you might not experience the same FPS drops with those CPUs and this GPU.


----------



## Xavier233

ALSTER868 said:


> I thought I could keep this proc for a while until 14th gen, but now I don't know. BTW somehow CPU bound benchmarks figures like PR are not that far form 13900K, so it is Speed Way.
> And in GPU bound scenarios it is OK judging by the GPU load/temps and powe draw as well as framerate. Only MW2 and Warzone look weird.


I believe the 14th gen will not be on LGA1700, which means, a different motherboard and CPU socket. In your case, a 13th gen would also mean the same, so you can either wait for the 14th, or get a 13900KS I guess.


----------



## StreaMRoLLeR

menko2 said:


> I went down in the list and exclude three of the options (PNY, MSI trio X and Gigabyte Gaming OC). Gigabyte looks great overall like you mention but I had problem with their products in the past.
> 
> My options:
> MSI suprim liquid 2200€
> MSI suprim x 2200€
> KFA2 GS 1900€
> 
> The three of them have good reviews. I would go for Liquid but I don't know how much will last an AIO. The Suprim X (air) I like the way is built and reviews are great. KFA2-GALAX is a golden sample but not many reviews and build quality don't look very good as MSI.
> 
> Your thoughts? Important decision...


Cooler Master pump on the Liquid. Dunno if MSI supervised CM to make a better pump than their inferior CM CPU AIOs.

So far I am happy with my Liquid X. Flow is better compared to my 3080 Ti Strix LC (both at 400W, pull config, 1600rpm T30s).

If you buy the air Suprim, it will dump all its waste heat onto the motherboard + RAM + VRMs. If you overclock your DDR5 RAM, it won't like that.

The Suprim Liquid is the closest thing we can buy to a 3090 Kingpin (PCB + quality + cooling + binned chip).


----------



## StreaMRoLLeR

joyzao said:


> Hi,Guys
> 
> RTX 4090 MSI gaming trio, can i overcome the 500w limitation by changing the bios? Or does the board have some limitation? recommend this model?
> 
> Thanks.


You can flash the 666W Galax bios.

But for gaming purposes you rarely need over 500W on the MSI cards.


----------



## J7SC

@kx11 ...I hold you personally responsible for...



Spoiler



the absurdly / (almost) brandless appearance update. Also, big thanks


----------



## yzonker

So, some good news and some bad news.

Good news, Asus 1kw XOC bios!









Asus RTX 4090 VBIOS
24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory
www.techpowerup.com





Bad news is that although it has the special sauce, it doesn't seem to be quite as special as the sauce Galax used. I'm about 50-100 pts away from my best PR score. Effective clocks are incredibly close to requested though; they looked closer than on the Galax bios, so I'm not sure why the score isn't there.

Let us know if you get different results please.

It's also funky in that GPUZ shows MSI, but this is definitely 1kw as I easily hit 700w with Kombustor (set to 70%). It also defaults to memory running at 100% and PL defaults to 100% and only goes down from there.


----------



## yzonker

Update: I got within about 20 pts of my best score. This bios seemed to tolerate water temps below 10C better; I actually ran at about 8C, which I've not been able to do with any other bios. I think that's partly why I got this close to my highest Galax-bios score.











And GPUZ seems to be borked as it's still showing MSI even though I'm back on the Galax bios. I must need to DDU or something. Although both my daily OS and bench OS were behaving the same way. If anyone has any ideas on this let me know.

But if you were to volt mod, this bios would be the way to go, since it's at least close in performance to the Galax and has the 1kW PL.


----------



## PLATOON TEKK

Hope all has been good.

Been messy since I got back; just had a chance to install both the HOF OC Plus and the Suprim. I didn't bother with the GOD chiller setup in Platoon Tekk main since there are no waterblocks for these HOFs yet. The cards feel a lot higher quality than last gen's did, and it seems like Galax has got their **** in check again (980-2080 era).

The cards come shipped with a 666w "Performance" bios and a 450w "Silent" bios. The TechPowerup 666w bios is the same revision as mine is, however, the 450w silent bios doesn't seem to be available online (to my knowledge). I will gladly upload it if anyone is interested. I am also asking my connect for both the true Galax XOC bios (might try Asus 1k) and GOC voltage control software (unfortunately v3.0.0.4 doesn't address the cards). In regards to WB, I'm hoping Bykski or Bitspower (as they did for 3090) drop a block for this asap.

For power, I used a MEG Ai 1300w for both HOF setups and used the 12 and 4-converted power plugs, that's why the plugs look different. The "HOF Panel" is identical to last gen and there is slightly less RGB overall on the card's fans (for the better imo). The "RGB crown" is detachable and magnetic, same as the lcd panel. The card's RGB (and separate indicator light) also report if the card is powered and connected fully.

I will put them both to the test and see how they perform vs the Suprim/Phanteks on average. Will list temps and power draw etc. shortly too; already saw 690W during a brief MW2 session. Any specific tests/runs or info anybody wants?

Sorry once more for the watermarks, but I've had headaches before with resellers stealing HOF pics.











Spoiler: Pics & Setup

Videos of Quake RTX test session after messy install.


----------



## inedenimadam

joyzao said:


> Hi,Guys
> 
> RTX 4090 MSI gaming trio, can i overcome the 500w limitation by changing the bios? Or does the board have some limitation? recommend this model?
> 
> Thanks.


I have a Gaming X Trio. I have flashed a few different bioses, but there are VERY few instances where it will eat 500+ watts, and I get no gains from going 1.05V to 1.1V. We don't have the strongest power delivery on the Gaming X. Only in RT-heavy tech-demo titles have I been able to pull 500+ watts for more than a few seconds. Mostly it stays at its intended 480W or lower power target when overclocked.


----------



## Xavier233

Hey guys, if I have custom NVIDIA Control Panel settings, is there any way to save them for all games, so that if I update drivers or reinstall through DDU I won't have to redo all that work over again?


----------



## yzonker

Xavier233 said:


> Hey guys, if I have custom nvidia control panel settings, anyway to save them for all games, so if I update drivers, or reinstall through DDU, I wont have to redo all that work all over again?


I haven't used it, but NVProfileInspector can import/export with the buttons I circled.


----------



## Xavier233

yzonker said:


> I haven't used it, but NVProfileInspector can import/export with the buttons I circled.
> 
> View attachment 2593231


I will try it out, thank you!


----------



## J7SC

PLATOON TEKK said:


> Hope all has been good.
> 
> Been messy since I got back, just had a chance to install both HOF OC Plus and the Suprim. I didn't bother with the GOD chiller setup in Platoon Tekk main since there are no waterblocks for these HOFs yet. The cards feel a lot more quality than last gen did and it seems like Galax got their **** in check again (980-2080 era).
> 
> The cards come shipped with a 666w "Performance" bios and a 450w "Silent" bios. The TechPowerup 666w bios is the same revision as mine is, however, the 450w silent bios doesn't seem to be available online (to my knowledge). I will gladly upload it if anyone is interested. I am also asking my connect for both the true Galax XOC bios (might try Asus 1k) and GOC voltage control software (unfortunately v3.0.0.4 doesn't address the cards). In regards to WB, I'm hoping Bykski or Bitspower (as they did for 3090) drop a block for this asap.
> 
> For power, I used a MEG Ai 1300w for both HOF setups and used the 12 and 4-converted power plugs, that's why the plugs look different. The "HOF Panel" is identical to last gen and there is slightly less RGB overall on the card's fans (for the better imo). The "RGB crown" is detachable and magnetic, same as the lcd panel. The card's RGB (and separate indicator light) also report if the card is powered and connected fully.
> 
> I will put them both to the test and see how the perform vs Suprim/Phanteks on average. Will list temps and power draw etc shorty too, already saw 690w during brief MW2 session. Any specific tests/runs or info anybody wants?
> 
> Sorry once more for the watermarks but I got some headache through stolen HOF pics from resellers before.
> 
> View attachment 2593208
> 
> 
> 
> Spoiler: Pics & Setup
> 
> 
> 
> 
> View attachment 2593197
> View attachment 2593194
> View attachment 2593193
> View attachment 2593196
> View attachment 2593198
> 
> View attachment 2593200
> 
> View attachment 2593195
> 
> 
> 
> 
> Videos of Quake RTX test session after messy install.


Very Noice ! 
One thing I have been wondering about is the 2nd 12VHPWR connector on the Galax HoF; specifically, why the 667W performance vbios actually works on other cards (like mine) with 1x 12VHPWR. I realize that there must be a Galax XOC vbios at ~1320W or so that would require both 12VHPWR connectors, but I would still like to learn more about how the Galax HoF card distributes power between the two 12VHPWR connectors when on *only* the 667W performance vbios (which I had at almost 700W on my card 🤪).


----------



## yzonker

J7SC said:


> Very Noice !
> One thing I have been wondering about is the 2nd 12VHPWR connector on the Galax HoF; specifically, why the 667W performance vbios actually works on other cards (like mine) with 1x 12VHPWR. I realize that there must be a Galax XOC vbios at ~1320W or so that would require both 12VHPWR connectors, but I would still like to learn more about how the Galax HoF card distributes power between the two 12VHPWR connectors when on *only* the 667W performance vbios (which I had at almost 700W on my card 🤪).


I think we already have the answer to that. The card just sees one power connector despite there being two; that's why there is only one in GPUZ. Galax may have just combined them because Nvidia's bios is probably designed for a single connector. Easier than getting Nvidia to modify the bios for them. You know how much Nvidia loves their AIBs....

Maybe @PLATOON TEKK can get a GPUZ screenshot, but I assume we'll see the same thing (1 connector).
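As a side note on these power limits, the per-pin current math for a single 12VHPWR connector is easy to sanity-check. A rough sketch (assumes the connector's six 12V current-carrying pins and a nominal 12V rail; the comparison figures are mine, not from any card's spec sheet):

```python
# 12VHPWR: six 12 V current-carrying pins, rated ~9.2 A per pin (600 W total).
PINS = 6
VOLTS = 12.0

def amps_per_pin(watts: float) -> float:
    """Evenly-shared current per 12 V pin at a given board power."""
    return watts / VOLTS / PINS

# Compare the power limits discussed in the thread against the ~9.2 A/pin rating.
for pl in (450, 600, 666, 1000):
    print(f"{pl:>4} W -> {amps_per_pin(pl):.2f} A per pin")
```

At 666W a single connector sits right around the per-pin rating, which is one plausible reason a dual-connector card like the HoF would want to split higher loads across two plugs.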


----------



## yzonker

Got GPUZ fixed. Really dumb reason it wasn't working (hint: I have my iGPU plugged into my 2nd monitor since the Galax bios borks the 2nd HDMI port on my TUF).

Anyway it just shows "16 pin" power/voltage. I suspect the bios provided by Nvidia is only capable of recognizing one. They already got rid of pesky EVGA, now they just need to kill off Galax so they can become a full Apple clone. lol


----------



## J7SC

...I guess I am wondering about the XOC bios and how it kicks in the second 12VHPWR to full power (teamed?). Also, I'd love to see the Galax OC software working the VRAM voltage slider in a screenie 😋


----------



## yzonker

J7SC said:


> ...I guess I am wondering about the XOC bios and how it kicks in the second 12VHPWR to full power (teamed?). Also, I'd love to see the Galax OC software working the VRAM voltage slider in a screenie 😋


Seems like that's no different. I would think it would work the same. Heck, Asus is doing 1000w on a bios intended for a single connector card. I will bet that the Galax XOC will still just show "16 pin power/voltage", but with a much higher PL. Almost has to since the production Galax bios works that way.


----------



## J7SC




----------



## yzonker

J7SC said:


>


I think I see what you're getting at now. But I'd be willing to bet both of those connectors draw power with the stock bios. There's no turning on a 2nd connector; they're probably always hot.


----------



## MDXZFR

Hey guys. I'm planning to swap the stock 25mm (thickness) aluminum radiator to 40mm full copper radiator by Bykski. Wanna hear what enthusiasts here think.

The tube's circumference is about 50mm, so the diameter is ~16mm (OD). So I think I can just plug the tubes straight into the fitting adapter. In other words, I can just swap the stock radiator for a custom open-loop radiator.

It's 4090 Aorus Xtreme Waterforce gpu.
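The circumference-to-diameter arithmetic above checks out; a trivial sketch (the 50mm figure is the measurement quoted above):

```python
import math

circumference_mm = 50.0              # measured around the stock tube
od_mm = circumference_mm / math.pi   # C = pi * d, so d = C / pi
print(f"outer diameter ~ {od_mm:.1f} mm")  # ~15.9 mm, i.e. standard 16 mm OD tubing
```

So standard 16mm-OD fittings should indeed slip straight on, assuming the tube wall (ID) is also compatible.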


----------



## neteng101

Really curious whether the Galax card was meant to be a 1.33kW card! That's sorta insane, and cool too. Doing my bench runs on the Galax bios: the max I could get in PR was 2880x. The max on a 13700K is ~29.1k. This run was with forced ReBAR, OC'd to what my card can do, +165/+1700, on the Galax bios (the next increments up break).









I scored 28 804 in Port Royal
Intel Core i7-13700K Processor, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 11
www.3dmark.com





If anything, PR seems dependent on system components more than you'd think. Speedway OTOH, seems to be very much all GPU...









I scored 11 211 in Speed Way
Intel Core i7-13700K Processor, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 11
www.3dmark.com





Unlike PR, the spread of scores on the 4090 leaderboard is much tighter.


----------



## Gking62

yzonker said:


> So, some good news and some bad news.
> 
> Good news, Asus 1kw XOC bios!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Asus RTX 4090 VBIOS
> 24 GB GDDR6X, 2520 MHz GPU, 1313 MHz Memory
> www.techpowerup.com
> 
> 
> 
> 
> 
> Bad news is that although it has the special sauce, it doesn't seem to be quite as special as the sauce Galax used. I'm about 50-100 pts away from my best PR score. Effective clocks are incredibly close to requested though. Looked closer than Galax, so I'm not sure why the score isn't there.
> 
> Let us know if you get different results please.
> 
> It's also funky in that GPUZ shows MSI, but this is definitely 1kw as I easily hit 700w with Kombustor (set to 70%). It also defaults to memory running at 100% and PL defaults to 100% and only goes down from there.


for whatever reason, it made me reinstall the graphics driver, otherwise no issues...


----------



## Xavier233

yzonker said:


> I haven't used it, but NVProfileInspector can import/export with the buttons I circled.
> 
> View attachment 2593231


Works well, was able to restore changes on all custom profiles. Nice


----------



## J7SC

@PLATOON TEKK ....no rush, but if you use the HoF AI software, I would love to know how many mV the VRAM slider increases voltage by (ditto for the core voltage slider).


----------



## tubs2x4

Zero989 said:


> 2449 cdn plus tax


Newegg posted the Liquid X for 2499 tonight. Canceled my Best Buy order for the Trio X and ordered this one. Out of my damn mind paying this much for a vid card! Haha, but curious about the performance and how it responds in my system compared to the 3070.


----------



## elbramso

Gking62 said:


> for whatever reason, it made me reinstall the graphics driver, otherwise no issues...
> 
> View attachment 2593267
> 
> View attachment 2593266


Wow! 
Do you need direct voltage control to be able to force this amount of power? 
How do you do it 😅?


----------



## StreaMRoLLeR

@yzonker Did you observe a gained bin (15-30MHz) compared to the Galax bios?


----------



## StreaMRoLLeR

PLATOON TEKK said:


> Hope all has been good.
> 
> Been messy since I got back, just had a chance to install both HOF OC Plus and the Suprim. I didn't bother with the GOD chiller setup in Platoon Tekk main since there are no waterblocks for these HOFs yet. The cards feel a lot more quality than last gen did and it seems like Galax got their **** in check again (980-2080 era).
> 
> The cards come shipped with a 666w "Performance" bios and a 450w "Silent" bios. The TechPowerup 666w bios is the same revision as mine is, however, the 450w silent bios doesn't seem to be available online (to my knowledge). I will gladly upload it if anyone is interested. I am also asking my connect for both the true Galax XOC bios (might try Asus 1k) and GOC voltage control software (unfortunately v3.0.0.4 doesn't address the cards). In regards to WB, I'm hoping Bykski or Bitspower (as they did for 3090) drop a block for this asap.
> 
> For power, I used a MEG Ai 1300w for both HOF setups and used the 12 and 4-converted power plugs, that's why the plugs look different. The "HOF Panel" is identical to last gen and there is slightly less RGB overall on the card's fans (for the better imo). The "RGB crown" is detachable and magnetic, same as the lcd panel. The card's RGB (and separate indicator light) also report if the card is powered and connected fully.
> 
> I will put them both to the test and see how the perform vs Suprim/Phanteks on average. Will list temps and power draw etc shorty too, already saw 690w during brief MW2 session. Any specific tests/runs or info anybody wants?
> 
> Sorry once more for the watermarks but I got some headache through stolen HOF pics from resellers before.
> 
> View attachment 2593208
> 
> 
> 
> Spoiler: Pics & Setup
> 
> 
> 
> 
> View attachment 2593197
> View attachment 2593194
> View attachment 2593193
> View attachment 2593196
> View attachment 2593198
> 
> View attachment 2593200
> 
> View attachment 2593195
> 
> 
> 
> 
> Videos of Quake RTX test session after messy install.


Can you test the GOC tool on a Suprim flashed with the Galax bios? Will it work?

At the 100% power slider (default), what's the approximate draw in CP2077 and A Plague Tale: Requiem?


----------



## menko2

StreaMRoLLeR said:


> Cooler master pump on Liquid. Dunno if MSI supervised CM to make a better pump then their inferior CM CPU AIO’s
> 
> So far I am happy with my Liquid X. Flow is better compared to my 3080 ti strix LC ( both 400w pull 1600rpm t30’s )
> 
> If you buy Air suprim it will dump all waste heat into motherboard + ram + vrms. If you overclock your ddr5 ram it wont like this.
> 
> Suprim Liquid is closest thing we can buy compared to 3090 Kingpin’s. ( pcb+Quality+ cooling+ binned chip)


Do you have any idea how long the AIO on the Suprim Liquid should last?

If not much, I'll go with either the Suprim X (air) at 2200€ or the KFA2 at 1900€.


----------



## StreaMRoLLeR

MDXZFR said:


> Hey guys. I'm planning to swap the stock 25mm (thickness) aluminum radiator to 40mm full copper radiator by Bykski. Wanna hear what enthusiasts here think.
> 
> The tube's circumference is about 50mm, so the diameter is 16mm (od). So I think I could just plug and play the tubes to to the tube's fitting adapter. In other words, I can just swap the radiator to the custom open-loop radiator.
> 
> It's 4090 Aorus Xtreme Waterforce gpu.
> View attachment 2593262


Don't bother with it. There is a der8auer video showing exactly what you need; the end result was not what you'd expect. If you are not happy with the performance, I would swap those bad Giga fans for Phanteks T30s. You'd probably see a 2-3C decrease.


----------



## Faltzer

inedenimadam said:


> I have a gaming x trio. I have flashed a few different bioses, but there are VERY few instances where it will eat 500+ watts, and i get no gains from going 1.05V to 1.1V. We don't have the strongest power delivery on the gaming x. Only in RT heavy tech demo titles have I been able to pull 500+ watts for more than a few seconds. Mostly it stays in its intended 480W or lower power target when overclocked.


Haha, you'll be amazed to hear that there is a Lego game I got free from Epic Games which actually pulls over 500 watts. At least I was surprised.


----------



## andreiga76

neteng101 said:


> Really curious if the Galax card was meant to be a 1.33kw card! That's sorta insane and cool too. Doing my bench runs on the Galax bios - max I could get with PR was 2880x. The max on a 13700k is on 29.1k. This run was with forced rebar. OC to what I can do on the card +165/+1700 on the Galax bios (next increments up will break).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 28 804 in Port Royal
> Intel Core i7-13700K Processor, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 11
> www.3dmark.com
> 
> 
> 
> 
> 
> If anything, PR seems dependent on system components more than you'd think. Speedway OTOH, seems to be very much all GPU...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I scored 11 211 in Speed Way
> Intel Core i7-13700K Processor, NVIDIA GeForce RTX 4090 x 1, 65536 MB, 64-bit Windows 11
> www.3dmark.com
> 
> 
> 
> 
> 
> Unlike PR, the spread of scores on the 4090 leaderboard is much tighter.


PR is much more dependent on CPU/system memory overclock than Speed Way.
On my average 5950X, not reaching 5GHz in tests, I have better-ranking scores in Speed Way with an air-cooled GPU.
11320 Speed Way link
28837 Port Royal link


----------



## StreaMRoLLeR

menko2 said:


> Do you have ideo of the AIO of from the Suprim liquid how long should it last?
> 
> If not much I'll go with either Suprim X (air) 2200€ or the KAF2 1900€.


A solid 3-4 years at least. Generally it's wise to sell AIO GPUs before their warranty expires.

But what's new for this gen:

The pump in the Suprim can go idle, unlike Asetek pumps going full throttle all the time.


----------



## menko2

StreaMRoLLeR said:


> At least solid 3-4 years. Generally its wise to sell AIO gpu’s before their warranty expires.
> 
> But what is new for this gen is;
> 
> Pump in suprim can go idle unlike Asetek pumps going full throttle all the time.


Ok, I'm going for MSI.

Would you go with the Suprim X Liquid or the Suprim X air?

Both have the same price in Spain.


----------



## MDXZFR

StreaMRoLLeR said:


> Dont bother with it. There is a Der8auer video showing exactly what you need. The end result was not expected. If you are not happy with performance, I would swap those bad giga fans with Phanteks T30. You can see decrease of 2-3temps probably


What's the end result? Can you elaborate? It's just a simple mod, swapping the radiator.
The temps are good; I just want to get them as low as they can go, maybe almost as cool as a custom water block.


----------



## yzonker

StreaMRoLLeR said:


> @yzonker Did you observe gained bin ( 15mhz-30mhz) compared to Galax Bios ?


No, don't think so. The only time I saw anything like that was when I dropped my chiller temp below 10C where I was able to add a couple more bins manually.


----------



## StreaMRoLLeR

@menko2 Go with Liquid. Air cards dump waste heat into system. 

@MDXZFR 




(video, at 17:40)


----------



## neteng101

andreiga76 said:


> PR is much more dependent on CPU/Sys Mem overclock than Speedway.
> On my average 5950x, not reaching 5ghz in tests, I have better ranking scores on Speedway, air cooled GPU.
> 11320 Speedway link
> 28837 Port Royale link


Your scores are similar to mine: the SW improvement is due to your higher OC, but PR is almost the same despite you having a higher GPU OC.

I said the same: PR is dependent on CPU/memory, but others here don't agree. I think more CPU cache possibly helps too. Just what I'm noticing looking at the leaderboard combinations.


----------



## Zero989

J7SC said:


> @PLATOON TEKK ....no rush, but if you use the HoF AI software, I would love to know by how much the VRAM slider increases mV by (ditto for core voltage slider).
> View attachment 2593275


Anyone know how to reverse engineer software for science?

I presume this HoF has an unlocked vram voltage regulator.


----------



## MDXZFR

StreaMRoLLeR said:


> @menko2 Go with Liquid. Air cards dump waste heat into system.
> 
> @MDXZFR
> 
> 
> 
> 
> 17.40min


Thanks for the vid. First of all, the GPU in the video only had the die & VRAM covered; on the Aorus card, the copper plate covers everything, including the VRMs.

Second, he did say the noise is lower with the copper radiator and its almost-doubled surface area.

Third, he said that it is possible to convert it to custom water cooling with a little effort, without modification.

Yeah, it has its benefits, so I think it's fine to proceed. As for the warranty, I think it's the same as using a custom water block (void in some countries, not void in others).


----------



## joyzao

inedenimadam said:


> I have a gaming x trio. I have flashed a few different bioses, but there are VERY few instances where it will eat 500+ watts, and i get no gains from going 1.05V to 1.1V. We don't have the strongest power delivery on the gaming x. Only in RT heavy tech demo titles have I been able to pull 500+ watts for more than a few seconds. Mostly it stays in its intended 480W or lower power target when overclocked.


Do you recommend this GPU? And in tests like Port Royal with the 600W bios, does it draw 500-550W?

Thanks


----------



## menko2

MDXZFR said:


> Thanks for the vid. 1st of all, the gpu in the video was only covered the die & vram unlike Aourus card, the copper plate covers everything including the vrms.
> 
> 2nd, he did saying that the noise is lower, almost double with copper radiator & lot of surface area.
> 
> 3rd, he said that it is possible to convert it to the custom watercooling with a little effort without modification.
> 
> Yeah, it has the benefits behind it. So i think it's fine to proceed. About the warranty, I think it's the same with using custom waterblock (void in some countries, not void in others)


The Suprim Liquid doesn't cover everything?

I read bad things about the Aorus Waterforce's build. Is it good compared to the Suprim?


----------



## Zero989

menko2 said:


> Ok I'm going for MSI.
> 
> You would go with the SUPRIM X LIQUID or the SURPIM X AIR?
> 
> Both have the same price in Spain.


You're wasting your money. Just get the KFA2.


----------



## 8472

Success!!! 

Just put my Suprim X in: no coil whine!!!!! I'm talking outside of menus, of course; even there, it's only a very hard-to-hear buzz. My TUF was insanely loud with coil whine even with the power limit at 70%.

I like the look of it too, it looks better in person than in videos. 

It's actually very similar dimensions to my 3090 Suprim X. They're almost the exact same length and width, the only difference is that it is an extra slot taller. This would explain why I thought the comments about the 4090's size were exaggerated since my 3090 was already pretty much the same size. 

The included Nvidia adapter had one 8 pin that wouldn't accept my cables for some reason. Not a big deal since the other three work plus I have cablemod and corsair 12vhpwr cables. But I'll need to buy a replacement adapter for when I try to sell the card in the future.


----------



## menko2

8472 said:


> Success!!!
> 
> Just put my Suprim X in, no coil whine!!!!! I'm talking about outside of menus of course. Even then, it's only a very hard to hear buzz. My TUF was insanely loud with coil whine even with the power limit at 70%.
> 
> I like the look of it too, it looks better in person than in videos.
> 
> It's actually very similar dimensions to my 3090 Suprim X. They're almost the exact same length and width, the only difference is that it is an extra slot taller. This would explain why I thought the comments about the 4090's size were exaggerated since my 3090 was already pretty much the same size.
> 
> The included Nvidia adapter had one 8 pin that wouldn't accept my cables for some reason. Not a big deal since the other three work plus I have cablemod and corsair 12vhpwr cables. But I'll need to buy a replacement adapter for when I try to sell the card in the future.


Is it the Suprim Air or the Liquid one?

I'm about to buy one but can't decide. Same price both here in Spain.


----------



## 8472

menko2 said:


> Is it the Suprim Air or the Liquid one?
> 
> I'm about to buy one but can't decide. Same price both here in Spain.


It's the air. I considered the liquid but I was concerned about the AIO and how long it would last. If MSI sold gpu AIO kits like EVGA used to, I would have bought the liquid along with an extra AIO so I wouldn't have to send it in if there was an issue. 

I don't know what MSI's RMA process is like especially regarding wait times, but I felt more comfortable with the air cooled version since basically the only thing that can fail cooling wise are the fans.


----------



## MDXZFR

menko2 said:


> The Suprim Liquid doesn't conver everything?
> 
> The Aorus Waterforce I read bad things about it's built. Is it good comparing to the Suprim?


Yup, the Liquid X doesn't cover all the VRMs, just like the 3090 Strix in the vid. That's why they need a fan on the cards.

You read about it, but I own it. It's crazy good in temps. The reason I wanna change to the bigger radiator is to get to custom-loop level. I mean, not as cold as a custom loop, but at least close.

The Aorus Xtreme card has the same size tubes as a custom loop's, which is around 16mm OD (not sure of the ID because I can't measure it, of course). Just swap the radiator and refill it using the submerged technique, or use a small reservoir, or an external pump (reservoir + pump combo). At least that's the plan.


----------



## StreaMRoLLeR

MDXZFR said:


> Thanks for the vid. 1st of all, the gpu in the video was only covered the die & vram unlike Aorus card, the copper plate covers everything including the vrms.
> 
> 2nd, he did saying that the noise is lower, almost double with copper radiator & lot of surface area.
> 
> 3rd, he said that it is possible to convert it to the custom watercooling with a little effort without modification.
> 
> Yeah, it has the benefits behind it. So i think it's fine to proceed. About the warranty, I think it's the same with using custom waterblock (void in some countries, not void in others)


You will butcher your warranty the moment a knife touches the hoses.

You can lower noise/dBA via better fans, i.e. Phanteks T30s.

But go for it and share your experiment with us.


----------



## Xavier233

Anyone having weird FPS drops after going to 528.02? I did a clean install, and a DDU before, but I still get weird 20 FPS drops in some games with that driver. Rolled back to 527 and those drops are gone.


----------



## MDXZFR

StreaMRoLLeR said:


> You will butcher you warranty the moment knife touches the hoses.
> 
> You can lower noise-dba via better fans AKA Phanteks T30
> 
> But go for it and share your experiment with us


I'm not gonna cut a thing. They just glued the tubes and used heat-shrink wrap; I'll blow hot air on it and pull it off. As for the warranty, I care about it as much as the people going the custom-loop route do. We're pro users 😂 (actually, I do care. It's a $2000 card. But...)

I know the T30 fans are recommended, but well, I need some RGB for my build to look fancy 😂. I'm using Uni Fan SL120s and 140s.

I think I'll do the experiment. Nothing is gonna get hurt (hope so). I'll share here soon.


----------



## menko2

StreaMRoLLeR said:


> At least solid 3-4 years. Generally its wise to sell AIO gpu’s before their warranty expires.
> 
> But what is new for this gen is;
> 
> Pump in suprim can go idle unlike Asetek pumps going full throttle all the time.


MSI here has a 2-year warranty. Not much.

I have the option to get the FE for 1850€ or the Suprim Liquid for 2200€.

Would you get the Liquid at a 350€ premium over the FE for gaming at 4K?


----------



## Zero989

menko2 said:


> MSI here has 2 year warranty. Not much.
> 
> I have the option to get the FE for 1850€ or the Suprim Liquid for 2200€.
> 
> Will you get the Liquid for 350€ difference over the FE for gaming at 4k?


Dude, def get the FE; it has amazing resale value as well. The Suprim Liquid is not worth the money unless you plan on gaming out in the desert.


----------



## Sheyster

yzonker said:


> No, don't think so. The only time I saw anything like that was when I dropped my chiller temp below 10C where I was able to add a couple more bins manually.


How about memory OC? Any difference there? I may try it just to see if it'll help me go past +1400. Interestingly enough, with the Galax BIOS I was able to do +1450 stable after the memory was warmed up. I could not do this with any other BIOS.


----------



## yzonker

Sheyster said:


> How about memory OC? Any difference there? I may try it just to see if it'll help me go past +1400. Interestingly enough, with the Galax BIOS I was able to do +1450 stable after the memory was warmed up. I could not do this with any other BIOS.


No, about the same also.


----------



## StreaMRoLLeR

menko2 said:


> MSI here has 2 year warranty. Not much.
> 
> I have the option to get the FE for 1850€ or the Suprim Liquid for 2200€.
> 
> Will you get the Liquid for 350€ difference over the FE for gaming at 4k?


Oh.. In my country it's a 3-year warranty, so I planned accordingly. It's not like the pump will fail after 720 days 

Both are good options (you can use that saved $400 for custom watercooling later).


----------



## Pepillo

menko2 said:


> MSI here has 2 year warranty. Not much.
> 
> I have the option to get the FE for 1850€ or the Suprim Liquid for 2200€.
> 
> Will you get the Liquid for 350€ difference over the FE for gaming at 4k?


Where do you live? In Spain the guarantee is three years by law.


----------



## dentnu

I was just able to order a Gaming Trio X from Best Buy a little while ago. While it was not the card I wanted, as I was waiting for an FE, I just happened to find it in stock. I started doing some reading, and the Gaming X only has 480W capability and 50A power stages, compared to the other top cards that can do 600W and have 70A power stages. I am starting to reconsider my purchase as these cards are not exactly cheap. I am in no rush for a 4090; I just want a card that is designed to deliver the full power of a 4090 and leaves no performance behind. I plan to mostly just game and do some overclocking. What's the verdict on this card, is this a good card to get?


----------



## StreaMRoLLeR

dentnu said:


> I was just able to order a Gaming Trio X from Best Buy a little while ago. While it was not the card I wanted, as I was waiting for an FE, I just happened to find it in stock. I started doing some reading, and the Gaming X only has 480W capability and 50A power stages, compared to the other top cards that can do 600W and have 70A power stages. I am starting to reconsider my purchase as these cards are not exactly cheap. I am in no rush for a 4090; I just want a card that is designed to deliver the full power of a 4090 and leaves no performance behind. I plan to mostly just game and do some overclocking. What's the verdict on this card, is this a good card to get?


The 4090 Trio is one of the worst cards of the 90 series. MSI cheaped out on the vapor chamber: there is NONE! I would generally recommend the Gigabyte Gaming OC as a cheap alternative at MSRP prices.


----------



## J7SC

Zero989 said:


> Anyone know how to reverse engineer software for science?
> 
> I presume this HoF has an unlocked vram voltage regulator.


Yeah, that's why I was asking - once the additional VRAM voltage is known via the HoF sliders, one can either use Elmor's EVC2SE or maybe even a right-sized shunt mod. Then again, there's this (time-stamped) YT where they're working on the '_it's not a 4090_ 4090' EVGA (watch the core voltage, mind you w/o load).


----------



## neteng101

dentnu said:


> What's the verdict on this card, is this a good card to get?


Have the non-X (no factory OC) version. Not a top-tier card, but even it is perfectly fine running maxed out… I’ve tried the 600w Suprim X bios and recently the 666w Galax OC bios. Has some coil whine, mostly noticeable when benching.

I would get this over the Gigabyte cards and the Zotac/PNY cards. Most of those are still using analog VRM controllers (i.e. the 9512), while MSI stuck with the same one as Nvidia and Asus, which is digital. Edit - even the Asus Tuf model is using the more basic VRM controller.


----------



## coelacanth

dentnu said:


> I was just able to order a Gaming Trio X from Best Buy a little while ago. While it was not the card I wanted, as I was waiting for an FE, I just happened to find it in stock. I started doing some reading, and the Gaming X only has 480W capability and 50A power stages, compared to the other top cards that can do 600W and have 70A power stages. I am starting to reconsider my purchase as these cards are not exactly cheap. I am in no rush for a 4090; I just want a card that is designed to deliver the full power of a 4090 and leaves no performance behind. I plan to mostly just game and do some overclocking. What's the verdict on this card, is this a good card to get?


Just get whatever is cheapest, especially if it's just for gaming with some light OCing. From this thread it seems the Asus cards have the worst coil whine, but it's luck of the draw.


----------



## PLATOON TEKK

Haven't had a chance to bench yet, but I did have a quick COD session. I'm on the 666w P bios. Unless something is off, it seems I can max out the 2000 slider in Afterburner (1562MHz) on RAM and not crash in COD (which is typically really unstable for OCs for me). I can also do 3060MHz on the core, stable. Temps were around 50C ("hyper mode"/full fans was off). This was with no curve and fans on auto. This is on an AW3424DW @ 3440x1440.



J7SC said:


> Very Noice !
> One thing I have been wondering about is the 2nd 12VHPWR connector on the Galax HoF; specifically, why the 667W performance vbios actually works on other cards (like mine) with 1x 12VHPWR. I realize that there must be a Galax XoC vbios at ~ 1320 W or so that would require both 12VHPWR connectors, but would still like to learn more about how the Galax HoF card distributes between the two 12VHPWR connectors when on *only* the 667W performance vbios (which I had at almost 700W on my card 🤪).


It seems @yzonker is correct about the readings. see the attached pic with gpuz, hwinfo, cod and the hof ai suite running.



J7SC said:


> @PLATOON TEKK ....no rush, but if you use the HoF AI software, I would love to know by how much the VRAM slider increases mV by (ditto for core voltage slider).
> View attachment 2593275


Of course, never an issue! Oddly mine is greyed out too (using the latest version); it doesn't seem to be detecting the GPU fully.

Hopefully I'll get my hands on the GOC software soon.



StreaMRoLLeR said:


> Can you test GOC tool on Galax bios flashed Suprim ? Will it work ?
> 
> At %100 power slider ( default ) whats the approx draw at CP2077K, Plague Tale Requiem


The GOC software typically only works on Galax cards (a flashed bios won't cut it); however, the current version 3.0.0.4 doesn't support 4090s. I actually coincidentally tested with my Suprim, to no avail. 

There don't seem to be any blocks on the horizon for now either, so I might try my frankenstein block on this. I'll post more numbers once I get this work done and have a little more time with the cards.



Spoiler: Screenshot


----------



## J7SC

neteng101 said:


> Have the non X (no factory OC) version. Not a top tier card but even it is perfectly fine running maxed out… I’ve tried the 600w SuprimX bios and recently the 666w Galax OC bios. Has some coil whine mostly noticeable when benching.
> 
> I would get this over Gigabyte cards and the Zotac/PNY cards. Most of those are still using analog VRM controllers ie the 9512 while MSI stuck to the same one as Nvidia and Asus which are digital. Edit - even the Asus Tuf model is using the more basic VRM controller.


...see KingPin's comment about analog in the timestamped YT above.

In any event, I can highly recommend the Giga-G-OC but I only have this one sample, and it is my only 4090...still, judging by PCB / power phase the Giga-G-OC is decent - and leaving out $s and availability for a moment - the Galax HoF would be my first choice, followed closely by the Strix OC and MSI Suprim X.



PLATOON TEKK said:


> (...)
> Of course, never an issue! Oddly mine is greyed out too (using the latest version), it doesn't seem to be detecting the GPU fully.
> 
> Hopefully ill get my hands on the GOC software soon.
> 
> The GOC software typically only works on Galax cards (flashed bios wont cut it), however, the current version 3.0.0.4 doesnt support 4090s.
> 
> There don't seem to be any blocks on the horizon for now either, so I might try my frankenstein block on this. I'll post more numbers once I get this work done and have a little more time with the cards.
> 
> 
> 
> Spoiler: Screenshot
> 
> 
> 
> 
> View attachment 2593358


Thanks   ...the screenshot is a bit hard to read, but it seems to only show 'half' of the true GPU power (so teamed power, which was my question), though it could also just be an HWInfo thing given that there are hardly any 2x 12VHPWR cards out there.

I managed to load the HoF AI software on my Giga-G-OC running the Galax 667 W vbios, but sadly, no voltage control either. In any case, looks like you got great VRAM clocks ! I typically stick around +1500 MHz on the slider, though I found a few memory holes and managed to do some benching at +1702 (but rarely...).

I really hope you get your hands on the special OCL software to unlock the voltages (ie. VRAM, and core beyond 1.1V)...forums at HWBot are usually a good spot for some 'intelligence' on it, even though the top LN2ers have to sign NDAs (probably the same for the Strix 1 kW).


----------



## Xavier233

Xavier233 said:


> Anyone having weird FPS drops after going to 528.02? I did a clean install, and a DDU before, but I still have 20 FPS weird drops on some games with that driver. Rolled back to 527 and those drops are gone


Any feedback?


----------



## dboom

2nd block:

(pics)

Loctite on EK acrylic hard tube, no reaction:

(pics)

EK 3090 backplate, no reaction:

(pics)

Anyone have an EK Strix/Tuf WB?


----------



## Artjsalina5

dboom said:


> 2nd block:
> 
> View attachment 2593364
> 
> View attachment 2593365
> 
> View attachment 2593366
> 
> View attachment 2593367
> 
> View attachment 2593368
> 
> 
> Loctite on EK acrilic hard tube, no reaction:
> 
> View attachment 2593369
> 
> 
> EK 3090 backplate, no reaction
> 
> View attachment 2593370
> 
> 
> Anyone have EK strix/tuf WB?


Please stop


----------



## Avacado

dboom said:


> 2nd block:
> 
> View attachment 2593364
> 
> View attachment 2593365
> 
> View attachment 2593366
> 
> View attachment 2593367
> 
> View attachment 2593368
> 
> 
> Loctite on EK acrilic hard tube, no reaction:
> 
> View attachment 2593369
> 
> 
> EK 3090 backplate, no reaction
> 
> View attachment 2593370
> 
> 
> Anyone have EK strix/tuf WB?


Stop over-tightening the screws in your acrylic.


----------



## dboom

LOL, factory defaults!!!!!!!!!!!!!!!!!!
Thx for "the advice" anyway.


----------



## dboom

Artjsalina5 said:


> Please stop


Please stop if you don't have the block.


----------



## Artjsalina5

dboom said:


> LOL, factory defaults!!!!!!!!!!!!!!!!!!
> Thx for "the advice" anyway.


That's an RMA then.


----------



## mirkendargen

dboom said:


> LOL, factory defaults!!!!!!!!!!!!!!!!!!
> Thx for "the advice" anyway.


I'm no lover of EK and think they're overpriced garbage riding on the coattails of their former glory, but do they list torque specs or something? Otherwise how do you know they're "factory default" if you took them out, put loctite on them, and put them back in?


----------



## neteng101

J7SC said:


> ...see KingPin's comment about analog in the timestamped YT above.
> 
> In any event, I can highly recommend the Giga-G-OC but I only have this one sample, and it is my only 4090...still, judging by PCB / power phase the Giga-G-OC is decent - and leaving out $s and availability for a moment - the Galax HoF would be my first choice, followed closely by the Strix OC and MSI Suprim X.


Yes - you can voltage mod with analog VRM. I've noticed Gigabyte has a tendency to cut a few corners in the name of cost - the Strix and Suprim both use the more expensive digital VRM that Nvidia uses for the FE card. From what I've seen though, pretty sure the Gigabyte Gaming OC is decent and not one of their questionable cards - but I'll never view Gigabyte as Asus/MSI tier for quality. Don't want to open a whole can of worms so I'll just leave it at that. 4090s are so overbuilt even the most basic ones are probably fine.


----------



## MDXZFR

Guys, I need help with compression fittings. A 10/16mm fitting supports 10mm inner diameter and 16mm outer diameter. What I don't really get is the outer diameter (16mm): does it support only up to 16mm OD? Since it's for soft tubing, can a 10/16 fitting take tube with 14 or 15mm OD?


----------



## J7SC

neteng101 said:


> Yes - you can voltage mod with analog VRM. I've noticed Gigabyte has a tendency to cut a few corners in the name of cost - the Strix and Suprim both use the more expensive digital VRM that Nvidia uses for the FE card. From what I've seen though, pretty sure the Gigabyte Gaming OC is decent and not one of their questionable cards - but I'll never view Gigabyte as Asus/MSI tier for quality. Don't want to open a whole can of worms so I'll just leave it at that. 4090s are so overbuilt even the most basic ones are probably fine.


...I don't really think we disagree on the 'order' of cards anyway (as per above, my fav being the Galax HoF, closely followed by the Strix and Suprim) but 20+4 phases and 1000 amps / c on my Giga-G-OC (same as Aorus Master) seem plenty... the analog doesn't bother me, given 3240 MHz top speed and all that...I did get a kick though out of KingPin's _cardboard_-stabilized analog controllers...if it works, it works !


----------



## yzonker

Did a little back to back testing of the Galax vs Asus XOC bios. Rather than trying to do a max OC run, I decided to test for efficiency by running the exact same core/mem clocks in games/benches.

I tested PR, TSE, SW, CP2077, and GotG.

So everything was run:

3045 core (which I monitored through all runs to be sure it was maintained)
+1600 mem

It does take one more bin in core offset to get 3045 on the Asus bios. (+180 Galax, +195 Asus)

I didn't run with Sysinfo just because it takes so much longer. Didn't seem necessary since I was verifying clocks manually and keeping them constant.

Everything was run on my daily OS with HWINFO and Afterburner running, reBar off, so scores aren't all that high.

Bottom line, they are pretty much identical given run to run variance I think. At least it gives us a 2nd option if one of them does not work correctly for some cards. And of course gives a 1000w limit for anyone that wants to try to volt mod.

Not sure why I couldn't quite get to the same score in PR last night. Could be the newest driver isn't quite as fast as the one I ran before? Although those max OC runs are kinda hard for me to repeat since I'm constantly trying to balance water temp vs core clock vs mem clock to get the best score. Might still get there if I put a little more effort in. Results below suggest that.
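Since NVIDIA boost clocks step in 15 MHz bins, the default boost each BIOS ships with can be back-calculated from the offsets above (a quick sketch; the 2865/2850 MHz base values are derived here, not stated in the post):

```python
BIN_MHZ = 15  # NVIDIA boost clocks move in 15 MHz bins

target = 3045                        # clock held constant in all runs
offset_galax, offset_asus = 180, 195 # offsets needed per BIOS

# Default boost implied by each BIOS (derived from the offsets, not stated):
base_galax = target - offset_galax   # 2865
base_asus = target - offset_asus     # 2850

bins_apart = (base_galax - base_asus) // BIN_MHZ
print(bins_apart)  # 1 -> the Asus bios defaults exactly one bin lower
```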



Spoiler: Lots of pics

(alternating Galax / Asus screenshots for each benchmark: PR, TSE, SW, CP2077, GotG)

----------



## Zero989

yzonker said:


> Did a little back to back testing of the Galax vs Asus XOC bios. Rather than trying to do a max OC run, I decided to test for efficiency by running the exact same core/mem clocks in games/benches.
> 
> I tested PR, TSE, SW, CP2077, and GotG.
> 
> So everything was run:
> 
> 3045 core (which I monitored through all runs to be sure it was maintained)
> +1600 mem
> 
> It does take one more bin in core offset to get 3045 on the Asus bios. (+180 Galax, +195 Asus)
> 
> I didn't run with Sysinfo just because it takes so much longer. Didn't seem necessary since I was verifying clocks manually and keeping them constant.
> 
> Everything was run on my daily OS with HWINFO and Afterburner running, reBar off, so scores aren't all that high.
> 
> Bottom line, they are pretty much identical given run to run variance I think. At least it gives us a 2nd option if one of them does not work correctly for some cards. And of course gives a 1000w limit for anyone that wants to try to volt mod.
> 
> Not sure why I couldn't quite get to the same score in PR last night. Could be the newest driver isn't quite as fast as the one I ran before? Although those max OC runs are kinda hard for me to repeat since I'm constantly trying to balance water temp vs core clock vs mem clock to get the best score. Might still get there if I put a little more effort in. Results below suggest that.
> 
> 
> 
> Spoiler: Lots of pics
>
> (...)


Can you retest CP without DLSS? Previous BIOS' would throttle clocks at 4K w/ Psycho RT and no DLSS.


----------



## yzonker

Zero989 said:


> Can you retest CP without DLSS? Previous BIOS' would throttle clocks at 4K w/ Psycho RT and no DLSS.


Yea, that's a good idea. It would be a maximum load. I'll do that in a bit. 

Which bios are you referring to that would downclock though? Galax or one of the others?


----------



## bmagnien

MDXZFR said:


> Guys, need help for compression fitting. For 10/16mm fitting, it supports 10mm inner diameter and 16mm outer diameter. What I don't really get it is the outer diameter (16mm). Does it support max <16mm OD only? Since it's for the soft tubes, can it fit 14 or 15 OD with 10/16 fitting?
> View attachment 2593414


What soft tubing have you found that’s 10mm ID and NOT 16mm OD?


----------



## Zero989

yzonker said:


> Yea that's a good idea. It would be a maximum load. Do that in a bit.
> 
> Which bios are you referring to that would downclock though? Galax or one of the others?


Anything not Galax or Asus 1kW. You'll know you're artificially limited by the bios by turning on "Power limit" in the MSI AB overlay: it will show in the overlay for a brief second during high-power loads, near 527W-570W depending on the 4090, when it's maxing out. TimeSpy test 2 is another easy way to test it.


----------



## Gking62

yzonker said:


> Did a little back to back testing of the Galax vs Asus XOC bios. Rather than trying to do a max OC run, I decided to test for efficiency by running the exact same core/mem clocks in games/benches.
> 
> I tested PR, TSE, SW, CP2077, and GotG.
> 
> So everything was run:
> 
> 3045 core (which I monitored through all runs to be sure it was maintained)
> +1600 mem
> 
> It does take one more bin in core offset to get 3045 on the Asus bios. (+180 Galax, +195 Asus)
> 
> I didn't run with Sysinfo just because it takes so much longer. Didn't seem necessary since I was verifying clocks manually and keeping them constant.
> 
> Everything was run on my daily OS with HWINFO and Afterburner running, reBar off, so scores aren't all that high.
> 
> Bottom line, they are pretty much identical given run to run variance I think. At least it gives us a 2nd option if one of them does not work correctly for some cards. And of course gives a 1000w limit for anyone that wants to try to volt mod.
> 
> Not sure why I couldn't quite get to the same score in PR last night. Could be the newest driver isn't quite as fast as the one I ran before? Although those max OC runs are kinda hard for me to repeat since I'm constantly trying to balance water temp vs core clock vs mem clock to get the best score. Might still get there if I put a little more effort in. Results below suggest that.
> 
> 
> 
> Spoiler: Lots of pics
>
> (...)


Thanks for the testing on this. I get pretty similar results, minus a few hundred points, with my lower clocks (+160, +900) due to the 14.8 W/mK pads on my memory, which I'll be replacing soon. If I may ask, what are you using on yours?


----------



## yzonker

Zero989 said:


> Anything not Galax or Asus 1kW. You will know you're artificially limited by the bios by turning on "Power limit" in MSI AB overlay. It will show in the overlay for a brief second during high power loads near 527W-570W depending on the 4090 when its maxing out. TimeSpy test 2 is another easy way to test it.


Oh yea, definitely the normal Strix bios will drop effective clocks a lot in CP and GotG. That's what I posted about a couple of days ago. It's so bad in GotG that there's no gain in fps going from 1050 to 1100 mv. I think with my original CP2077 config above, the Asus Strix bios scores around 86 fps average. So about a 2 fps difference.

But no difference between the Galax and Asus. DLSS off, Psycho. Weird jumps in the max/min though. 

(Galax and Asus screenshots)

----------



## yzonker

Gking62 said:


> thanks for the testing on this, I get pretty similar results minus a few hundred with my lower clocks (+160, +900 ) due my 14.8 W/mK pads on memory which I'll be replacing soon, if I may, what are you using on yours?


I have 10 W/mK thermal putty on mine. Originally with the air cooler, my card could daily +1800. I can game at +1700, but it will occasionally black screen if the water is too cool on the desktop. So I just use +1600 to avoid that. Which I honestly don't think will be fixed by remounting with lower conductivity pads. It will still cold soak. Just take a little longer.


----------



## mirkendargen

bmagnien said:


> What soft tubing have you found that’s 10mm ID and NOT 16mm OD?


There's lots of 10mm/13mm, it's a common standard.


----------



## Gking62

yzonker said:


> I have 10 W/mK thermal putty on mine. Originally with the air cooler, my card could daily +1800. I can game at +1700, but it will occasionally black screen if the water is too cool on the desktop. So I just use +1600 to avoid that. Which I honestly don't think will be fixed by remounting with lower conductivity pads. It will still cold soak. Just take a little longer.


Much appreciated. Yeah, I have to get in there anyway to replace the Kryonaut Extreme on the core with TFX, and I may drop the EK pads on the memory. BTW, how do you get that graphic with the CP2077 settings?


----------



## neteng101

Benched the Suprim 600w vs. Galax 666w bios on my MSI Gaming Trio; the Galax bios gives a 1.x% boost consistently...

(3dmark.com result links)

Also noticed on air, the Galax bios was running my fans at a constant ~1,700 rpm, while the Suprim X bios on my card has fan stop but ramps up to 2.1-2.4k rpm, and fan 1 / fan 2 speeds are not in sync. Neither is really a good daily bios for me, so it's back to the 520w Suprim air bios. I did post my highest 3DMark scores on the Galax bios though.


----------



## MDXZFR

bmagnien said:


> What soft tubing have you found that’s 10mm ID and NOT 16mm OD?


What do you mean by 10 ID but not 16 OD? Soft tubes can easily be found in 12.7, 13, 16 and 19mm OD, at least here. 
I'm asking which compression fitting to get for tube with 14.6mm OD: 10/13mm or 10/16mm? Or would either be fine?


----------



## MDXZFR

Gking62 said:


> much appreciated, yeah I have to get in there anyway to replace the Kryonaut Extreme on the core with TFX and may drop the EK pads on the memory, btw how do you get that graphic with the CP2077 settings?


Thermalright TFX is surely a good thermal paste for something like a GPU die. But man, you're gonna struggle to apply it onto the die. It's a different type from regular thermal paste, almost like putty. It's kinda hard to apply since the die is polished metal and the paste doesn't want to stick. I ended up using my finger to apply it; a paste spreader won't help. But it's better than Kryonaut, at least to me.


----------



## Blameless

MDXZFR said:


> Thermalright TFX is surely a good thermal paste for something like gpu die. But man, u're gonna struggle to apply it onto the die. It's a different type of regular thermal paste, almost like putty. It's kinda hard to apply to the die since the die is polished metal and hard to stick. I end up using my finger to apply it. Paste spreader won't help. But it's better than kryonaut, at least to me


Don't need to spread it. Just tint both surfaces, so it flows more freely when squished, then apply a thick line down the middle with a little dot near each corner to make sure you get full coverage, and use as much mounting pressure as practical.


----------



## MDXZFR

Blameless said:


> Don't need to spread it. Just tint both surfaces, so it flows more freely when squished, then apply a thick line down the middle with a little dot near each corner to make sure you get full coverage, and use as much mounting pressure as practical.


Oh I see. I had almost given up on applying it 😂


----------



## Jack Chim

Sheyster said:


> I'm using the Galax HOF BIOS as my daily gaming BIOS, at default settings.
> 
> Do you happen to have the latest GB Master BIOS? I believe it's the "F2" version. If so can you please post it? You might have to rename the file extension to .TXT first.


Haha, actually I'm using the same bios as you; the 666w is so cool. I have the Master bios; it is only slightly better than the Gaming OC, not nearly as good as the Galax 666. At first I just thought you needed to control the RGB of the fans while also needing performance, so I recommended the Master. If you don't need to control the lights, Galax is definitely the best choice.


----------



## HeLeX63

Hey guys, I just installed an Alphacool Full Cover WB on my MSI RTX 4090 Suprim, and I have noticed lower memory OC potential. I could easily hit +1250MHz on air, but now have to dial back to +1150MHz. Temps for the GDDR6X modules dropped from 55-60C on air to about 35-40C max on water... Is this right??


----------



## J7SC

HeLeX63 said:


> Hey guys, just installed an Alphacool Ful Cover WB on my MSI RTX 4090 Suprim, and I have noticed lower memory OC potential. I could easily hit +1250Mhz on air, but now have to dial back to +1150Mhz. Temps for the GDDR6 modules reduced from 55 to 60C on air to about 35 to 40C max on water... Is this right ??


...happens a lot w/this VRAM; colder =/= better...55 C - 60 C was close to optimal


----------



## HeLeX63

J7SC said:


> ...happens a lot w/this VRAM; colder =/= better...55 C - 60 C was close to optimal


Ok, interesting. I'm just a bit worried because I did a test boot with the block installed (before fully completing the loop) to see if the GPU worked. I also ran Unigine Heaven for 10 seconds to see if my riser cable was seated properly, and it read x16 at 4.0 in GPU-Z before I shut it down. Temps rose to 80C+ on the GPU or thereabouts, and a corner of my brain is wondering if I somehow screwed something up by doing this??


----------



## Jack Chim

Faltzer said:


> I can confirm alongside other users here that the Galax bios is indeed better performing.
> BUT!
> Is it not dangerous in any way to run higher than specced wattage? My 4090 waterforce ran at almost 651 watts, while the 12vhpwr cable is only rated to 600 watts. Stock vbios didn't run above 530, and the original update decreased it to under 500.
> 
> I am afraid to run furmark as that will probably get it over 700watts
> 
> Couldn't this cause potential card damage?


In fact, in addition to the external power cable, the PCIe slot on the motherboard can also provide a standard 75W, so the total safe value is 675W, and 666W obviously doesn't exceed that. And 675W is just a conservative value; there is still headroom beyond it.
You may remember that on the Master 3090 two years ago, Gigabyte only provided two 8-pin external connectors, for a total of 375W including the slot, but the official bios allowed 390W, which worked fine in practice. So in many cases the real upper limit of the cables and the slot is higher than expected.
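The slot-plus-cable arithmetic above can be sketched as a quick sanity check (nominal spec numbers only; real hardware has extra headroom, as the post notes):

```python
# Rough sanity check of the power-budget math above. These are nominal
# connector ratings, not measured limits.
PCIE_SLOT_W = 75      # PCIe slot can supply up to 75 W
HPWR_CABLE_W = 600    # 12VHPWR cable rating
EIGHT_PIN_W = 150     # PCIe 8-pin rating

def budget(connector_watts):
    """Total nominal board power: slot plus all external connectors."""
    return PCIE_SLOT_W + sum(connector_watts)

# 4090 with one 12VHPWR cable: 600 + 75 = 675 W, so a 666 W bios fits the spec.
print(budget([HPWR_CABLE_W]))      # 675

# The 2x 8-pin 3090 example from the post: 150*2 + 75 = 375 W nominal,
# yet the official bios allowed 390 W, implying per-connector headroom.
print(budget([EIGHT_PIN_W] * 2))   # 375
```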


----------



## HAQ0901

HeLeX63 said:


> Hey guys, just installed an Alphacool Ful Cover WB on my MSI RTX 4090 Suprim, and I have noticed lower memory OC potential. I could easily hit +1250Mhz on air, but now have to dial back to +1150Mhz. Temps for the GDDR6 modules reduced from 55 to 60C on air to about 35 to 40C max on water... Is this right ??


Vram cold bug is real. The colder the worse it overclocks… unfortunately.


----------



## Jack Chim

Faltzer said:


> I have applied the bios update via their official website. It seems to cause no harm.
> Wattage is still capped at 600.
> I have previously said that my card runs okay at +190 core +1390 memory oc, but I have decreased it due to artifacts to 185 / 1350
> 
> I am afraid to use the galax 660w bios, as my 12hpw cable (or however it is being spelled) has a big 600w description written on it. I happen to have a gigabyte PSU which supplied me with an original cable, so I assume it fits perfectly with the waterforce gpu. The stock bios was really capped at 500, so the extra 100 watts is the maximum I am willing to go in order not to risk anything.


The cable is 600W, yes; the slot still adds 75W.


----------



## Blameless

ckjim said:


> In fact, in addition to the external power cord, the pcie slot on the motherboard can also provide a standard 75w


These boards aren't ever drawing near the limit from the slot.

I can see 600w+ through the 12VHPWR adapter while the slot is still delivering less than 10w.


----------



## sultanofswing

Holy hell how do people with Founders cards deal with the stupid loud fans.
I've got a Dell 2U server here and it's not as loud as one of these cards.


----------



## dboom

mirkendargen said:


> I'm no lover of EK and think they're overpriced garbage riding on the coattails of their former glory, but do they list torque specs or something? Otherwise how do you know they're "factory default" if you took them out, put loctite on them, and put them back in?


As you can see in the pictures, there is Loctite only between the screw head and the plexi hole. I didn't touch the screws.


----------



## Nico67

MDXZFR said:


> Wdym by 10 ID but not 16 OD? Soft tubes can be found easily in 12.7, 13 & 16, 19mm OD, at least here.
> I'm asking about the compression fitting to fit the tubes with 14.6 OD. Which one should I get? 10/13mm or 10/16mm? Or any of them should be fine?


10/13 will be tight, but 10/16 might blow off if it doesn't grip enough. The sizing is set to fit snugly on the ID and compress onto the OD.
The other option is barbs and clamps (spring clip or cable tie, etc.), which is more tolerant of odd-sized ODs.
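A toy way to express the matching rule above; the numeric thresholds here are illustrative guesses, not manufacturer tolerances:

```python
def compression_fit(tube_od_mm, fitting_od_mm):
    """Rule of thumb from the post: the collar must compress onto the tube OD.
    A slightly oversized tube is tight but workable (soft tube squishes);
    too much slack and the collar may not grip. Thresholds are illustrative."""
    slack = fitting_od_mm - tube_od_mm
    if slack < -1.7:
        return "too tight, won't seat"
    if slack < 0:
        return "tight but workable"
    if slack <= 1:
        return "good fit"
    return "risky: collar may not grip, tube could blow off"

# The 14.6mm OD tube from the question, against both fitting options:
print(compression_fit(14.6, 13))  # tight but workable (10/13 fitting)
print(compression_fit(14.6, 16))  # risky: collar may not grip... (10/16 fitting)
```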


----------



## andreiga76

neteng101 said:


> Benched Suprim 600w vs. Galax 666w bios on my MSI Gaming Trio, Galax bios gives a 1.x% boost consistently...
>
> (...)
>
> Also noticed on air, the Galax bios was running my fans at a constant ~1,700 rpm speed, while the Suprim X on my card has fan stop but ramps up to 2.1-2.4k rpm and fan 1 / fan 2 speeds are not in sync. Neither is really a good daily bios for me so back to the 520w Suprim air bios. Did post my highest scores in 3DMark on the Galax bios though.


Regarding air cooling, same with the fans on my Gigabyte Master: loud, at a minimum of 1500 rpm, and they cannot be controlled in Afterburner.
I think it's because the Galax card has different fans, 2x 112mm and 1x 92mm, and the fan voltage curve is tuned for those.


----------



## andreiga76

J7SC said:


> ...I don't really think we disagree on the 'order' of cards anyway (as per above, my fav being the Galax HoF, closely followed by the Strix and Suprim) but 20+4 phases and 1000 amps / c on my Giga-G-OC (same as Aorus Master) seem plenty... the analog doesn't bother me, given 3240 MHz top speed and all that...I did get a kick though out of KingPin's _cardboard_-stabilized analog controllers...if it works, it works !
> View attachment 2593398


The Gigabyte Master has 24+4 phases with 1200 amps on the GPU, but you are correct that it's much more dependent on the GPU/memory lottery with these boards. The only exception may be the HOF OC Lab, since it's supposed to be binned by the manufacturer.


----------



## yzonker

Gking62 said:


> much appreciated, yeah I have to get in there anyway to replace the Kryonaut Extreme on the core with TFX and may drop the EK pads on the memory, btw how do you get that graphic with the CP2077 settings?


That's just the built in benchmark in CP.


----------



## Jack Chim

Blameless said:


> These boards aren't ever drawing near the limit from the slot.
> 
> I can see 600w+ thought the 12VHPWR adapter while the slot is still delivering less than 10w.


That's just a nominal number; it will deliver more when needed.
The 3090 also has a 2x8-pin version. Each 8-pin is theoretically 150W, so if the slot only provided 10W, the maximum would be just 310W. Yet no 2x8-pin 3090 bios is set lower than 350W; most allow 375W, and the Gigabyte Master allows 390W. If the slot really only supplies ~10W, that proves each 8-pin can sustain at least 20-40W over spec. By that calculation, the 4x8-pin adapter on the 4090 can reach 680W-760W, so however you run the numbers, 666W is definitely a safe value.
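The arithmetic in that post, as a sketch. The 150 W figure is the PCIe 8-pin spec; the ~10 W slot draw and the extra 20-40 W per connector are the poster's observations and inference, not any standard:

```python
# Power-budget arithmetic from the post above. Each PCIe 8-pin is rated
# 150 W by spec; the slot is observed delivering only ~10 W, and the extra
# 20-40 W per 8-pin is the poster's inference, not a spec value.

EIGHT_PIN_W = 150

def strict_budget(n_conn, slot_w=10):
    """Board power if every 8-pin stayed exactly at its 150 W rating."""
    return slot_w + n_conn * EIGHT_PIN_W

def inferred_budget(n_conn, extra_per_conn):
    """Board power if each 8-pin sustains extra_per_conn W over spec."""
    return n_conn * (EIGHT_PIN_W + extra_per_conn)

print(strict_budget(2))        # 310 (yet 2x8-pin 3090 bioses allow 375-390 W)
print(inferred_budget(4, 20))  # 680
print(inferred_budget(4, 40))  # 760
```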


----------



## bada55

can someone post link how to force rebar enable in benchmarks ?


----------



## MDXZFR

Nico67 said:


> 10/13, will be tight, but 10/16 might blow off if it doesn't grip enough. The sizing is set to fit snuggly on the id and compress to the od.
> The other option is barbs and clamps be clip or cable tie etc which is more tolerant to weird size od.


I see. I thought of that one too, the barb fitting. But I can still buy the compression fitting, right? Because it has a barb too. If the collar is too small/big for the OD, I can just remove it, right?


----------



## Avacado

Delete.


----------



## Jack Chim

Avacado said:


> Delete.


Yes, I am well aware that the slot can provide 75W; I was just replying to the doubts of the other two. Back when nvidia bioses could still be edited, I also pulled more than 75W through the slot on my gtx750, which could run at around 95W. Once it exceeded ~105W the 3dmark score started to drop, so I kept the card at 75W-95W, since my card at the time had no 6-pin.


----------



## ttnuagmada

HeLeX63 said:


> Hey guys, just installed an Alphacool Ful Cover WB on my MSI RTX 4090 Suprim, and I have noticed lower memory OC potential. I could easily hit +1250Mhz on air, but now have to dial back to +1150Mhz. Temps for the GDDR6 modules reduced from 55 to 60C on air to about 35 to 40C max on water... Is this right ??


Yeah, I bought aftermarket pads and everything; temps max in the 40s on the VRAM, and I still can't break +1250.


----------



## Krzych04650

HeLeX63 said:


> Hey guys, just installed an Alphacool Ful Cover WB on my MSI RTX 4090 Suprim, and I have noticed lower memory OC potential. I could easily hit +1250Mhz on air, but now have to dial back to +1150Mhz. Temps for the GDDR6 modules reduced from 55 to 60C on air to about 35 to 40C max on water... Is this right ??


Yea unfortunately it is. I also had a card that could do +1250, and +1350 for benchmarks, but once it went under water I could no longer do it. At +1350 it's an instant black screen. I had to back down to +1100 for daily; +1200 is still stable if the memory is relatively warm (~40C), but if I launch something with cold water and cold memory right after first boot it can get unstable.

And the problem is that this instability is not always obvious; those errors tend to kind of "stick" around. For example, you can run a benchmark at settings that aren't daily stable, get through it just fine, back down to your stable daily settings, and then get a crash or BSOD a few minutes later. So you really don't want to run at the edge of stability for daily use; it may seem stable but cause all kinds of weirdness.


----------



## dentnu

StreaMRoLLeR said:


> 4090 Trio have one of the worst cards for 90 series. MSI cheaped out on Vapor Chamber. There is NONE ! I would generally recommend G-G OC for cheap alternative for mrsp prices





neteng101 said:


> Have the non X (no factory OC) version. Not a top tier card but even it is perfectly fine running maxed out… I’ve tried the 600w SuprimX bios and recently the 666w Galax OC bios. Has some coil whine mostly noticeable when benching.
> 
> I would get this over Gigabyte cards and the Zotac/PNY cards. Most of those are still using analog VRM controllers ie the 9512 while MSI stuck to the same one as Nvidia and Asus which are digital. Edit - even the Asus Tuf model is using the more basic VRM controller.





coelacanth said:


> Just get whatever is cheapest, especially if it's just for gaming with some light OCing. From this thread it seems the Asus cards have the worst coil whine, but it's luck of the draw.


Thanks for all the replies. I think I'm just going to pick it up, test it, and see what I think; Best Buy has a 15-day return window I can use if needed. These cards are extremely difficult to get, and while the Gaming X Trio is not my first pick, you take what you can find. I read that even though it's locked to 450 watts, it can be unlocked by flashing a HOF or Suprim bios? I also want to know if I'll need a 4-way power adapter to pull 450+ watts safely, because I believe it comes with a 3-way adapter?


----------



## Xavier233

Is there a separate thread on Overclock.net to discuss Nvidia drivers (specifically for the 4090)?


----------



## Spiriva

bada55 said:


> can someone post link how to force rebar enable in benchmarks ?


----------



## bmagnien

New afterburner beta with increased voltage control on 4090s 😳

Download: MSI Afterburner 4.6.5 (Beta 4) (www.guru3d.com): "After yesterday's vast majority of clickbait assumptions/conclusions from many media, we're happy to release MSI Afterburner 4.6.5 (Beta 4). This release adds voltage support towards the latest gra..."


----------



## Nizzen

bmagnien said:


> New afterburner beta with increased voltage control on 4090s 😳
> 
> Download: MSI Afterburner 4.6.5 (Beta 4) (www.guru3d.com)


No


----------



## Benni231990

so has anybody patch notes to the new Afterburner version?


----------



## bmagnien

Nizzen said:


> No


Yah, my bad. Misread the abbreviated notes. Support added for newer AMD cards; voltage options added for the NV 3000 series.


----------



## bmagnien

Benni231990 said:


> so has anybody patch notes to the new Afterburner version?


*Version 4.6.4*


- Added new MSI Windows 11 themed skins (Light and Dark editions) by Drerex design
- Added voltage control for reference design AMD RADEON RX 6700 XT series graphics cards
- Added experimental support for Intel 11th generation CPUs
- Added experimental support for Intel 12th generation CPUs
- Added experimental support for mobile AMD Ryzen CPUs
- Fixed issue with missing memory temperature sensor on AMD RADEON 5700 / 5700 XT series graphics cards
- Fixed issue which could prevent MSI Afterburner from opening from tray via main application icon after accessing properties via clicking monitoring tray icon
- Increased memory overclocking limit for NVIDIA GeForce RTX 30x0 series graphics cards
- Added workaround for internal DirectInput issue, which could cause hotkey handler to stop processing hotkeys correctly after locking/unlocking PC from keyboard with <Ctrl>+<Alt>+<Del> or <Win>+<L>. To bypass it MSI Afterburner is resetting hotkey handler state after lock screen transition now
- Optimized monitoring profiles switching implementation for situations when profiles contain different sets of data sources displayed in monitoring tray icons
- Application tray icon is DPI aware now:
  - OS level tray icons scaling is disabled now to prevent tray icon text distortion. Power users may revert back to the previous DPI unaware tray icon rendering mode via configuration file if necessary
  - Added new bigger tray icon fonts for >=150% and >=200% DPI scaling ratios. Power users may also select tray icon font independently of selected DPI scaling ratio via configuration file if necessary
- Application installer is DPI aware now
- RivaTuner Statistics Server has been upgraded to v7.3.3


----------



## yzonker

bmagnien said:


> New afterburner beta with increased voltage control on 4090s 😳
> 
> Download: MSI Afterburner 4.6.5 (Beta 4) (www.guru3d.com)


They probably just added support for the 4070ti or something like that.


----------



## neteng101

dentnu said:


> I also want to know if it I will need a 4 way power adapter to be able to pull the +450 watts safely cause I believe it comes with a 3 way power adapter ?


I have used the 3-way with the 600w Suprim bios and yes it will go above 450w. Stock bios is up to 480w. I run the 520w Suprim bios daily. Have since got the cablemod cable which is likely overkill - I have the 4 PCIe version of their cable.


----------



## yzonker

neteng101 said:


> I have used the 3-way with the 600w Suprim bios and yes it will go above 450w. Stock bios is up to 480w. I run the 520w Suprim bios daily. Have since got the cablemod cable which is likely overkill - I have the 4 PCIe version of their cable.


The 150w doesn't apply to the PSU side. It's 300w per 8pin, hence the Corsair cable rated at 600w with 2x8pin. 

That said, I did switch from the Corsair to Fasgear with 3x8pin since I occasionally hit mine with more than 600w and it has slightly less voltage droop.
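The distinction matters because the two sides of the cable carry different ratings. A sketch with the numbers from this exchange (150 W is the PCIe 8-pin spec; the 300 W per PSU-side socket is the rating quoted here, not a formal standard):

```python
# GPU-side spec vs PSU-side rating, per the exchange above. 150 W is the
# PCIe 8-pin spec; 300 W per PSU-side socket is the rating quoted here
# for Corsair's cables, not a formal standard.

GPU_SIDE_8PIN_W = 150
PSU_SIDE_8PIN_W = 300

def cable_rating(n_psu_connectors):
    """What a PSU-direct 12VHPWR cable can carry, per the quoted rating."""
    return n_psu_connectors * PSU_SIDE_8PIN_W

print(cable_rating(2))  # 600: why a 2x8-pin cable can be rated 600 W
print(cable_rating(3))  # 900: headroom for occasional >600 W spikes
```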


----------



## neteng101

yzonker said:


> The 150w doesn't apply to the PSU side. It's 300w per 8pin, hence the Corsair cable rated at 600w with 2x8pin.
> 
> That said, I did switch from the Corsair to Fasgear with 3x8pin since I occasionally hit mine with more than 600w and it has slightly less voltage droop.


Yeah - if I had to do it again I would have just gotten the 3x8-pin cablemod cable instead. No idea why they even bothered to build and offer this version of their cable.

Edit - I can't even find it anymore; maybe they have since stopped building the 4x8-pin to 12V cable entirely because it's pointless.


----------



## menko2

yzonker said:


> The 150w doesn't apply to the PSU side. It's 300w per 8pin, hence the Corsair cable rated at 600w with 2x8pin.
> 
> That said, I did switch from the Corsair to Fasgear with 3x8pin since I occasionally hit mine with more than 600w and it has slightly less voltage droop.


I just received the Corsair cable before my 4090 FE arrives, and it's only 2x8-pin... even though it's rated for 600W. Would it be safer to get the cablemod with 4x8-pin?


----------



## yzonker

menko2 said:


> I just received the Corsair Cable before my 4090 FE arrives and it's only 2x8pin....even if it's rated for 600W. Will it be safe to get cablemod with 4x8pin?


It won't hurt to get the Cablemod, but probably isn't necessary unless you are going to actually exceed 600w with the card.


----------



## Gking62

yzonker said:


> That's just the built in benchmark in CP.


right but I was referring to the screens that were depicting settings, thanks


----------



## kryptonfly

bada55 said:


> can someone post link how to force rebar enable in benchmarks ?


NvidiaProfileInspector link


----------

